Some thoughts on Samsung’s Galaxy S8 and its DeX docking cradle

Samsung’s newest Galaxy smartphone range was launched to great fanfare last week, and came with an unexpected extra. No, not an exploding battery, but a docking cradle that turns the phone into a desktop computer.

The latest line-up of Samsung’s flagship handset comprises two models, the Galaxy S8 and Galaxy S8+, that are everything you would expect from the leading smartphone vendor. They offer a choice of display size – 5.8in and 6.2in – multi-core processors, support for the latest high-speed networks, and the Android 7.0 Nougat operating system.

Owners of the new devices can also pick up an unusual optional extra for their Galaxy S8 or Galaxy S8+, in the shape of the DeX. This looks rather like a fancy black ashtray, but turns out to be a docking cradle that comes with an Ethernet port, HDMI video output and full-size USB 2.0 and USB Type C ports.

As well as charging the phone, the DeX lets you connect up a desktop monitor, keyboard and mouse, and presumably also a LAN via the Ethernet port, effectively turning your phone into a desktop computer. When docked like this, the phone switches to a desktop-style user interface to make use of the larger display.

Samsung Galaxy S8 and DeX desktop cradle

This is an interesting concept, and one that has been mooted before. The comparison many in the tech industry are making is with the Continuum feature of Windows 10, which is intended to offer a similar large-screen experience for users of phones running Windows 10 when connected to a monitor. But the Windows Phone platform is going nowhere, and many mobile watchers have written it off as dead.

However, it is also a feature that Ubuntu firm Canonical touted for its proposed Edge “superphone” back in 2013. This was to have been a high-spec smartphone running Ubuntu Linux, which would switch between a mobile user interface and a full Linux desktop shell, depending on whether it was docked or not. Sadly, the Edge never met its crowdfunding target on Indiegogo, and so did not go into production.

Samsung’s implementation of the concept could attract interest from professional users, because key apps such as Microsoft’s Office Mobile suite are available for Android. Someone whose role mostly involves document work in Microsoft Office could conceivably run this on a Galaxy S8 or Galaxy S8+ connected to a monitor, in place of a desktop computer.

Another potential use case is not as a PC per se, but as a thin client for accessing virtual desktop (VDI) sessions. Here, the user would have access to a full-blown Windows desktop running full Windows apps, with the docked Galaxy S8 or Galaxy S8+ serving as a terminal. Both Citrix and VMware offer Android versions of their VDI client software, which Dell Wyse used to deliver a pocket-sized thin client in the shape of the Cloud Connect device several years back.

Citrix also showed off its Receiver client running on a smartphone connected to a monitor as far back as 2010, although this required a handset with an HDMI output, and few had one.

Although Samsung seems to have made a splash with DeX at the Galaxy S8 launch, is there actually much call for this usage model? While the ability to turn your smartphone into a desktop client system is a neat trick, who would actually use such a system?

Most mobile professionals currently use a laptop, and plug it into a desktop display when they are in the office. The laptop has a decent-sized screen that you can take out on the road with you, whereas with the Samsung DeX you leave your big screen and keyboard behind in the office when you go roaming.

Samsung DeX desktop cradle

While it is conceivable that there may be some mobile worker roles in which a big screen is needed only in the office, and a smartphone is sufficient while out on the road, these would seem to be a bit of a niche. Few people would suggest that a smartphone with its small screen and lack of a physical keyboard would be suitable for intensive work – they tend to be used for checking emails or looking up information.

Then there is the fact that the suggested price of the DeX docking cradle – $149.99 in the US – is not much lower than that of many existing thin client terminals from established vendors in this market such as HP and Dell Wyse, so anyone choosing to equip their staff with a Galaxy device and a DeX to use as a VDI client would not really be saving much.

In addition, the lifecycle of a smartphone tends to be much shorter than that of a corporate device like a thin client terminal. Users tend to upgrade their phones every couple of years, whereas thin clients are often good for up to seven years of use.

Another potential pitfall is that you may invest in a batch of Galaxy S8 or Galaxy S8+ handsets and DeX cradles, only to find that Samsung drops support for DeX in a successor generation of Galaxy devices.

While Samsung is to be applauded for exploring a novel use of smartphones with the DeX hardware, it may find that the main market for this will be found among consumers or tech enthusiasts rather than business customers.

Windows Server on ARM: what does it mean?

Qualcomm Centriq ARM server

The demonstration of a version of Windows Server running on ARM-based servers came as a shock to many, especially as this is something that Microsoft has expressly ruled out in the past. But look closely and there is no suggestion as yet that this will lead to commercial availability of such products.

This first public demonstration of Windows Server running on ARM-based systems came at the Open Compute Project (OCP) Summit 2017 in Santa Clara. It was conducted by chipmaker Qualcomm, using its Centriq 2400 platform that boasts up to 48 cores based on ARM’s 64-bit ARMv8 architecture.

Microsoft is also working with at least one other chipmaker, Cavium, which trumpeted its own involvement in a statement saying it was “collaborating with Microsoft on evaluating and enabling a variety of cloud workloads running on Cavium’s flagship ThunderX2 ARMv8-A Data Center processor for the Microsoft Azure cloud platform”.

The key phrase here is “for the Microsoft Azure cloud platform”. This version of Windows Server, and the systems it is running on, seem to be part of an evaluation by Microsoft to see how well ARM-based servers can run some of its cloud services operated from its network of data centres.

ARM has been taking aim at the server market for at least the past five years, going back to the launch of its 64-bit ARMv8 architecture. The proposition is that ARM cores are less complex and consume less energy than rival architectures, such as Intel’s x86 and IBM’s Power processors.

[For more on this, see my article on IDG Connect: No ARM in a bit of server market competition]

However, expert opinion has so far been that the economics of this would only really make sense for hyper-scale environments – typically meaning the large cloud service and internet companies such as Google, Facebook, AWS, and Microsoft, which operate tens of thousands or even millions of server nodes. These are the companies for whose requirements the OCP was started in the first place.

In a post on Microsoft’s Azure blog, Distinguished Engineer Leendert van Doorn confirmed that the ARM servers are currently for Microsoft’s internal use only:

“We have been working closely with multiple ARM server suppliers, including Qualcomm and Cavium, to optimize their silicon for our use. We have been running evaluations side by side with our production workloads and what we see is quite compelling.”

What this may mean is that Microsoft could be planning to migrate some of its cloud services over to ARM-based infrastructure at some point in the future. How worried should Intel be about this move?

The reality is that x86 systems are not going to go away, for the simple reason that the virtual machine workloads Microsoft customers host on Azure require an x86 server to run on: pretty much every enterprise in the world runs on x86 servers, and those customers expect any public cloud infrastructure-as-a-service (IaaS) to offer the same, for compatibility reasons.

Again, Microsoft confirms this in the blog:

“One of the biggest hurdles to enable ARM servers is the software. Rather than trying to port every one of our many software components, we looked at where ARM servers are applicable and where they provide value to us. We found that they provide the most value for our cloud services, specifically our internal cloud applications such as search and indexing, storage, databases, big data and machine learning.”

“To enable these cloud services, we’ve ported a version of Windows Server, for our internal use only, to run on ARM architecture. We have ported language runtime systems and middleware components, and we have ported and evaluated applications, often running these workloads side-by-side with production workloads.”

So, don’t expect to see ARM-based Windows Servers anywhere except in hyper-scale cloud data centres for now. Of course, where Microsoft leads, others may follow, but the huge installed base of Intel-based servers out there means that the average company is not going to be buying ARM servers anytime soon.

Raspberry Pi Zero gets wireless to celebrate the Pi’s fifth birthday

The Raspberry Pi is officially five years old today, and the folks behind the low-cost single-board computer are celebrating with the release of a new model, which brings wireless connectivity to the smallest form factor Pi model on the market.

Known as the Raspberry Pi Zero W, the new model keeps the same dimensions as the existing Raspberry Pi Zero design, but adds in the wireless functionality (802.11 b/g/n WiFi and Bluetooth 4.1) that was introduced with the Raspberry Pi 3 Model B a year ago.

Raspberry Pi Zero W

Also unveiled today is a new injection-moulded case designed to snugly fit either the new Pi Zero W or the existing model, and which comes with a choice of three interchangeable lids: one with a cut-out for the camera module accessory (the ribbon cable for this is also included), another with a cut-out exposing the 40-pin expansion connector, and a third with no cut-out.

Pi Zero W case and lids

Pricing for the new Raspberry Pi Zero W is expected to be £9.60 inc VAT in the UK, while the new Pi Zero case is expected to cost somewhere in the region of £5. Both are expected to be available from the usual Pi outlets such as The Pi Hut, Pimoroni and Adafruit.

With wireless capability, the Raspberry Pi Zero W fixes what was the only major drawback of the Pi Zero: no network connectivity. True, you could plug in some kind of USB network adapter, but this would be cumbersome, especially as the Pi Zero has only a micro-USB port and needs an adapter to take standard USB devices.

Pi Zero and Pi Zero W

Top: original Pi Zero
Bottom: Pi Zero W

However, the lack of wireless connectivity has not stopped the Pi Zero from being used for a variety of projects, including a build-it-yourself digital camera when combined with the camera module.

With the addition of built-in WiFi and Bluetooth, the Raspberry Pi Zero W looks certain to find its way into numerous new devices, especially projects aimed at the Internet of Things (IoT), given the device’s small size. The wireless support means the new Pi Zero can connect to other devices or to the internet via a WiFi connection.

In fact, given the small size of the device – 65mm long by 30mm wide – we were at a loss to work out where the antenna is on the Raspberry Pi Zero W. Locating the actual wireless chip is relatively easy, as it is a small silvery package, as seen on the Raspberry Pi 3.

However, the Raspberry Pi 3 also has a small surface-mounted antenna, of which there is no sign on the Pi Zero W. It turns out that the new model has an antenna built into the circuit board itself, located between the miniature HDMI and USB connectors (see image below).

Pi Zero W antenna

This piece of high-tech wizardry comes from a Swedish firm called ProAnt, which specialises in antenna design, and is alluded to by text on the rear of the new Pi that says “antenna technology licensed from ProAnt”.

Here are the hardware specifications for the new Raspberry Pi Zero W, as detailed by the folks at Raspberry Pi:

– 1GHz single-core CPU
– 512MB RAM
– Mini HDMI and USB On-The-Go ports
– Micro USB power
– HAT-compatible 40-pin header
– Composite video and reset headers
– CSI camera connector
– 802.11 b/g/n wireless LAN
– Bluetooth 4.1

These specifications are essentially the same as for the original Pi Zero, save for the addition of wireless, meaning that the new model has a less powerful processor than the Pi 3, but is still more powerful than the original Raspberry Pi launched five years ago.
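To give a flavour of what the built-in wireless makes possible, here is a minimal sketch (my own illustration, not an official Raspberry Pi example) of a Pi Zero W reporting its own status over the network: a tiny HTTP endpoint serving the SoC temperature, using nothing beyond the stock Python 3 install on Raspbian. The port number is an arbitrary choice.

```python
# Minimal HTTP endpoint exposing the Pi's CPU temperature over WiFi.
# Standard library only; no extra packages needed on Raspbian.
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_cpu_temp_celsius():
    # Raspberry Pi exposes the SoC temperature in millidegrees Celsius here.
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = "cpu_temp_c={:.1f}\n".format(read_cpu_temp_celsius()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on all interfaces so other devices on the WLAN can reach it.
    HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()
```

Point a browser on the same network at the Pi’s address on port 8080 and it reports back over the air – no USB network adapter required.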

Satellite broadband: the unloved cousin of internet connectivity

Satellite broadband has been around for many years now, and holds out the promise of internet access for users located pretty much anywhere in the UK that happens to be above ground. So why is this never mentioned when the issue of universal broadband access is raised?

Access to broadband internet is still an issue for many people in the UK, especially those living in remote or rural areas. The problem is that ADSL broadband delivered via a telephone line is limited by the distance to the telephone exchange; the further away you are, the more the signal quality becomes degraded.

This can be addressed in several ways, such as by using a fibre optic connection rather than the telephone line, but fibre can be costly to roll out, so providers like BT have focused first on areas where there are large numbers of subscribers, and may be reluctant to expand into areas where there are few people to help them recoup the investment in infrastructure.

This can be frustrating for anybody in this situation, but especially for business users, for whom the internet is increasingly vital for accessing information and reaching customers. A recent episode of the BBC magazine programme Countryfile focused on farmers, who complained that vital information for their industry is increasingly delivered over the web, making internet access a necessity.

A common theme in features like this is simply to berate the government for its failure to stump up the money for broadband roll-out to rural areas. Few mention that such customers could be served by satellite broadband, at least until their area is served by an acceptable terrestrial broadband service.

Satellite broadband does what its name suggests: it delivers internet access via a two-way connection to a satellite in a geostationary orbit above the Earth. This is different from satellite TV, where a satellite simply broadcasts a television signal to everyone with a receiver within the satellite’s ground area footprint (Satellite TV companies such as Sky that also offer internet access provide this service via a telephone line).

To receive satellite broadband, a customer will require a satellite dish to be installed somewhere outside their premises. This is connected to a satellite modem, the equivalent of a broadband modem or router that you will see in a home with a typical broadband service.

So, the big advantage of satellite broadband is that you can access it pretty much anywhere. But what about the disadvantages? Firstly, customers will have to pay an installation fee for an engineer to visit their premises, fit the satellite dish and orient it towards the satellite that delivers the service, as well as install the satellite modem. This typically starts at around £100, but can run to several hundred.

Secondly, potential customers should be aware that satellite internet access is subject to much longer latency than a terrestrial broadband connection. Latency is basically the “round trip” time taken for a signal from your computer to reach its destination and a response to come back.

Satellite broadband has this problem because data has to be transmitted from your computer up to the satellite in geostationary orbit, then back down again to the service provider’s ground station, from where it is routed onto the wider internet. The response has to return using the same route, and all of this adds delay.
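A back-of-envelope calculation shows why this delay is unavoidable. A geostationary satellite sits roughly 35,786km above the equator, and a request plus its response must each traverse the up-link and down-link, so even at the speed of light the satellite legs alone take close to half a second (the sketch below assumes the satellite is directly overhead; real paths are slightly longer):

```python
# Back-of-envelope minimum round-trip latency over a geostationary link.
GEO_ALTITUDE_KM = 35786             # height of geostationary orbit
SPEED_OF_LIGHT_KM_S = 299792.458

# Request: user -> satellite -> ground station; response: the reverse.
# That is four traversals of the earth-to-satellite distance in total.
round_trip_km = 4 * GEO_ALTITUDE_KM
latency_s = round_trip_km / SPEED_OF_LIGHT_KM_S
print("{:.0f} ms".format(latency_s * 1000))   # ~477 ms before any processing
```

Once routing and processing are added, real-world satellite connections are often quoted at 600ms or more, against tens of milliseconds for a typical ADSL line.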

For many applications, such as browsing the web or downloading files, this latency is not really noticeable. Where it can become an issue is with videoconferencing or internet telephony, where users may notice a lag in the conversation, and with online action games, which demand fast response times.

Another problem that satellite broadband customers may experience is that the signal quality can be affected by the weather conditions, especially rain or snow.

In terms of speed, most satellite internet providers offer download speeds of up to 20Mbps, and upload speeds of between 1Mbps and about 6Mbps. The packages on offer vary, often based as much on the amount of data you are allowed to download each month as on the actual speed of the service, with some starting at just £10 per month.

What this all means is that satellite broadband is not for everyone, but if you do live in a remote or rural location and there are few other options, it is worth evaluating, rather than waiting for BT to get around to upgrading the infrastructure in your area.

Some satellite broadband providers covering the UK:

Tooway

Avonline

Europasat

Internet of Things or the Internet of hyperbole?

Internet of Things

The Internet of Things (or IoT, if you prefer) is one of those nebulous concepts that covers a multitude of things, rather like “cloud computing”. As a result, it gets hyped up and misappropriated, with vendors, marketers and journalists alike attaching the term to almost anything in order to attract attention to an otherwise me-too product or dull article.

Perhaps it is because there are so many wide-ranging use cases for the Internet of Things that everybody gets confused. However, as I have opined on more than one occasion, the Internet of Things is basically just the internet, but with a whole lot more devices connected to it than before, and new applications.

The basic premise behind the Internet of Things is that internet access is now almost ubiquitous (at least for most people in the developed world) and reaches almost anywhere. Instead of just using the internet so that millions of people can check their status updates on social media, why not also use it to connect up things that it would be handy to get data from: weather stations, traffic flow sensors, or anything that might not typically have been connected before?

In the past, if you wanted to collect temperature and rainfall data from a bunch of weather stations dotted around the landscape, you might have had to connect them up using some proprietary wireless system, or had a telephone line wired to each one so you could use a dial-up modem.

Nowadays, you can (relatively) easily connect almost anything to the internet, whether via WiFi or over a cellular network, and take advantage of ready-made protocol stacks for communicating with your central systems, making it cheaper and easier to build such a solution.
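As an illustration of how little code one of those ready-made stacks demands, here is a sketch of a weather station publishing readings over MQTT, a lightweight publish/subscribe protocol widely used for this kind of telemetry. It assumes the paho-mqtt Python package (1.x API); the broker address is made up and the sensor readings are simulated, so a real deployment would substitute its own broker and hardware reads.

```python
# Sketch: a weather station publishing readings over MQTT.
import json
import random
import time

import paho.mqtt.client as mqtt    # pip install "paho-mqtt<2"

BROKER = "broker.example.com"      # hypothetical MQTT broker address
TOPIC = "weather/station-01"

def read_sensors():
    # Stand-in for real hardware reads (temperature sensor, rain gauge).
    return {"temp_c": round(random.uniform(5.0, 15.0), 1),
            "rain_mm": round(random.uniform(0.0, 2.0), 2)}

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()                # handle network traffic in the background

while True:
    client.publish(TOPIC, json.dumps(read_sensors()), qos=1)
    time.sleep(60)                 # one reading per minute
```

Whether the data travels over WiFi or a cellular link, the publishing code stays the same – which is precisely the point of using a ready-made stack.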

In indoor environments, it is typically even easier to connect things up, especially in offices where there is often WiFi signal coverage almost everywhere, or an Ethernet port within a few metres.

There are two other key factors that have enabled the recent uptake of projects and solutions that we can label as Internet of Things, and these are the increasing miniaturisation of compute hardware, and the growth of analytics tools that can sift through captured data and glean some useful insight from it.

Most people have likely heard of the Raspberry Pi, the credit-card sized device that was initially designed to help children learn programming skills. This device and others like it now pack a considerable amount of compute power at a low cost.

Meanwhile, collecting telemetry data from internet-connected devices and hardware allows analysis to look for patterns, such as those indicating that a fault is developing in a piece of machinery.
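To make that concrete, here is a minimal sketch of the simplest version of such a pattern check: flagging telemetry readings that sit far outside the recent norm, using a rolling z-score. It uses only the Python standard library; real deployments would lean on proper analytics tooling.

```python
# Sketch: flag anomalous telemetry readings with a rolling z-score.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) for readings far outside the recent norm."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        recent.append(value)

# Example: steady vibration levels from a machine, with one suspicious spike.
vibration = [1.0 + 0.01 * (i % 5) for i in range(100)]
vibration[60] = 2.5
print(list(detect_anomalies(vibration)))   # -> [(60, 2.5)]
```

A spike like that, caught early, is the cue to schedule maintenance before the fault causes downtime.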

However, the Internet of Things is also leading to a whole load of ill-considered, mostly consumer-focused products like internet-connected toasters, connected lightbulbs, and that evergreen cliché, the connected fridge.

Many of these products seem to offer little real advantage for the massive inconvenience they bring, by which I mean the need to configure and set up the connected wotsit and keep it up to date with the inevitable stream of patches and bugfixes that any self-respecting “smart” device seems to require these days.

Then there is the never-ending stream of hyperbole about the Internet of Things, such as the recent claim by one technology publication that every single thing you buy in future will be connected, and that buyers will have no choice in the matter. The publication in question even quoted a respected security researcher as implying it.

This is ridiculous for several reasons. Embedded compute hardware is cheap, but not so cheap that a connected device would cost no more than the same device without it. And not least, there is the ludicrous notion that users would have no choice about the device connecting itself to the internet and reporting on what kind of toast they eat, and so on.

How this is supposed to happen when even expert users still have problems getting some devices to connect to the internet is glossed over. Is it supposed to read your mind to get the WiFi password to your network?

Sadly, it looks like we can all expect more of this in future. The ‘things’ part of the Internet of Things enables almost any device or application to be cast as a magic new gizmo that is going to make all our lives better.

The reality is likely to be more prosaic. The real Internet of Things use cases are more likely to be applications such as building automation, traffic flow monitoring, and the aforementioned connecting of sensors to industrial equipment to monitor performance and allow for predictive maintenance rather than waiting for faults to cause downtime before addressing them.

Ten years of the iPhone

Apparently, it is now ten years since Steve Jobs stood up on stage and announced the worst-kept secret of the decade: that Apple was about to enter the smartphone market, with a device that would become known as the iPhone.

That Apple was working on a mobile phone was widely reported and speculated on, but the precise details were kept under wraps, and thus the firm still managed to surprise everyone with the device that Jobs finally showed off on stage.

It is interesting now to note that at the time there was considerable debate over whether the iPhone would be a success for Apple or not. I can recall being in two minds about this, because the first iPhone was something of a brick compared with contemporary candybar smartphones, and it supported only 2G cellular communications, meaning that data access was painfully slow even by mobile standards.

However, I was also well aware of the “Apple effect”, which meant that almost anything the company produced was eagerly snapped up by adoring Apple fans.

But there were several significant new capabilities that Apple brought to the market with the original iPhone: it was among the first with a capacitive touchscreen and a user interface designed to make the best use of it, and it was the first to offer multi-touch gestures, such as pinch-to-zoom to make small text readable in the Safari browser.

In my opinion, it was the user interface that made the iPhone a success. Contemporary smartphones required the user to navigate on-screen options using a keypad, or with the crude stylus that Windows Mobile forced on users: its user interface was modelled on that of desktop Windows, and the on-screen controls were often tiny and difficult to hit accurately.

This meant that, despite the glaring lack of support for 3G wireless, the relatively high cost of the device, and its chunky size, the iPhone was an attractive option for those who wanted a smartphone but didn’t want to have to do a PhD in Computer Science before they could operate it.

With later models, Apple fixed the lack of 3G support and delivered another crucial innovation in the shape of the App Store, enabling users to purchase and download new applications direct to their device at the touch of a button. In contrast, users of other smartphones typically had to install an app by downloading it to a PC, syncing it to their mobile device and running an installer.

This masterstroke benefited Apple, its users, and developers. Users got a trusted source for applications, developers got a pre-built store to showcase their wares, and Apple got to make revenue from every app sale. Many users now cite the broad range of apps as their main reason for choosing an iPhone.

To summarise, Apple didn’t invent the smartphone, but it was first with many of the features that many users now consider to be an indispensable part of the smartphone experience. The iPhone has shaped the smartphone market to such an extent that it is fair to say it gave the industry a kick up the backside, and led to the mobile world we see today.

My original review of the first iPhone, from IT Week in December 2007 (content since moved to Computing), when the device finally made it to these shores, can still be read here.