About

This is the personal blog of Dan Robinson, a freelance technology writer covering the IT industry, especially enterprise IT and cloud computing.

I am an experienced technology writer, having worked for various computing titles both in print and online, and I am able to cover a wide range of technical topics, from processors and storage to the Internet of Things and data centre infrastructure.

My interest in IT goes back to my family getting hold of a self-assembly Sinclair ZX81 home computer kit way back in the mists of time, and from then on I was hooked on programming and electronics. I progressed to the ZX Spectrum, the BBC Model B, and then on to IBM PC compatibles.

I have been working as a technology journalist since 1992, but unlike many others in the industry, I have qualifications in the field, including a BSc (Hons) in Information Technology.

During my career I have worked for many publications (some now defunct), including What PC?, PC Direct, IT Week and V3.co.uk. I have also had content published in Computing and The Inquirer.

More recently, I have contributed to online sites including Datacenter Dynamics, The Register, Computer Weekly, The Next Platform and IDG Connect.

To get in contact, email me: dan.w.robinson(at)googlemail.com

Follow my tweets at https://twitter.com/TheDanRobinson

Sample Articles:

Enterprise HPC: Why HPE is buying Cray (Computer Weekly)

HPE’s decision to acquire supercomputing pioneer Cray for $1.3bn serves to highlight the growing importance of high-performance computing (HPC) deployments in the enterprise market.

Broadening The Appeal Of Distributed NVM-Express Storage (The Next Platform)

NVM-Express has been creating a quiet revolution in servers for several years now, providing a way for flash storage to bypass the traditional storage stack and the limitations of interfaces such as SATA and SAS, and instead pipe data directly into and out of the CPU through the high-speed PCI Express bus.

Why Kubernetes is emerging as a critical enabler of multi-cloud (Computer Weekly)

There has been a lot of attention paid to containers over the past few years, as organisations turned to so-called “cloud-native” technologies for building new applications and services.

How server disaggregation could make cloud datacenters more efficient (IDG Connect)

The growth in cloud computing has shone a spotlight on datacenters, which by some estimates already consume at least 7 percent of the global electricity supply, a figure that continues to rise. This has led the IT industry to search for ways of making infrastructure more efficient, including some efforts that attempt to rethink the way computers and datacenters are built in the first place.

Opening the racks (Datacenter Dynamics)

Servers in the data center get refreshed on a regular basis, like other equipment such as network switches and power distribution systems, but one thing stays constant: the physical infrastructure that houses it all, the rack.

Or does it? In the last few years many of the largest names on the Internet have started to look at whether the humble rack needs updating for the hyperscale era.

DevOps: Bloody hell, we’ve got to think about security too! Sigh. Who wants coffee? (The Register)

Imagine you’re an organisation that is looking to implement a DevOps approach to applications and services, or perhaps you’ve already started, but you’re worried about security.

DevOps is all about rapid iteration and continuous delivery, but your security folks still want to be able to do checks to ensure systems are as bulletproof as possible.

Why tape is still strong (Datacenter Dynamics)

Tape storage is one of those technological hangovers from the early days of computing, associated in the minds of many with rooms full of massive mainframe cabinets. Somewhat like the mainframe, tape shows no signs of going away just yet, and ironically, could even be handed a new lease of life thanks to the burgeoning volumes of data that are being accumulated in modern data centers.

You better explain yourself, mister: DARPA’s mission to make an accountable AI (The Register)

The US government’s mighty DARPA last year kicked off a research project designed to make systems controlled by artificial intelligence more accountable to their human users. The issue at the heart of the Explainable Artificial Intelligence (XAI) programme is that AI is starting to extend into many areas of everyday life, yet the internal workings of such systems are often opaque and could be concealing flaws in their decision-making processes.

Intel looks for a new direction amid a changing IT (IDG Connect)

Intel has long been one of the most influential companies in the IT industry, credited with producing the first commercially available microprocessor chip and helping to drive numerous other advances, such as the PC and the widespread adoption of WiFi networking. In recent years, however, the PC market from which Intel has traditionally drawn a large chunk of its revenue has stagnated, or at least stopped growing as it once did, prompting many leading IT firms to reassess their business plans and seek a new direction in the pursuit of profitability.

Changing tracks: Examining VMware’s evolving hybrid cloud strategy (Computer Weekly)

VMware has long been one of the dominant players in the private cloud arena, and could even lay claim to having invented the concept as we now understand it. But the company’s efforts to expand into the public cloud have been less successful, prompting the firm to rethink its hybrid cloud delivery strategy and sell off its vCloud Air business.

Gemini PDA: will professionals favour the scion of Psion? (IDG Connect)

UK startup Planet Computers caused a sensation at the recent Mobile World Congress show in Barcelona with the announcement that it is bringing to market a spiritual successor to the iconic British Psion handheld computers of the 1990s, aiming to offer techies and mobile workers a device with a fully tactile keyboard for doing real work.

Re-shaping data center servers (Datacenter Dynamics)

Servers are the linchpin of the modern data center, but the server market is in flux as trends such as cloud computing, social media and analytics change the demands on compute power, while the need for greater energy efficiency and flexibility continues to grow.

5G is coming… once boffins can figure out what it’s for (IDG Connect)

5G networks are moving closer to realisation with the launch of the first silicon designed to support next-generation mobile standards, but this glosses over the fact that those standards are nowhere near defined and, worse still, the telecoms industry has yet to even agree on what the key use case is that 5G networks are meant to target.

Moore’s Law is running out – but don’t panic (Computer Weekly)

Intel kicked off CES 2017 in Las Vegas with the declaration that Moore’s Law is still relevant as it slated its first 10nm (nanometre) processor chips for release later this year. Despite this, engineers are facing real issues in how to continue to push system performance to cope with the growing demands of new and emerging datacentre workloads.

On Premises Object Storage Mimics Big Public Clouds (The Next Platform)

Object storage is not a new concept, but this type of storage architecture is beginning to garner more attention from large organisations as they grapple with the difficulties of managing increasingly large volumes of unstructured data gathered from applications, social media, and a myriad other sources.

No ARM in a bit of server market competition (IDG Connect)

Qualcomm has conducted a live demonstration of its upcoming ARM-based server processor, driving home the chipmaker’s intentions to bring high performance ARM servers to market. The firm joins a growing list of vendors that have sought to make a mark with the ARM architecture in the datacentre, yet the industry continues to be dominated by Intel.

Intel’s hyperscale blueprint (Datacenter Dynamics)

Intel has been talking about its vision for the software-defined hyperscale data center for several years, but with few tangible results. Originally dubbed Rack Scale Architecture (RSA), this has now been rechristened as Intel Rack Scale Design (RSD), with the recent release of the version 1.0 specifications. Systems that are compliant with these are expected to be available from key vendors before the end of 2016.

Using cross-industry collaboration on open standards to address big data in the datacentre (Computer Weekly)

The Gen-Z Consortium was unveiled to the world in October 2016 with the aim of bringing to market a high-performance standard for connecting storage-class memory to processors in a server.

But Gen-Z joins a couple of other nascent standards announced this year – OpenCAPI and CCIX. The former is focused on delivering a standard for connecting hardware (such as GPUs), while the latter is focused on ensuring cache coherency across multiple devices.