Disruptive Technologies

Steve Streeting tweeted a few weeks ago: “Remember, experts are always wrong about disruptive tech, because it disrupts what they’re experts in.” I’m happy to evangelise and work with such a disruptive technology, and it will take time before it is bypassed by other technologies. Those other technologies will probably be source-to-OpenCL-source compilers. At StreamHPC we therefore keep continuous track of all these pre-compilers.

Steve’s tweet triggered me, since the stability-vs-progression balance makes change quite hard (we see it all around us). Another trigger was the opening speech of Engineering World 2011 about “the cloud”, with a statement that went something like: “80% of today’s IT will be replaced by standardised cloud solutions”. Most probably true; today any manager could and should click together his or her “data from A to B” report instead of buying an “oh, that’s very specialised and difficult” solution. On the other hand, companies try to keep their existing business alive as long as possible. It is an intriguing balance.

So I came up with the idea of playing my own devil’s advocate and trying to disrupt GPGPU. I think it’s important to see what could disrupt the current parallel-kernel-execution model of OpenCL, CUDA and the others.

GPGPU’s disruption

It’s the end of 2012, and here’s the situation:

  • We did not get the World War III we were promised in movies and calendars (so we can put time and money into more useful things).
  • Windows 8 for ARM and x86 has launched, trying to compete with the disruptive OS technology ‘Android’.
  • Linux got a massive 200% growth on the desktop market, giving it a massive 3% market share.
  • GPGPU brought us hybrid processors (CPU+GPU), which we simply call ‘the CPU’ again.
  • GPGPU is now just “parallel computing” again.
  • Parallel-computing solutions are threads, messaging protocols and an evolved OpenCL 2.0.
  • x86 processors get hardware extensions for ARM support, to be able to run both “mobile widgets” and “power apps” on one processor. [a wild guess]
  • Docking stations (with keyboard, bigger screen, co-processor and local backup) for smartphones and tablets are the best-selling desktops. [link]

StreamHPC does nice business worldwide in parallel computing, and then “the unexpected” comes: the disruptive technology that will replace OpenCL and put us out of business. What could do that?

GPUs brought us programmable shaders, so what is being developed now that could be used in processor technology and completely replace OpenCL in 2013?

  • Biological processors. Grown brain tissue trained for specific purposes, or at least a fly’s brain to make your car navigation optimal. AI-style computing.
  • Quantum computing. Really vague for straightforward-thinking IT folks, so great disruption potential.
  • FPGAs. Always waiting to take over the world, which they will do once it becomes easier to program them.

Things like ternary processing instead of binary, optical computing, ARM processors and the inter-mesh (the internet as one big thinking mesh) would not replace but support OpenCL, so we stick with the above three – plus a fourth, hybrid processors, which for us is the really disruptive one. I only touch the tip of each idea, to leave space for your own thoughts. Please comment on what you think.

Biological Processors

A programmable brain – is that possible? If you read some scientific reports, it seems so.

First, let’s get it straight: for offices and homes most things are done already; we won’t get many new revolutionary products in the office suite, except maybe “MS Mind Map”. Instead we will see more functionality that assists the user in making fewer “dumb” mistakes. Also, the “traveller’s shortest route” problem and object recognition pop up in ever more situations. Therefore AI will become a lot more important in the coming years, and that is what we know brains are best at. Much work has been done in the past years to create an interface between silicon and brain tissue. There was much discussion about what humans can own and what is owned by nature; the promising results (read: money-makers) got some room, as long as the term “produced” was used instead of “grown”.

Disruptive

As the market of buses, platforms and OSes becomes more a matter of business than of technology, the interaction with humans might become much more important. The interface of the iPhone influenced the interfaces of all kinds of devices and opened up the market for tablets, because PCs had not found the right way to bring such an interface in.

Since AI is much closer to chaotic processes like human behaviour, the weather and other butterfly-effect systems, it could pick the winning platform. Even if AI is not the focus of a company’s in-house technologies, it could have influence in a large market.

Non-disruptive

While biological processors might have much greater processing power, they lack two things: precision and predictable outcomes. This might not be a problem at all, but people need security and the ability to trust technology. That’s why AI is expected to see the greatest growth in everything that deals with probability and expectation, but not in exact (post-)calculations. Also, people will find it awkward to have live brains in their navigation system, which will slow down acceptance.

Quantum Computing

I like the whole idea of using superposition (a system can be in several states at once, plus some probability calculations) to compute. From what I’ve read over the past years, it mostly works great in laboratories. If you have never heard of it, try the explanation on Wikipedia; it is too hard (for me) to explain here. I’m not talking about quantum data transport, which can detect man-in-the-middle attacks, but really just computing.

Once the problems of this type of computer are solved (price, stability and actually getting it to work), it can go very fast in replacing certain types of computers. The CEO of D-Wave – one of the first companies that claimed real success with quantum computing – is quite quiet on his blog. But you know how this goes: these kinds of things just need a few eurekas to get there.

Disruptive

The promise of quantum computing is big. It could handle data at speeds not possible with the nanometre approach. Just as the biological processor has a future in AI, quantum computing will have a future in specialised subjects like encryption and other complex problems.

Non-disruptive

While it will score certain wins in terms of speed, it has the same problem as biological computers: trust. Getting this into mass production might take another few years, so it will not disrupt the market at a high pace. Compare it to SSDs: while better than conventional hard disks, they have not replaced them.

FPGAs

Why program in software if you can program in hardware? Why send a signal around when you can lay a path directly? FPGAs have proven themselves many times, but never reached the same level of flexibility because they are hard to program. For comparable functionality, the number of lines of code in OpenCL is a few factors(!) smaller than in the FPGA language VHDL.
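To give a feeling for that difference, here is a minimal sketch (my own illustration, not a benchmark): an element-wise vector addition written as an OpenCL kernel fits in a handful of lines, while an equivalent VHDL design also needs entity and architecture declarations, clocking and handshaking around the actual adder.

    // Element-wise vector addition in OpenCL C: one work-item per element.
    // The host side (context, queue, buffers) adds some more lines,
    // but the computational core stays this small.
    __kernel void vec_add(__global const float *a,
                          __global const float *b,
                          __global float *c)
    {
        size_t i = get_global_id(0);  // index of this work-item
        c[i] = a[i] + b[i];
    }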

There is a lot of talk about OpenCL for FPGAs. It makes sense, because OpenCL has great strength in parallel processing, and the OpenCL method of using kernels could describe FPGA designs pretty well. Until now there has not been much news of FPGA companies taking OpenCL really seriously.

Disruptive

As OpenCL could be made to fit FPGAs, GPUs and CPUs could be replaced by this approach. We know OpenCL is technology-portable, not performance-portable, but optimising software could make the difference. The theoretical maximum speed of FPGAs can grow faster than that of GPUs since the architecture is flexible. There have been many experiments with CPUs that contain a programmable part, and these could now break through.
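A small sketch of what “not performance-portable” means in practice (host-side OpenCL; the kernel and device handles are passed in, error handling omitted): even a trivial tuning knob like the work-group size has to be chosen per device, and such per-device choices are exactly what optimising software would have to automate.

    #include <CL/cl.h>
    #include <stdio.h>

    /* Ask the OpenCL implementation which work-group size multiple this
     * device prefers for this kernel. The same kernel source runs
     * everywhere, but the fastest launch configuration differs per device
     * (often 32 or 64 on GPUs, usually 1 on CPUs). */
    static size_t preferred_local_size(cl_kernel kernel, cl_device_id device)
    {
        size_t multiple = 1;
        clGetKernelWorkGroupInfo(kernel, device,
                                 CL_KERNEL_PREFERRED_WORK_GROUP_SIZE_MULTIPLE,
                                 sizeof(multiple), &multiple, NULL);
        printf("preferred work-group size multiple: %zu\n", multiple);
        return multiple;
    }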

Non-disruptive

FPGAs have existed for years, so why would they be able to replace current technologies now? They will keep finding their use in special-purpose sectors, as they do now.

Hybrid Processors

The most disruptive technology in GPGPU, seen from the GPU perspective, seems to be hybrid processors, like ARM’s SoCs, Intel Sandy Bridge and the upcoming AMD Fusion. Which way will they develop? Which CPU design will take the lead? It looks very promising and has pushed NVIDIA to focus on ARM.

Disruptive

OpenCL has a background in GPUs, but it is not GPUs that will make it big. The distance between the CPU and the GPU, wired together by a PCIe bus, is a bottleneck when data cannot be streamed; a round-trip of data costs a lot of clock cycles. “Integrated” has won many times before.
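A minimal host-side sketch of that bottleneck (illustrative only; error handling omitted and all handles passed in as parameters): the two transfer calls around the kernel launch are exactly the round-trip over the bus that an integrated, hybrid processor can avoid by sharing memory.

    #include <CL/cl.h>

    /* Classic discrete-GPU pattern: copy in, compute, copy out.
     * The two copies are the bus round-trip; on a hybrid processor
     * with shared memory they can largely disappear. */
    void run_once(cl_command_queue queue, cl_kernel kernel, cl_mem buffer,
                  const void *host_in, void *host_out,
                  size_t bytes, size_t global_size)
    {
        clEnqueueWriteBuffer(queue, buffer, CL_TRUE, 0, bytes, host_in,
                             0, NULL, NULL);   /* CPU -> GPU over the bus */
        clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global_size, NULL,
                               0, NULL, NULL); /* compute on the GPU */
        clEnqueueReadBuffer(queue, buffer, CL_TRUE, 0, bytes, host_out,
                            0, NULL, NULL);    /* GPU -> CPU over the bus */
    }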

Non-disruptive

The bus speed and bandwidth could be increased a lot with new techniques. What if the GPU gets connected via a special bus? Where there is demand, there will be a solution.

Conclusion

It will take some years before CPU manufacturers settle down from all the current big changes around OpenCL (for parallel calculations), SoCs (a.k.a. hybrid computing) and ARM (as a base CPU technology), so it is actually a good time to throw something new into the mix. Also, the shift from computers to devices (driven by un-upgradeable phones, tablets and Apple computers), combined with internet services (a.k.a. ‘the cloud’), makes it possible to introduce new technologies with ease. Do you want a quantum or brainy car-navigation set? In doubt? It has a familiar touch interface, runs on two AA batteries and thinks ahead.

OpenCL is a disruptive technology: it increases the calculation power of computers with high-end GPUs by a few factors. As discussed above, hybrid computing is another one, giving the GPU-only companies on the x86 market (read: NVIDIA) a hard time. ARM support by Android, Apple and Windows is the third in a row, making it hard for some companies to catch up.

For StreamHPC it is sometimes hard to explain in simple words why OpenCL will be a mainstream technology by the end of this year and why it is a good and wise investment, because people just extrapolate current technologies. I can tell you: we don’t live in a solution 1.0 or 2.0 world any more; we live in a solution-alternative A or B world now.

I hope I gave you food for thought. And if you need to get those standard reports sped up, you know where to find us.

http://www.dwavesys.com/

4 thoughts on “Disruptive Technologies”

  1. rabit

    I do think FPGAs could very well become the “next big thing” for quite a few reasons. HDL languages like VHDL & Verilog are not hard to program, just very different from software. You just have to be aware you’re not programming for a single processing element chugging through lists of compiled operations one by one, but thousands of interconnected simple ones each running parts of your code simultaneously. Simple as that 😉 Well not really, but Xilinx and Altera provide free dev tools that make it very easy to get started.

    FPGAs are very complementary to CPU and GPU tech. I’ve long believed that the coolest thing ever would be an FPGA sitting between a CPU and a GPU as a sort of macro pipeline processor, allowing game devs to reduce unnecessary traffic by offloading it to the FPGA.

    FPGAs are getting cheaper and more powerful. They still have a long way to go to approach ASICs in terms of density, but that isn’t important. There are countless operations that even a simple >250k-gate FPGA could do for general computing. Filesystem data encryption/compression comes to mind.

    FPGAs are really the very first step into the other tech you described here. The key I think is getting away from the conventional idea that memory and processing have to be separate entities. People always talk about the future of 1024-core computing but don’t understand the huge constraints of conventional architecture, bus contention, etc. I think the future is in devices containing billions of simple logic elements and FPGAs are just the Model T of this new generation.

    • Vincent Hindriksen Post author

      I think I completely disagree. FPGAs have never proven themselves to become a mainstream technology, despite being quite old and trusted by many. They are currently good for prototyping and specialised products (e.g. in the financial market). You know shader languages had the same problem (a hardware language, not a software language), and GPGPU changed that enough to lift it; you need “GPFPGA”. Did you notice I subtly described OpenCL as a potential disruptive technology for VHDL and Verilog in the article? What do you think?

      The other reasons you give (price, power) also apply to current CPUs and GPUs; FPGAs might win in performance, but in the coming years not in FLOPS/€ compared to off-the-shelf silicon. There is also a limit to effective cache size (what you call separate memory here), which does not change the game in favour of FPGAs. I don’t get your last remark; any architecture that fits in that meta-FPGA can be in conventional CPUs too.

      I do watch developments in this sector closely. Don’t misunderstand me; I think FPGAs rock, but not for the mass market (yet).

    • Vincent Hindriksen Post author

      Your article talks about disruptive strategies when entering a new market, while my article is from the perspective of the current stakeholders in such a market. But it’s a nice article; it shows the stability-vs-progression balance from another perspective.
