Brendan Bouffler
AWS Research Cloud
The promise of the supercomputing industry has always been that the pioneering, bleeding-edge technologies we build at hyperscale filter down to the mainstream quickly enough to fuel exciting stories about how the phone in your pocket is as powerful as the world's fastest machine from not so many years ago.
Whilst this is true of the hardware, very few scientists will tell you that ease of use, capacity or accessibility have moved in such leaps and bounds. Moreover, whilst collaboration with others around the globe brings a lot of benefits, factors like data security and privacy can frequently – and suddenly – demand a great deal of attention.
To keep our pace of scientific discovery going, however, our most pressing task as a community is to give working scientists access to the infrastructure they need in a form that is easy to use, secure for their data and scalable to the limits of their ideas – not to the limits of their laptop's memory.
This means taking an approach inherently different from what has come before. We'll survey some of the work going on around the world and draw on lessons learned (both good and bad) to help shape what needs to be a genuinely innovative approach to computing – not just business as usual with more bandwidth.