December 2 & 3, 2019
Cambridge, MA
The fundamental nature of computing is changing. AI, neuromorphic chips, edge computing, 5G, quantum computing, and the internet of things are making it possible to solve problems in entirely new ways and unleash new business opportunities. But they could also create new security risks or even shift the balance of geopolitical power.
Future Compute will prepare you to understand these trends and how they will affect your industry. After all, computing today isn't just hardware and software—it's everyware.
An executive overview of quantum computing and its implications for business in the future. Industry and academic experts will examine the current state of quantum computing, initial applications of the technology, and opportunities for future disruption. This primer will prepare you for the quantum state change ahead.
Topics to include:
Presenting partner
At Future Compute, MIT Technology Review offers a curated executive summary of the computing landscape, tailored to decision-makers across all industries. We’ll examine the technologies that are poised to disrupt business over the next 24 months and decipher the risks and opportunities.
Topics will include:
Presenting partner
Future Compute takes place at the MIT Media Lab in the heart of the MIT campus in Cambridge, Massachusetts. Here you can't help but feel the excitement and inspiration of being at one of the world's top universities, surrounded by some of the best minds in technology.
MIT Media Lab
75 Amherst Street
Cambridge, MA 02139
The mission of MIT Technology Review is to make technology a greater force for good by bringing about better-informed, more conscious technology decisions through authoritative, influential, and trustworthy journalism.
Full conference registration includes access to all conference sessions across two days, networking opportunities, and a closing toast. Single day passes are also available.
All paid attendees are offered a complimentary one-year print subscription to MIT Technology Review magazine as well as registration for The Download, our email newsletter delivered each weekday to your inbox.
Come learn how seismic shifts in computing technology are going to affect every industry on the planet.
We’re gathering experts working on the cutting edge of advancements in artificial intelligence, 5G connectivity, and quantum computing. Have someone you would like to hear from? Submit your nomination here.
Boston Marriott Cambridge
50 Broadway, Cambridge, Massachusetts 02142
(5-minute walk to the conference venue)
Discounted Rate: $189/night + tax
Deadline: November 11, 2019
Book online here or call 1-800-228-9290 or 617-494-6600 (mention MIT Technology Review Future Compute)
Senior Vice President, ERP Program Executive, and Chief Information Officer, Federal Reserve Bank of Boston
Chair, MIT Sloan CIO Symposium
EVP of R&D and Chief Product Officer, D-Wave
Chief Scientist, The Johns Hopkins University Applied Physics Laboratory
Head of Airbus Blue Sky, Airbus
CEO and Publisher, MIT Technology Review
Partner, Prelude Ventures
Director of Neuromorphic Computing Lab, Intel
Director of Advanced Cryptography, BlackHorse Solutions
President, RSA
Senior Research Scientist, Google
Graduate Student, MIT
Investment Manager, BASF Venture Capital
AI Reporter, MIT Technology Review
Partner, DCVC
Senior Editor for Cybersecurity, MIT Technology Review
Dean, MIT Schwarzman College of Computing
Delivery Lead IBMQ, IBM
Group Chief Information and Digital Officer, National Grid
Senior Vice President and General Manager, Silicon Engineering Group, Intel
Founder and CTO, Graphcore
Corporate Vice President and Head of Advanced Research, AMD
CEO and Cofounder, Akamai Technologies
Editor in Chief, MIT Technology Review
Cofounder and Chief Scientist, IonQ; Distinguished University Professor and the Bice Zorn Professor of Physics, University of Maryland
Principal Investigator, Engineering Quantum Systems Group, MIT
CEO and President, Quantum Xchange
Distinguished Engineer, Google
Associate Professor, MIT Media Lab
CEO Emeritus, CTRL-labs, Facebook Reality Labs
Senior Editor for Biomedicine, MIT Technology Review
Executive Editor, MIT Technology Review
Editor at Large, MIT Technology Review
CEO and Cofounder, Zapata Computing
Cofounder and Chief Scientist, Quantum Circuits; Sterling Professor of Applied Physics and Physics, Yale University
Research Scientist, Oak Ridge National Laboratory
Professor, MIT
Chief Information Officer, MIT Lincoln Laboratory
General Manager of Quantum Software, Microsoft
Cofounder and CEO, Neurala
Chief Technology Officer, KAYAK
8:00 a.m.
9:00 a.m.
A welcome from MIT Technology Review’s editor in chief.
Gideon Lichfield, MIT Technology Review
9:15 a.m.
Moore’s Law continues to fuel innovation, resulting in increased scale and complexity. How can abstraction at this level enable powerful, efficient computing?
Jim Keller, Intel
Gideon Lichfield, MIT Technology Review
9:45 a.m.
Semiconductors made from rolled-up sheets of atom-thick carbon could run faster and consume less energy than silicon ones, but they are challenging to manufacture. Here we look at new advances that aim to propel the technology into the mainstream.
Max Shulaker, MIT
Gideon Lichfield, MIT Technology Review
10:10 a.m.
10:40 a.m.
Startups and chip makers are developing new types of processors specifically for AI and machine learning. Hear about new “intelligence processing units” that enable the massively parallel processing required by modern AI applications.
Simon Knowles, Graphcore
Karen Hao, MIT Technology Review
10:55 a.m.
By mimicking the neural systems of the human brain, neuromorphic computing aims to turbocharge AI applications and to operate at a fraction of the power required by conventional chips. Is this radical departure from conventional computing the future of AI?
Catherine Schuman, Oak Ridge National Laboratory
Karen Hao, MIT Technology Review
11:10 a.m.
Simon Knowles, Graphcore
Catherine Schuman, Oak Ridge National Laboratory
Karen Hao, MIT Technology Review
11:25 a.m.
Using a novel approach, it is now possible to control computer input with neural interfaces – no brain implants required. Hear about the practical implications of controlling devices with the nervous system and the impact this technology will have on the design of future human-computer interaction.
Thomas Reardon, Facebook Reality Labs
Antonio Regalado, MIT Technology Review
11:50 a.m.
Through the use of AR, AI, and sensors, it is becoming possible to make the invisible visible, in real time. This session explores new thinking on the use of AR and AI in the surgical theater as a tool for IA (intelligence augmentation), where AR becomes not just a display device but a sensing device.
Ramesh Raskar, MIT Media Lab
Antonio Regalado, MIT Technology Review
12:15 p.m.
As computing power evolves, new ways of interacting with systems are emerging, turning human sensory experiences into viable computing interfaces. How effective are today’s interfaces and what might the future hold?
David Blodgett, The Johns Hopkins University Applied Physics Laboratory
Antonio Regalado, MIT Technology Review
12:40 p.m.
1:40 p.m.
AI has provided computers with near-human levels of data perception, and neuromorphic computing aims to take this a step further – chips directly inspired by biological neural circuits so they can process new knowledge, adapt, and learn in real time at low power levels. This technology has advanced rapidly in recent years and, today, Intel’s Loihi neuromorphic research chip has a growing body of quantitative results demonstrating outperformance versus conventional architectures. The results point to compelling scaling trends, as these systems are scaled up to millions of neurons, providing a roadmap for future breakthroughs in AI. This session shares the latest progress from Intel’s neuromorphic research: algorithmic innovations, community-wide collaboration, and emerging real-world applications from gesture recognition to robotic control to monitoring power grids and more.
Mike Davies, Intel
Elizabeth Bramson-Boudreau, MIT Technology Review
2:00 p.m.
With the rise of AI, IoT, and 5G technology, edge computing promises to address concerns with latency, bandwidth costs, security, and privacy. Understand how its rise will impact a variety of intelligent applications, like autonomous vehicles, AR/VR, personalized health care, and smart factories.
Tom Leighton, Akamai Technologies
Michael Reilly, MIT Technology Review
2:15 p.m.
Growing volumes of data, smarter edge devices, and new, diverse workloads are causing demand for computing to grow at phenomenal rates. How do you respond to the opportunity of exponentially increasing compute capacity at a fixed cost? In what ways are hardware and software converging? And what are the innovations and trends on the horizon that will shape the future computing landscape?
Parthasarathy Ranganathan, Google
Michael Reilly, MIT Technology Review
2:30 p.m.
More massive data thefts. Ransomware attacks that lock down computers in cities and businesses. Software that targets safety systems. The litany of cyber threats we face gets ever longer—and more frightening. As we consider the future of computing, how can we make it more secure by design? And what else should we be doing to tackle the hacking plague?
Rohit Ghai, RSA
Michael Reilly, MIT Technology Review
2:45 p.m.
Tom Leighton, Akamai Technologies
Parthasarathy Ranganathan, Google
Rohit Ghai, RSA
Michael Reilly, MIT Technology Review
3:05 p.m.
From evolving technology to the preservation of our social conscience, it will take a combined effort to educate, train, and normalize the workforce of the future.
Dan Huttenlocher, MIT Schwarzman College of Computing
David Rotman, MIT Technology Review
3:25 p.m.
3:55 p.m.
In every industry, technology is now a critical driver of success and the fundamental nature of computing is changing. AI, neuromorphic chips, edge and cloud computing, 5G, quantum computing, and the Internet of Things are making it possible to solve problems in entirely new ways and unleash new business opportunities. Here’s how senior executives are preparing for the coming shifts.
Don Anderson, Federal Reserve Bank of Boston
Adriana Karaboutis, National Grid
Robert Solis, MIT Lincoln Laboratory
Giorgos Zacharia, KAYAK
Lindsey Anderson, MIT Sloan CIO Symposium
4:40 p.m.
Once esoteric tech, AI, visual analysis, and deep learning systems are now transforming the innovation of intelligent products and devices, making enterprises more competitive and jobs more efficient. What does this mean within the enterprise space? How does it translate into worksite and process efficiency?
Max Versace, Neurala
David Rotman, MIT Technology Review
5:05 p.m.
There is an intense race taking place between America, China, and other countries to develop ever more powerful supercomputers. And it may not be long before the world sees the first exascale machine—a computer capable of a billion billion calculations a second, or one exaflop. Why are supercomputers still important and what impact does their development have on the rest of the computing ecosystem?
Alan Lee, AMD
Gideon Lichfield, MIT Technology Review
5:30 p.m.
8:30 a.m.
9:00 a.m.
A welcome from MIT Technology Review’s editor in chief who will set the stage for the day’s program as we prepare for the Years to Quantum.
Gideon Lichfield, MIT Technology Review
9:10 a.m.
This session will provide a crash course on quantum computing: how it works, why it’s so powerful, and where quantum computing will have its greatest impact. This introduction to the world of quantum computing will provide a baseline understanding for the day’s upcoming sessions.
William Oliver, MIT
David Rotman, MIT Technology Review
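A hint at the "why it's so powerful" part of this primer, using nothing but arithmetic (a plain-Python sketch; the session itself covers the physics): describing an n-qubit register classically requires tracking 2^n complex amplitudes, so the cost of simulation doubles with every qubit added.

```python
# Simulating an n-qubit quantum register on a classical machine means
# storing 2**n complex amplitudes: the memory cost doubles per qubit.
def statevector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (1, 10, 53):  # 53 is the working-qubit count of Google's Sycamore
    amps = statevector_size(n)
    # complex128 amplitudes take 16 bytes each
    print(f"{n:>2} qubits -> {amps} amplitudes (~{amps * 16:.1e} bytes)")
```

At 53 qubits the state vector alone needs on the order of a hundred petabytes, which is why even modest quantum machines are so hard to simulate classically.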
9:45 a.m.
Betamax or VHS? PC or Mac? In the early stages of any new technology, different technical implementations vie for commercial supremacy, and quantum computing is no different. This session explores progress being made with several different approaches to building quantum machines.
Chris Monroe, IonQ, University of Maryland
Rob Schoelkopf, Quantum Circuits, Yale University
Krysta Svore, Microsoft
Gideon Lichfield, MIT Technology Review
10:30 a.m.
10:55 a.m.
On October 23, Google revealed that Sycamore, their 54-qubit quantum computer, was able to complete a task in 200 seconds that it estimated would take over 10,000 years on a classical computer. This is the first time a quantum computer has carried out a specific calculation that is beyond the practical capabilities of today’s most powerful supercomputers. Learn about this milestone achievement and what it means for the future of quantum computing.
Marissa Giustina, Google
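The scale of that claim is easy to check with back-of-envelope arithmetic, using the 200-second and 10,000-year figures from the announcement (365.25-day years assumed):

```python
# Implied speedup of Sycamore's 200-second run over the estimated
# 10,000-year classical runtime for the same sampling task.
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # 31,557,600 s
classical_seconds = 10_000 * SECONDS_PER_YEAR  # ~3.16e11 s
quantum_seconds = 200

speedup = classical_seconds / quantum_seconds
print(f"implied speedup: ~{speedup:.2e}x")     # roughly 1.6 billion-fold
```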
11:20 a.m.
We’re still in the early days of quantum computing, but it already shows huge promise for business applications across industries. This session provides a high-level overview of the capabilities it offers and highlights some of the areas likely to benefit first, including optimizing supply chain logistics and powering research into new materials and drugs.
Alan Baratz, D-Wave
Michael Reilly, MIT Technology Review
11:45 a.m.
There is widespread agreement that quantum computers will upend the current security protocols that protect global financial markets and the inner workings of government. This session explores the measures and countermeasures necessary to protect your data in the quantum world.
Roberta Faux, BlackHorse Solutions
Patrick Howell O'Neill, MIT Technology Review
12:10 p.m.
The threat posed by cyberattacks is forcing governments, militaries, and businesses to explore more secure ways of transmitting information. Quantum key distribution is in use today and could be the key to securing data for years to come. How does this mechanism work, and is it safe?
John Prisco, Quantum Xchange
Patrick Howell O'Neill, MIT Technology Review
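The best-known QKD scheme, BB84, is easy to sketch in its classical outline. The toy simulation below (plain Python; the function and variable names are our own, and no eavesdropper or channel noise is modeled) shows only the "sifting" step: Alice encodes random bits in randomly chosen bases, Bob measures in his own random bases, and the two publicly compare bases and keep just the positions where they matched. The actual security comes from quantum physics (measuring in the wrong basis disturbs the state, revealing an eavesdropper), which a classical sketch cannot capture.

```python
import secrets

def random_bits(n: int) -> list[int]:
    """n independent, uniformly random bits."""
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n: int = 32) -> list[int]:
    """Toy BB84 sifting step: keep bits where Alice's and Bob's bases match."""
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)  # 0 = rectilinear, 1 = diagonal
    bob_bases = random_bits(n)
    # With matching bases Bob recovers Alice's bit exactly; mismatched
    # positions are discarded during the public basis comparison.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift()
print(f"sifted key ({len(key)} bits): {key}")
```

On average half the positions survive sifting, so a 32-bit exchange yields a raw key of roughly 16 bits; real systems then run error correction and privacy amplification before the key is used.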
12:30 p.m.
1:30 p.m.
Quantum experts and corporate executives provide actionable insights into how quantum technologies are being deployed — and will be deployed in the future — to spur radical innovation across industries.
Current quantum AI networks are no match for neural networks running on powerful conventional computers today. But looking ahead, quantum machines may gain an edge for certain types of AI challenges. It’s time to start exploring this future.
Blake Johnson, IBM
Karen Hao, MIT Technology Review
Quantum technologies are expected to create a massive paradigm shift in the way aircraft are built and flown. Airbus is an early adopter of quantum technologies, using them to enhance the performance of its products and services and to solve the most complex aerospace challenges.
Thierry Botter, Airbus
Karen Hao, MIT Technology Review
Simulation of chemical bonds and reactions is expected to be one of the first applications for at-scale quantum computers. Learn how quantum computing is enabling breakthroughs in chemistry that could lead to new materials, new batteries, and new medicines.
Jacob Grose, BASF Venture Capital
Christopher Savoie, Zapata Computing
David Rotman, MIT Technology Review
3:00 p.m.
3:30 p.m.
Venture capital investment in quantum technologies has been ramping up. Hear from leading investors about where they think the field is heading, and what key trends to look out for over the next few years.
James Hardiman, DCVC
Mark Cupta, Prelude Ventures
Michael Reilly, MIT Technology Review
4:10 p.m.
In the coming quantum era, organizations will need people with a unique mix of knowledge of physics, engineering, and software to help them profit from these exciting new technologies. But the number of researchers who currently fit this profile is quite small. How should we be developing the future quantum workforce, and what steps should businesses be taking to harness this rare talent now?
Chris Monroe, IonQ, University of Maryland
Blake Johnson, IBM
Amy Greene, MIT
Gideon Lichfield, MIT Technology Review
4:40 p.m.
Closing remarks from MIT Technology Review’s editor in chief.
Gideon Lichfield, MIT Technology Review
4:45 p.m.