Executive summary

The two-day workshop brought together Fellows from CIFAR’s program in Quantum Information Science along with leaders in academia, industry and government to discuss, from a research and development perspective, the progress to date, major bottlenecks and future opportunities related to the development of quantum repeaters and their integration into large-scale quantum networks. Presentations, all of which were invited, focused on the state-of-the-art of the science and open problems. Topics included applications, protocols, components, interfaces, and strategy to create partnerships with industry and other stakeholders.

The backdrop for a discussion on quantum networks is the expectation that, with major efforts from academic researchers and industry players (e.g. Google, IBM, Microsoft, D-Wave), quantum computers (QC) will become available in the foreseeable future. This splits quantum network applications into one class that seeks to overcome the security issues caused by QC (such as QKD, secret sharing, etc.), one class that seeks to utilize the unique capabilities of QC (such as quantum cloud computing, blind quantum computing, etc.), and one class aimed at quantum-enhanced sensing. Important to note in these contexts is a) that mathematics-based approaches, which are also believed to protect privacy against QC, currently have more support than quantum cryptography, and b) that the possibility of linking modular quantum computing units is already being investigated in the QC community, e.g. to overcome physical constraints in dilution fridges.


The quantum links making up a quantum network will, depending on distance and environment, likely consist of different types of channels, e.g. free-space satellite and ground links as well as fibre-optic links. In recent years, the capabilities of both satellite and real-world fibre links have grown at a fast pace, with Canadian researchers at the forefront. For long-distance, fibre-based links, quantum repeaters are needed, possibly using different protocols, e.g. based on discrete or continuous variables, each presenting a set of advantages and challenges.

The individual components for a network are at varying levels of maturity and quality. Single-photon sources based on spontaneous parametric down-conversion or four-wave mixing are extremely reliable but suffer from a trade-off between rate and purity. Single-emitter-based sources, such as quantum dots and diamond colour centres, have progressed at a fast pace over the past few years but still face a number of engineering problems to increase coupling efficiency, spectral stability, and yield. High-efficiency single-emitter-based sources operating at telecom wavelengths, though highly desired, have yet to be demonstrated. With quantum dot sources having reached a certain level of maturity, the development of ‘plug and play’ demonstrator units would go a long way in accelerating the implementation of these sources in quantum network test-beds.

The availability of fast and efficient superconducting detectors — even on the commercial market — has removed the wavelength limitations of previously used avalanche photodiodes, thus facilitating the use of telecom-wavelength photons. A remaining challenge is to resolve multiplexed degrees of freedom, e.g. with spectrally resolving detectors. The development of suitable quantum memory — based both on single emitters (such as NV centres) and on ensembles (e.g. rare-earth-ion-doped crystals) — is still challenging in that no currently operating memory meets all the benchmarks desirable in a quantum repeater/network. Quantum transducers that link optically encoded quantum states to microwave qubits usable in superconducting QC are the least mature component. Several approaches are currently being pursued, with those based on nano-mechanical oscillators currently leading the way. However, no system has yet demonstrated transduction at the quantum level. It is likely that transducers will be the main bandwidth-limiting component in a network. We note that QC platforms based on, e.g., trapped ions and quantum dots possess optical transitions and thus eliminate the need for a transducer.

An organizational problem is that there is too little incentive to tackle engineering problems, optimize components to meet all requirements (not just excel at one) and develop and analyze complete (and integrated) systems. More collaborations and the creation of test-beds where researchers with different skills can come together would help to remove this obstacle. In addition, test-beds are a good way to showcase research to industry, decision-makers and the public. The prospect of job-creation and opportunities for generating spin-off enterprises must be communicated. A good example is the new European Quantum Technology Flagship Initiative. The Quantum Canada initiative, along with subsidiary fora, is currently exploring a similar approach in Canada.

Industry can provide both investment and a market for the technologies that researchers develop, and it can leverage its connections with decision-makers in government. But for the private sector to get involved in the development of quantum networks, the academic research community and industry must learn how to work together and create mutual understanding of each other’s objectives and problems. Entities such as the Canadian Photonics Industry Consortium could have an important role, along with intermediary organizations and university industry-liaison offices. It may be that Canada’s funding ecosystem is missing a piece for strategic and ground-breaking innovation (similar to DARPA in the US) or an entity that will incubate and fund start-ups in the quantum technology space. Government also has an important role to play as a first adopter of new quantum technology, e.g. quantum-secured communication links. This would then provide an incentive for an industrial ecosystem to develop.

To make a quantum network a truly compelling proposition to decision-makers and the public, we must formulate a simple and clear grand vision as captured by the term “moon-shot”. This vision can feature a single clear and captivating goal, such as teleporting from coast to coast, based on technologies that hold commercial and strategic value and that can engage industry and government. Given the growing public attention to online privacy, a term like transforming cyber-security could also be used. It is important to have a message that projects imagined opportunities and is not couched in technical terms. We must emphasize the urgency of investing now and point to the risk-management aspect in a future world with widely accessible quantum computers.

Summary of workshop sessions and discussions

(+) Session 1: Applications

Chaired by Ben Sussman with talks by Zac Dutton and Alexandre Blais

One projection that will guide all discussions about quantum networks is that quantum computers (QC) will be built. Thus, a quantum network should i) have the potential to link such QCs and ii) provide physics-based security for communication that would otherwise be compromised when using QC-vulnerable encryption protocols.

Starting with the latter (ii), small-scale QKD networks already exist in several locations. A future challenge is to extend the distances with quantum repeaters. Different approaches and protocols exist, but progress hinges on engineering challenges related to improving the performance of components. It is important to note that there currently is a clear preference in the cryptographic standards community to stick to mathematics-based encryption protocols (such as lattice-based encryption), although it is hard to imagine that they can be proven safe against QC attacks.

As for (i), the case for linking QCs still has to be communicated more clearly. The most basic argument is that linked quantum computers could be significantly more powerful than individual ones. There is also a suite of quantum privacy algorithms, such as blind quantum computing, that would be enabled by a quantum network, but a compelling argument why someone would need this is still missing. From a practical point of view, some researchers, e.g. R.J. Schoelkopf and M. Devoret, C. Monroe, and J. Kim, are working on modular quantum computer architectures, mainly motivated by the finite capacity to accommodate superconducting qubits in a dilution fridge or ions in a trap. Such closely-spaced modules (containing 10-1000 qubits) would be connected using photons in the microwave or optical domain, which opens an avenue to also interconnect distant QCs.

The discussion following this session also focused on specifying the benefits of connecting QCs into networks. Although some interesting proposals exist for distributed quantum sensing, they do not yet present a compelling case for building a quantum network. Secrecy will become more and more important in many sensing applications; however, this once again boils down to a secure-communication application.

(+) Session 2: Quantum Repeater Protocols

Chaired by Zac Dutton with talks by Christoph Simon and Alexander Lvovsky

Quantum link architecture will depend on the distance that it bridges. For short distances (<500 km), direct ground-based links (possibly with trusted nodes) would be used, intermediate distances (500-2000 km) would rely on ground-based quantum repeaters (QR), national/continental distances of 1000-5000 km would employ low-earth orbit (LEO) satellites, while intercontinental distances would rely on geo-stationary (GEO) satellites or quantum repeater architectures with satellite links. Since satellite-based links require receivers in low-light (i.e. remote) locations, they would be supplemented by ground-based (repeater) links to urban centres.

A quantum repeater (QR) allows the efficient distribution of entanglement over a long link by first distributing entanglement over several shorter sub-links. Using entanglement swapping, entanglement can then be shared between the end-nodes of the entire link. There are a number of distinctions to be made with regard to QR protocols.

A first distinction is whether errors in the transmitted qubit states, caused either by qubits being lost during transmission or by a possible purification operation at the nodes, are corrected by heralding (post-selection) or by error-correcting codes. The latter is in principle faster, but requires very short sub-link distances, as it tolerates only 50% loss between nodes. Furthermore, it requires high-fidelity qubit gates. The former is more robust against loss and errors, but relies on two-way communication and quantum memory with comparably long storage times.
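As a rough illustration of why dividing a long link into sub-links helps, the sketch below compares the photon transmission probability of a single 1000 km fibre span with the per-segment transmission when the same span is divided into shorter sub-links. It is an illustration with assumed parameters (standard 0.2 dB/km telecom-fibre attenuation), not a result presented at the workshop.

```python
# Minimal sketch (assumed 0.2 dB/km fibre attenuation, not a workshop result):
# per-segment photon transmission improves dramatically as a 1000 km link is
# divided into shorter sub-links, which heralded repeater protocols exploit.

ATTENUATION_DB_PER_KM = 0.2  # assumed standard telecom-fibre loss

def transmission(length_km: float) -> float:
    """Probability that a photon survives a fibre of the given length."""
    return 10 ** (-ATTENUATION_DB_PER_KM * length_km / 10)

TOTAL_LENGTH_KM = 1000
for segments in (1, 2, 4, 8, 16):
    sub_link = TOTAL_LENGTH_KM / segments
    print(f"{segments:2d} segment(s) of {sub_link:6.1f} km: "
          f"per-segment transmission = {transmission(sub_link):.2e}")
```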

A second distinction is whether the protocol uses i) discrete variables (single photons) or ii) continuous variables to encode quantum information.

i) Discrete-variable QR can tolerate rather large amounts of loss and errors. It has been treated extensively in theory and is the subject of extensive experimental efforts. For protocols that rely on quantum memory, one may employ single-emitter/absorber systems such as neutral atoms in high-finesse cavities, trapped ions, and NV centres. Alternatively, one can use ensemble-based absorbers/emitters such as hot or cold atomic vapours and solid-state ensembles. NV centres and rare-earth-ion ensembles doped into crystals are covered more extensively in the components session.

ii) Continuous-variable protocols allow the Bell-state measurement (BSM) to be performed with unit efficiency. However, they rely on close-to-perfect state preparation and quantum memory. Different protocols exist, e.g. based on cat states and EPR states.

Key to advancing quantum repeaters is the demonstration of quantum supremacy, i.e. a case where the repeater outperforms (in terms of, e.g., secret key rate) the best direct transmission of qubits.
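To make this benchmark concrete, the sketch below evaluates the commonly used repeaterless secret-key capacity bound of a pure-loss channel, -log2(1 - η) bits per channel use, for standard 0.2 dB/km fibre. This is our illustrative reference point, not a figure quoted at the workshop; a repeater demonstrates an advantage once its secret-key rate per channel use exceeds this bound.

```python
# Sketch (assumed model, not workshop data): the repeaterless secret-key capacity
# bound -log2(1 - eta) for a pure-loss channel with transmission eta, evaluated
# for assumed 0.2 dB/km fibre. Any repeater claiming an advantage must beat this.

import math

def channel_transmission(length_km: float, att_db_per_km: float = 0.2) -> float:
    """Transmission of a fibre of the given length."""
    return 10 ** (-att_db_per_km * length_km / 10)

def repeaterless_bound(eta: float) -> float:
    """Upper bound on secret-key bits per channel use without a repeater."""
    return -math.log2(1 - eta)

for length_km in (100, 200, 400, 600):
    eta = channel_transmission(length_km)
    print(f"{length_km:4d} km: eta = {eta:.2e}, "
          f"repeaterless bound = {repeaterless_bound(eta):.2e} bits/use")
```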

In the discussion following this session, it was mentioned that a limiting factor for discrete-variable QR is the 50%-efficient BSM. This value can in principle be increased to 100% by using ancillary photons or, e.g., a superconducting qubit-based quantum processor. However, if the latter entails using a transducer to convert an optically encoded qubit to a processor qubit, this may add loss or result in bandwidth limitations. It may be that different network configurations are optimal for different applications.

The question of how to cover the total costs needed to construct a quantum network led to a number of suggestions. First, there is the possibility to spin off technologies developed as part of the research, but with applications in potentially completely different fields (case examples are Montana Instruments and S2 Corporation in Bozeman). Second, the network costs can be brought down by using existing infrastructure (such as fibre-optic networks), provided quantum signals can be shown to co-exist with, i.e. impart minimal disturbance to, existing traffic (e.g. classical communications). Third, one has to consider that the cost of failure, i.e. a breach of security, could be large enough to warrant even an expensive secure communication system.

One impediment to progress is that engineering challenges are not always valued in the quantum-science community, and hence some important optimization problems go unsolved. More collaboration and work towards a unified goal is needed. There would be great value in having government-funded demonstrator quantum networks because they would spur such developments.

(+) Sessions 3 & 4: Components

Chaired by Michal Bajcsy with talks by Rob Thew, Dan Dalacu, Paul Barclay, Andy Sachrajda, Daniel Oblak, Charles Thiel, and Thomas Jennewein (incorporated into this report from session 5)

The components discussed in the talks ranged from sources, memories, channels and detectors/measurements of optical photons to qubit processors. Several of the presented systems can be tailored to operate as different components.

Sources: Sources of single and entangled photons are broadly grouped into probabilistic and deterministic types. To the former class belong spontaneous parametric down-conversion (SPDC) and four-wave mixing (FWM) sources, which excel in terms of reliability, tunability and indistinguishability of the emitted photons (as required for two-photon interference measurements in quantum repeaters). They suffer mainly from the probabilistic nature of the emission, including multi-photon-pair events, some of which can be suppressed by various memory-enhanced or multiplexing schemes.
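The sketch below illustrates the resulting rate-purity trade-off, assuming Poissonian pair-number statistics (a common approximation for multimode probabilistic sources); the numbers are illustrative only and not taken from the talks.

```python
# Rough sketch of the rate-purity trade-off of probabilistic pair sources,
# assuming Poissonian pair statistics (our approximation, not workshop data):
# increasing the mean pair number mu raises the pair rate, but the fraction of
# emission events containing more than one pair grows roughly in proportion.

import math

def pair_probabilities(mu: float, n_max: int = 3) -> dict:
    """P(n pairs per pulse) for mean pair number mu, Poissonian approximation."""
    return {n: math.exp(-mu) * mu**n / math.factorial(n) for n in range(n_max + 1)}

for mu in (0.01, 0.05, 0.2):
    p = pair_probabilities(mu)
    emit = 1 - p[0]                        # probability of emitting at least one pair
    multi_fraction = (emit - p[1]) / emit  # emissions containing two or more pairs
    print(f"mu = {mu:4.2f}: P(>=1 pair) = {emit:.3f}, "
          f"multi-pair fraction = {multi_fraction:.3f}")
```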

As sources of single photons, quantum dots (QDs) have improved significantly over the last decade, and for a given purity they can now achieve higher rates than SPDC-based heralded single-photon sources. A major problem — caused by spectral diffusion, timing jitter and coupling to phonons — is to achieve good indistinguishability of photons emitted from different sources. Finally, it is rarely stated how good the yield is in experiments, i.e. what fraction of tested QDs actually possess properties similar to those presented in the papers. Entangled photon pairs – typically in polarization – can be generated from QDs using a bi-exciton decay process. The main challenge for achieving high-fidelity entangled states is the anisotropic exchange splitting of the two relaxation channels, in addition to the issues mentioned above for single-photon emission. Only recently has a Bell-inequality violation been demonstrated for QD-based entangled-pair sources.

Colour centres in diamond, such as nitrogen-vacancy (NV) or silicon-vacancy (SiV) centres, are also attractive candidates for single-photon sources. At cryogenic temperatures, colour centres have narrow and strongly coupled optical transitions. Efficient collection of photons into a well-defined spatial mode has been greatly helped by solid immersion lenses. The probability of emission into the zero-phonon line is small for NV centres but better for SiV centres. Both spectral and spatial mode selection can be improved by Purcell enhancement of the coupling of a colour centre to a cavity mode, as is being pursued in several groups. It is worth pointing out that entangled photon-pair sources can be built from sets of independent, indistinguishable (efficient) single-photon sources.

Transmission channels: Different transmission channels for the optical qubits, e.g. free space or fibre-optic cables, present different challenges for qubit collection and detection. In a quantum repeater, entanglement is swapped by performing a Bell-state measurement, which requires the participating photons to be indistinguishable. Hence, in addition to the indistinguishability of the photons emitted by the sources, all perturbations during transmission must be compensated for. Such precise control of polarization, timing, frequency and spatial modes has been demonstrated for long-distance optical fibre transmission, but remains an open challenge for transmission through free space.

Links based on free-space transmission do not necessarily require repeaters, as loss, in the absence of atmospheric absorption, increases only quadratically (not exponentially, as in the case of fibres) with distance due to beam divergence. Challenges include background light (which can be overcome by placing terrestrial receivers farther than about 20 km from any major urban centre and operating the link at night), and the fast movement and intermittency of a satellite (which, to some extent, can be overcome by state-of-the-art tracking systems and geo-stationary satellites or a large number of low-earth-orbit satellites). The chief problem is unpredictable and rapid spatial-mode fluctuations due to atmospheric turbulence.
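The sketch below contrasts the two loss scalings numerically, using an assumed diffraction-limited beam-spreading model with hypothetical telescope apertures and wavelength, alongside standard 0.2 dB/km fibre; the parameters are illustrative and not taken from the talks.

```python
# Illustrative comparison (assumed parameters, not workshop data) of loss scaling:
# fibre transmission falls exponentially with distance, whereas a diffraction-
# limited free-space beam only spreads, so the collected fraction falls roughly
# quadratically once the beam outgrows the receiver aperture.

WAVELENGTH_M = 785e-9   # assumed optical wavelength
TX_APERTURE_M = 0.3     # assumed transmitter telescope diameter
RX_APERTURE_M = 1.0     # assumed receiver telescope diameter

def free_space_transmission(distance_m: float) -> float:
    """Fraction of a diffraction-limited beam collected by the receiver aperture."""
    beam_diameter = max(TX_APERTURE_M, 2 * distance_m * WAVELENGTH_M / TX_APERTURE_M)
    return min(1.0, (RX_APERTURE_M / beam_diameter) ** 2)

def fibre_transmission(distance_km: float, att_db_per_km: float = 0.2) -> float:
    """Transmission of a fibre of the given length (assumed 0.2 dB/km loss)."""
    return 10 ** (-att_db_per_km * distance_km / 10)

for d_km in (100, 500, 1000, 2000):
    print(f"{d_km:5d} km: free-space ~ {free_space_transmission(d_km * 1e3):.1e}, "
          f"fibre ~ {fibre_transmission(d_km):.1e}")
```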

Schemes for satellite links can be divided into two categories: i) the downlink configuration, in which a photon-pair source on a satellite transmits to terrestrial receivers, thereby entangling them; and ii) the uplink configuration, in which a Bell-state measurement is performed on the satellite as part of a quantum repeater link or to do measurement-device-independent QKD. However, a BSM with photons that have travelled over free-space channels is a so-far unmet experimental challenge.

Globally, the quantum space race is heating up, with Chinese and Singaporean groups having launched quantum hardware into space and, in the former case, established a quantum downlink to two terrestrial stations. In addition, several proof-of-principle experiments have demonstrated aspects of a satellite-based system. In Canada, funding for a satellite communication project has recently been announced by the federal government, and a launch date in 3-4 years is anticipated. The satellite payload has been designed for the uplink configuration, which will allow more flexibility in terms of experimental studies. For instance, different types of sources can be employed at the terrestrial stations. The long-term dream scenario would be a satellite with an entanglement source and a long-lived quantum memory on board, which would allow connecting any two stations on Earth that the satellite passes over.

Detectors: Until recently, single-photon detectors (SPDs) were generally based on avalanche photodiodes, which feature good properties (efficiency and dark counts) in the visible range but perform much less well at the telecommunication wavelength of 1550 nm. This guided many of the experimental demonstrations and also led satellite-based schemes to avoid telecom photons. With the large improvement and commercialization of superconducting detectors such as transition-edge sensors (TES) and superconducting nanowire detectors (SNSPD), telecom-wavelength photons can now also be detected efficiently. State-of-the-art arrayed detectors can overcome some of the remaining limitations in terms of jitter and dead time, but require more resource-efficient readout schemes, such as the multiplexed readout of microwave kinetic inductance detectors (MKIDs).

Another consideration relates to multiplexed quantum repeater architectures. For these, the detectors (and the BSM) must be able to resolve (de-multiplex) photons arriving in different modes. Currently, detectors can only efficiently distinguish temporal modes (in particular if dead times can be reduced, e.g. by using arrays), but not modes in the spectral domain. Hence, de-multiplexing must occur before the detector, e.g. by means of spectral filters, which adds loss and complexity.

Quantum memory: Quantum memory has been implemented in numerous systems. One distinction between memories is whether they work by i) emitting one photon that is entangled with the internal (long-lived) state of the memory or by ii) reversibly mapping an optical signal to and from the memory. In this session two specific materials were discussed.

First, diamond colour centres, used as type i) memories, are currently being employed in an effort to build a quantum network in the Netherlands, with several crucial steps demonstrated in the past years. NV centres have long spin coherence times at cryogenic temperatures (several seconds at 77 K), and even at room temperature the coherence time can be ~30 ms. A major hurdle is the low collection efficiency of the emitted light (limited, e.g., by the small probability of emission into the zero-phonon line); however, as discussed in the context of single-photon sources, several experimental efforts seek to address this.

Rare-earth-ion (REI) doped crystals possess several key properties that make them useful for quantum memory and other quantum processing tasks. Specifically, they can feature very long coherence times – up to several ms on optical transitions and several hours on spin transitions – which allows optical storage that may be transferred to spin transitions. The existence of long-lived shelving states allows for the spectral tailoring needed in many photon-echo-type quantum memories. Collective enhancement leads to large photon absorption probabilities even in the absence of an optical cavity, and the inhomogeneous broadening translates into large memory bandwidths and multimode storage capacity. However, achieving all desired properties in a single crystal is challenging and requires detailed understanding of decoherence and coupling mechanisms. REI systems are complex even in theory, and, in addition, crystal parameters are affected by purity, growth methods and conditions, post-growth environment, and unknown factors. Better understanding of spin dynamics, electronic wave functions and microscopic disorder in crystals is needed to tailor/discover better REI crystals.

Quantum processors: Processors of quantum information are what make up a QC. Moreover, a quantum processor would allow one to perform a unity-efficiency BSM in a quantum repeater – contingent upon the transduction of the transmitted optical qubit to the degree of freedom and parameter space of the processor qubit.

In addition to the previously discussed superconductor-based qubit processors, other types, such as trapped ions and quantum dots (QD), may be used as processors. In QDs, all the components needed for a processor (gates, interconnects, and readout) have already been demonstrated, albeit not optimally in a single device. A unique benefit is the ability to interact directly with optical signals, hence eliminating the need for an extra transduction element. Different QD systems have been studied in terms of semiconductor material, carrier (hole or electron), and qubit encoding (single QD, triplet states in multiple dots, etc.). Ways to interconnect different regions within a single processor have been studied, including surface acoustic waves (spatial transfer), coupling of charge via cavities, and quantum superpositions.

The discussion following the components session focused on the merits and difficulties of component integration, a tedious task undertaken by some research groups, e.g. that of J. O’Brien at the University of Bristol. Integration would help make quantum network devices reliable and scalable and would ideally reduce loss due to interfacing. One challenge is that coupling in and out of microscopic structures is tricky, although some groups seem to master it (e.g. that of O. Painter), and evanescent coupling to devices may also alleviate some loss. However, additional waveguide propagation loss is a problem. At present, silicon has the lowest loss, but it is not suitable for all wavelengths. It is clear that more work is needed to identify the best materials and designs for developing integrated components.

(+) Session 5: Interfaces

Chaired by Christoph Simon with talks by Konrad Lehnert and Nikolai Lauk

In the context of quantum networks, interfaces refer to the link between long-distance quantum channels and quantum processors such as superconducting qubits operating at microwave frequencies. If the channel is a fibre link, the wavelength is around 1550 nm (telecom band), whereas for free-space transmission there are several wavelength “windows” in the atmosphere. Hence, the interface is a transducer of quantum information from the optical domain to the microwave domain. It is worth noting that, due to thermal radiation, it is not possible to transmit microwave signals outside a cryogenic environment. Quantum transducers must be quantum-state preserving. This can be analyzed in terms of input-output scattering parameters, which must show the device to be number-preserving and bidirectional. Transducers have been proposed in several systems, such as cold atoms in the vicinity of superconducting circuits, optimized electro-optic converters, microwave- and optically-active spins in crystals (e.g. REI), piezoelectro-optomechanical devices (e.g. AlN), optomechanical crystals, parametric down-conversion, and dielectric membranes (e.g. SiN).

Transduction via SiN membranes works by microwaves driving a resonant circuit that includes a capacitor formed by a SiN membrane. The microwaves thus induce mechanical oscillations in the membrane, which, in turn, perturb an optical cavity field that reflects off the same membrane. Hence, energy is transferred into frequency sidebands of the optical field. The process also works in reverse. In a 100 mK dilution fridge, 20% conversion efficiency with only 5 added noise photons has been demonstrated. The main challenge is noise caused by the strong optical pump field, which boils down to requiring strong coherent interaction while allowing essentially no incoherent interaction.
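For orientation, the sketch below evaluates a commonly used expression for the internal, on-resonance conversion efficiency of such electro-optomechanical transducers in terms of the optical and microwave cooperativities. This is an assumed, simplified model (extraction losses and added noise, which dominate current experiments, are neglected), not a figure presented at the workshop.

```python
# Minimal sketch (assumed simplified model, not a workshop result): on-resonance
# internal conversion efficiency of an electro-optomechanical transducer,
# 4*C_o*C_e / (1 + C_o + C_e)^2, as a function of the optical (C_o) and
# microwave (C_e) cooperativities. Coupling/extraction losses and thermal noise
# are neglected for simplicity.

def conversion_efficiency(c_opt: float, c_mw: float) -> float:
    """Internal photon-number conversion efficiency for given cooperativities."""
    return 4 * c_opt * c_mw / (1 + c_opt + c_mw) ** 2

for c in (1, 10, 100):
    print(f"C_o = C_e = {c:3d}: conversion efficiency = {conversion_efficiency(c, c):.2f}")
```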

For ensemble-based transduction, the absorbers must possess transitions in both the optical and microwave spectrum and facilitate coherent transfer between them. Promising candidates for transduction via ensemble-based absorbers are diamond colour centres and rare-earth-ion-doped crystals. So far, a number of experiments on both types of systems have demonstrated strong coupling to microwave fields. Focusing on rare-earth materials, some possess telecom transitions and also feature the combination of broad inhomogeneous broadening and narrow homogeneous linewidths. These properties have been exploited in a transducer proposal involving a photon-echo-type interaction.

A first comment in the discussion on interfaces was that superconducting qubits are challenging to network because they don’t possess any optical transitions and thus transducers are indispensable. Superconducting qubits are not the only choice for processors, but one cannot ignore that, at present, they are the choice of nearly all major commercial players.

Another comment concerned the usefulness of focusing on components that have good overall properties for a quantum network, instead of developing them to beat the state-of-the-art with respect to only one or a few properties.

Subsequently the point was made that to make a quantum network a truly compelling proposition to decision-makers and the public, the quantum research community must formulate a clear grand vision as captured by the term “moon-shot”. It was suggested not to phrase it in terms of supremacy over other countries (as in the moon landing), but to focus on applications. One approach is to talk about all the capabilities of quantum computers and then add a network to those. One should also keep in mind that going to the moon was the advertised goal, but developing technology such as rockets was the actual valuable outcome and probably also the strategic goal of the decision makers. We should copy that notion.

Our grand goal should be phrased in terms of problems that we want to solve, as the quantum part itself is hard for people to understand. Teleporting from coast to coast could be a grand challenge that engages and captures the public. Developing tools for cyber security could be the technological capability that would convince decision-makers and make it a good business proposition. However, this could still be too technological to captivate people outside the scientific/technological community. It is also worth drawing inspiration from other current grand challenges such as genetics-based medicine, climate change, and uncovering the building blocks of the universe (e.g. LIGO and CERN). Cyber security is a concern that will become increasingly relevant to people and perhaps able to engage them. A phrase that could be used is transforming cyber-security. When formulating these visions, it is a challenge for researchers to cross the valley between technical language and imagined opportunities.

It is important to be strategic about how this research is presented, especially in terms of job creation and commercial opportunities. Also, point out that solving some of the engineering challenges is what may lead to spin-offs. We must emphasize the urgency of investing now and point to the risk-management aspect in a future world with widely accessible quantum computers.

(+) Sessions 6 & 7: Partnerships and strategy

Chaired by Rob Thew and Alexandre Blais with talks by Robert Corriveau, David Danovitch, Dan Gale, Rob Thew, and Ben Sussman

The Canadian Photonics Industry Consortium (CPIC) offered an industry perspective on how to work together strategically and to develop and identify opportunities. CPIC grew out of the Canadian Institute for Photonic Innovations, which had a focus on supporting and investing in technology development and led to the formation of several companies. CPIC’s vision is to be a strategic engine for innovation in Canada’s photonics industry, and its focus has moved to “Assist Canadian companies to optimize operations and improve profits by facilitating and accelerating the application of photonic technologies that improve quality, productivity and profitability”. Its members are a mix of industry, academia, and government organizations, and it receives a large portion of its funding from NSERC. The federal government’s call for proposals on “superclusters” has led to a proposal on sensing, i.e. gathering, processing, and communicating data.

A major study by the CPIC was presented in “Light Technologies – A Strategic Economic Asset”, which describes and analyzes the views of different industries on the role of photonics in their businesses. The report is based on nation-wide workshops, each focused on a particular business sector, e.g. energy in Calgary, and with participation of all stakeholders. For each sector the report identifies business trends, photonics-industry trends, and opportunities for Canada, all summarized in a SWOT (strengths, weaknesses, opportunities, and threats) table for photonics in Canada.

CMC Microsystems is a not-for-profit organization that works at the interface between academia and industry and has a sense of the opportunities each holds for the other, as well as the obstacles to creating greater synergies. CMC is the creator and manager of Canada’s National Design Network. When establishing a collaboration or joint project, the first step is to check whether there is alignment between the academics’ and industry’s objectives and expectations in terms of investment and outcomes. For example, industry is somewhat dissatisfied with signing letters of support that never lead to any tangible outcome. Also, industry operates according to roadmaps, i.e. a time schedule and a set of targets, which is less common in academia. To avoid surprises, it is important to verify that resources are available and whether there are strings attached. The correct legal arrangements must be in place. Through it all, good communication is one of the most important facets of a successful collaboration.

In Europe, research into quantum technologies has been significantly boosted by the announcement of the EUR 1 billion flagship initiative. This initiative did not come about overnight, but was seeded and nurtured into existence by a long, dedicated and coherent effort by researchers beginning in the late 1990s. A key element was several roadmap reports that were developed and endorsed by a large fraction of the quantum research community across the EU. At later stages industry players got engaged, leading to a pivotal roundtable discussion with all stakeholders, including industry CEOs and members of the European Commission. In May 2016, all the work came to fruition in the form of the Quantum Technologies Flagship initiative. An important aspect has been the broad community consultation in the development of the initiative. The strategic research agenda describes four application domains (Communication, Computation, Simulation, and Sensing/Metrology), all involving elements of Engineering/Control, Software/Theory, and Education/Training, and all underpinned by a foundation of Basic Science. Within 3-, 6- and 10-year timeframes, the initiative sets very clear goals, stated in terms of objectives rather than specific technologies. A clear and transparent management structure and industry involvement are crucial aspects of the initiative.

The ultimate question is how the quantum research community in Canada is strategically positioned and able to leverage the experience from, e.g., Europe and other industries to realize Canada’s maximal quantum R&D potential. This question is addressed by the Quantum Canada initiative.

Currently Canada spends more per capita on (non-classified) quantum-related research than many other developed countries, and it has a substantial mass of world-renowned researchers in its universities. To make the case for more investment in quantum, one has to demonstrate the positive impact it would have. This impact is often captured by the triple bottom line (social + environmental + economic). Some factors that might discourage further investment are competition for scarce funds, the lack of a clear community voice, an unclear “business case”, and a lack of public and political support.

In the discussion, the question was raised whether Canada is missing a funding source for innovation, e.g. a Canadian DARPA. It was remarked that the Canadian funding ecosystem appears disorganized to people from the outside. We should compare with other major Canadian community entities such as Genome Canada and Nuclear Canada. We should get industry involved and convince it of the need for investment so that it can put its weight behind a Canadian quantum network initiative. It is also important that we try to understand industry’s problem book.

Having an entity to incubate and fund start-ups would be an option, or even a requirement. It is important to remember that many spin-offs are based on state-of-the-art technology developed for quantum networks, but that the true market gold-mine is more likely to lie in unrelated applications. Government also has an important role to play as a first adopter and first customer for industry, as a funder of demonstrators and critical infrastructure, etc. A supercluster could be an entity that in the future would be able to support a quantum network effort.

An effective way to convince decision makers and get the story to the public is through test-beds, which can demonstrate the viability of the technology. Test-beds are also important for encouraging the development of complete systems that work.

(+) List of attendees

Aimee Park, Canadian Institute for Advanced Research
Alex Lvovsky, University of Calgary
Alexandre Blais, Université de Sherbrooke
Amy Cook, Canadian Institute for Advanced Research
Andy Sachrajda, NRC Ottawa
Archana Singh, Western Economic Development
Barry Sanders, University of Calgary
Ben Sussman, NRC Ottawa
Brent Barron, Canadian Institute for Advanced Research
Charles Thiel, Montana State University
Christoph Simon, University of Calgary
Dan Dalacu, NRC Ottawa
Dan Gale, CMC Microsystems
Daniel Oblak, University of Calgary
David Danovitch, Université de Sherbrooke
Nikolai Lauk, University of Calgary
Konrad Lehnert, JILA
Michal Bajcsy, University of Waterloo
Paul Barclay, University of Calgary
Robert Thew, Université de Genève
Robert Corriveau, Canadian Photonics Industry Consortium
Thomas Jennewein, University of Waterloo (day 2)
Wolfgang Tittel, University of Calgary
Zachary Dutton, BBN Raytheon (day 1)

Regrets:
Cheng-Zhi Peng, University of Science and Technology of China
Matteo Mariantoni, University of Waterloo
Rebecca Finlay, Canadian Institute for Advanced Research
Stephanie Simmons, Simon Fraser University
Wenzhou Zhang, University of Science and Technology of China
