Inaugural lecture by Prof.dr. P. Börnert

14 March 2014

Bridging the gaps


Inaugural lecture delivered by Prof.dr. P. Börnert on 14 March 2014 upon accepting the position of professor in the Faculty of Medicine, with a teaching assignment in experimental clinical MRI.


Mister Rector Magnificus, highly esteemed audience;

It is a great honor to have been appointed to the chair of Experimental Clinical MRI, a chair at Leiden University endowed by Philips Healthcare. It is an honor for a couple of reasons. Leiden is the oldest Dutch university (founded in 1575); it has a long tradition in science, education and research, and it rests on a solid foundation, framed by a number of cornerstones formed by many people and their work.

Natural science / physics

As a physicist, I will focus on the physics cornerstone first. Here we have people like Huygens, van der Waals and Zeeman, who received their basic education in Leiden, partly up to the PhD level, before moving to other places in Europe and contributing significantly to physics. We have Kamerlingh-Onnes, who liquefied helium and discovered superconductivity, as well as Lorentz, who developed the mathematics behind Einstein’s special theory of relativity, and Gorter, a pioneer in NMR, which we will talk about in a minute. All of these great people have contributed to modern physics as we know it today, a physics which has been and still is unbelievably successful and powerful and has influenced many parts of our society. And in such a historical place, realizing that we are “standing on the shoulders of giants”, one might ask why modern physics, for which the Principia can serve as a birth certificate, was born at the end of the 17th century in places like this one.

The answer seems to be rather simple. At that time, physicists and natural philosophers approached craftsmen, craftsmen like lens-makers, mechanics and clockmakers, asking them to build new and precise instruments together, i.e. in a joint undertaking. Physics threw in its knowledge about the laws of nature and their effects, to which the ancestors of modern engineering added their skills, know-how and technology. And so they jointly designed and built instruments such as microscopes, used to explore the micro-cosmos, telescopes, dedicated to exploring the universe, and clocks, to measure change and time and to facilitate navigation. They built instruments that actually delivered numbers, quantitative numbers, numbers that paved the way for mathematics to become the language of physics and later also the language of engineering. And based on these precise experimental data, new laws could be found and deduced, helping to erect the prestigious building of modern physics as we know it today. This dualism between two independent entities, physics and engineering, the synergy between these two antipodes, bridging the gap, was rewarding for both sides: it propelled physics into modern physics and craftsmanship into modern engineering, later resulting in the industrial revolution that formed the basis for the wellbeing of our western societies today.

However, this dualism, which was and still is the source and basis of many great discoveries and innovations, also has interesting self-similarity features: it has replicated itself at different levels, and we find the same dualism in other, more general pairs such as:

  • Methodology and Hardware,
  • Theory and Practice, 
  • Academia and Industry.

Having stressed the foundations of natural science in this environment, you may now get a glimpse of the “Experimental” part of the chair I have the pleasure to take.

Medicine

However, there is another important cornerstone on which Leiden University is based: Medicine. Great physicians like Boerhaave, Albinus and Einthoven worked here; they studied human anatomy and all the different organs in great detail, thought about circulation and the role of respiration, and examined the effects of different plant extracts to help patients. One of the aims driving those physicians was to understand the miracle of life: what it is about, what the purpose of the different organs is, and how they work together. They were eager to understand malfunction and disease in order to derive measures to cure, to fix those disorders, or simply to care.

And with this reference to the medical roots of Leiden University, you may get an idea of the second term in the name of my chair: “Clinical”.

But what means did those physicians have at their disposal in their time? They were able to interview patients and to form a picture of the symptoms from their reports. They were able to sense body temperature, blood pressure and skin color, or to check basic reflexes. They had some crude measures to analyze body fluids and were able to listen to the sound of the beating heart, the lungs and the digestive tract. But because body tissue is not transparent at the wavelengths to which our eyes are most sensitive, they had only very limited means of obtaining information from within the patient’s body. Palpation was one of the very few options, but it reaches only a few centimeters in depth.

This changed fundamentally when physics teamed up with medicine, personified by a Bavarian physicist at the end of the 19th century. Roentgen called his mysterious rays, which were able to penetrate the human body and leave traces on photosensitive detectors, X-rays. With his discovery, a new era was born. This imaging approach, which revolutionized medicine, also influenced the arts and spread immediately all over the world, owing to its sheer simplicity. All that was needed was an appropriate gas-discharge tube, operating at the right energy, and a photosensitive detector (film or a fluorescent screen), and X-ray projection imaging came alive. And with it a new type of physician emerged: the radiographer and later the radiologist. Only a few years after Roentgen’s discovery, small national radiological societies were founded. It is interesting to note that some of these young medical societies initially tried to exclude engineers and physicists. The young radiologists were so enthusiastic that they assumed they could master X-rays on their own. This was a fault rectified years later, for instance at the foundation of the RSNA (Radiological Society of North America), which explicitly invited engineers and basic scientists, bridging the gap, because there were so many issues to be fixed. Electrical safety, radiation safety (in those days it was a sign of expertise that a radiographer had burned fingertips, a hard toll paid later) and scattering problems had to be solved by science and engineering. Industry moved in with standards and reliable X-ray solutions, and medicine, too, joined this cycle of innovation with great ideas. To overcome the limitations of this projection-based technique, medicine, for instance, added ideas on using contrast. Heavy inks, today known as contrast agents, were administered into the patients’ blood stream to visualize vascular anatomy and the function of the circulatory system (brain, heart, body and peripheral angiography). So, we are not talking about dualisms anymore, but about much more complicated structures, in which physics or basic science, engineering and medicine were working together, even if some of these disciplines spoke different languages and had different objectives.

However, it took more than seven decades to overcome the limitations of Roentgen’s projection approach. South African and British scientists developed Computed Tomography (CT), an X-ray-based technique that allows thin-slice imaging by letting a tube and a detector rotate around the patient and employing a computer to solve the difficult inverse problem of forming an image from the data. CT was actually developed in a commercial research laboratory, not in academia, and it is a nice anecdote that it was the laboratory of the British record label EMI. In this way, “Rock and Roll” definitively paid its dues back to society. And like the music, CT was very quickly adopted by radiologists, who saw its great potential after the technique became commercially available in the early 1970s.

Magnetic Resonance Imaging

Only a decade later, another imaging modality was born: Magnetic Resonance Imaging (MRI). Like CT, it was developed on the shoulders of previous ideas and findings. During World War II, Russian and American scientists discovered the MR effect. Selected atomic nuclei have some interesting magnetic properties when placed in an external magnetic field: they can emit radio-frequency (RF) waves at specific frequencies after having been excited with appropriate RF energy. The electronic shielding of these nuclei, caused by the electrons involved in the chemical bonding of the corresponding molecules, results in slight frequency shifts of the MR signal, making those signals very characteristic, a molecular fingerprint. This helped to identify low-molecular-weight chemical compounds by means of their MR spectrum, which made MR essential for chemical analytics. Thus, it is not surprising that a chemist, Paul Lauterbur, had the idea to develop this spectroscopic technique further into an imaging approach, in close analogy to CT.

Most MRI today is done using the proton, the nucleus of the hydrogen atom, which is present everywhere in the human body. The human body consists roughly of 65% water, 10% fat and 20% proteins and minerals, where the latter two are often difficult to detect directly by proton MR. So, if we focus on the water, which is involved in so many structural, functional and metabolic processes, one sees that it is a great idea to use it directly as a reporter molecule. It is present in all in-vivo environments: in vessels, in all body-fluid reservoirs, in the interstitial space, in cells, even trapped in ligaments and bones; it is practically everywhere. From all these locations one can obtain a signal that is influenced by the specific local environment and that, depending on how the experiment is performed, delivers a huge variety of excellent soft-tissue contrasts. MRI thereby overcomes limitations we still have in CT, making MRI an extraordinary complement to CT. However, without CT, MRI would never have made it. On the one hand, there are the close conceptual similarities between the two imaging modalities; on the other hand, with the increased complexity that CT shows compared with projection X-ray, CT had already raised the technology bar, and also the price tag, for sophisticated imaging equipment. MRI is even more complex than CT, but CT prepared the stage. Therefore, also technology-wise, one can claim that “we are standing on the shoulders of giants.”

With this being said, one can easily understand the third term in the name of my chair: “MRI”.

In the first decade, MRI was clearly dominated by physics and engineering, simply to meet the needs. Whole-body magnets, based on superconductivity, had to be designed and built, power-efficient and safe RF transmit systems and very sensitive MR signal receive systems had to be developed, gradient systems had to be mastered to allow spatially resolved measurements (which is precisely what turns MR into MRI), and basic methodology had to be developed to facilitate MR imaging in many diverse applications. The entire MR system is full of high technology (cryogenics, high-precision high-power amplifiers, RF electronics, low-noise receivers, computing); it is actually “synergy at work”, which is not always obvious to the user, and its development was and is based on many great engineering ideas, taking advantage of the physics and basic science principles behind them.

In the early days, methodology was always challenging the existing hardware, and vice versa: newly available hardware provoked new ideas, because better hardware allowed even better methodological approaches and functions. It was this real dualism at work, in which the different approaches to MRI also became obvious. While engineers, who often see themselves in charge of getting a system running in a stable and reliable way, are always concerned about MR image artifacts, physicists, in contrast, are sometimes excited about artifacts. They see them as challenges, as something that results from a lack of understanding and not merely as a source of potential system malfunction, and hence as an opportunity for innovation. Artifacts, if sufficiently understood, can turn into a source of information, new information, if used in the right manner. In that respect, Lauterbur’s idea of MRI is just a smart fix to the magnet homogeneity problems of the past. Poor magnet shimming represented a serious problem in MR spectroscopy, degrading the quality of spectra substantially (causing artifacts). But the idea of introducing strong magnetic field gradients, which produce inhomogeneities much bigger than the magnet’s poor shims but in a controlled manner, allows for spatially resolved signal detection, transforming MR spectroscopy into an imaging modality.
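
To illustrate this principle with a minimal sketch (the field strength, gradient amplitude and positions below are assumed, illustrative numbers, not taken from any particular scanner): a linear gradient G makes the Larmor frequency position dependent, f(x) = (gamma/2*pi)*(B0 + G*x), so the frequency axis of the received signal maps directly onto a spatial axis.

    # Minimal sketch in Python: a linear gradient turns frequency into position.
    # All numbers are assumed for illustration only.
    gamma_bar = 42.577e6          # proton gyromagnetic ratio over 2*pi [Hz/T]
    B0 = 1.5                      # main magnetic field [T]
    G = 10e-3                     # gradient strength along x [T/m]

    for x_cm in (-10, 0, 10):     # three positions along the gradient [cm]
        x = x_cm / 100.0
        offset = gamma_bar * G * x            # frequency offset from isocenter [Hz]
        print(f"x = {x_cm:+3d} cm -> frequency offset = {offset / 1e3:+6.1f} kHz")

Each position thus answers with its own frequency, which is exactly what allows a spectrometer to become an imaging device.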

Flow artifacts were already visible in the very first MR images. Trying to understand their origin allows us to measure blood flow, its volume and its velocity selectively and very precisely, in addition to resolving it spatially, as no other modality can do in vivo. These data will support fluid-dynamic modelling that can guide physicians in interventions and angioplasty in the future, selecting and placing stents and avoiding future vessel rupture or occlusion. Thinking about motion on a much smaller scale than macroscopic flow allows us to probe, even with the limited spatial resolution of MRI, microscopic scales: mapping diffusion, an unbelievably powerful tool to explore microstructure at the cellular level. This adds another contrast for tissue characterization, important for identifying tumors or stroke, and, if resolved with respect to direction, its anisotropy can give information about fiber orientation, important in muscles and nerves. The latter may even help to answer the question of what intelligence is actually about: is it the structure of the gray-matter cells, or is it the wiring, the interconnection of the different brain compartments through the white matter? So, in short, thinking about artifacts can take one far beyond the technical scope.
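
To give a flavour of how such a diffusion measurement turns into a number, here is a minimal sketch with assumed, illustrative values (not patient data): in the simple mono-exponential model S(b) = S0*exp(-b*D), two acquisitions with different diffusion weightings b already yield an apparent diffusion coefficient (ADC).

    # Minimal sketch: estimate an apparent diffusion coefficient (ADC) from two
    # measurements using S(b) = S0 * exp(-b * D). Numbers are assumed examples.
    import math

    b0, b1 = 0.0, 1000.0            # diffusion weightings [s/mm^2]
    S_b0, S_b1 = 1000.0, 450.0      # measured signal intensities (illustrative)

    adc = math.log(S_b0 / S_b1) / (b1 - b0)   # [mm^2/s]
    print(f"ADC = {adc:.2e} mm^2/s")          # on the order of healthy brain tissue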

Make numbers

There are numerous physical effects that influence the MR signal, and these can be used to encode corresponding information. Apart from basic MRI parameters like T1, T2, chemical shift, tissue composition and density, there are other parameters like tissue order, tissue oxygenation, perfusion, flow, motion, elasticity, strain, electrical tissue properties and more, which underlines the huge variety of information addressable by MRI. However, seeing this zoo of parameters that MRI is able to measure, the need for quantification becomes obvious. Currently, radiologists, who are trained over many years to see the tiniest contrast changes on MR images, perform diagnostic reading in a very time-consuming and tedious way. It is a complicated process, relying on undocumented knowledge, often performed in a qualitative manner and requiring lots of work. The huge amount of data to be read, in particular, demands supporting tools that augment decision making. Having real numbers, linked to anatomy, physiology or function, can ease diagnosis and decision making substantially. Quantification is, therefore, also essential to identify important parameters or contrasts for potential use as biomarkers that can guide therapy decisions efficiently. Here, the scientific community is rather at the beginning. Imaging is great; it allows one to intuitively find, localize and sometimes also characterize the medical problem, but an image also contains much more information than is needed for diagnosis and treatment. Quantification, which involves lots of image processing and appropriate physical and biological modelling, is also a means of information compression. We heard already today that numbers were very important in transforming anecdotal physics into modern physics. Quantitative MRI and quantitative medicine have the potential to undergo something similar.
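
To make this concrete, here is a minimal sketch of one way such a number can be produced (the echo times, the “true” T2 and the noise level are assumed purely for illustration): a mono-exponential decay S(TE) = S0*exp(-TE/T2) is fitted to multi-echo data, turning a series of images into a quantitative T2 value per voxel.

    # Minimal sketch: fit S(TE) = S0 * exp(-TE / T2) to simulated multi-echo data
    # to obtain a quantitative T2 estimate. All values are assumed examples.
    import numpy as np

    TE = np.array([10.0, 30.0, 50.0, 70.0, 90.0])     # echo times [ms]
    T2_true, S0_true = 80.0, 1000.0                   # assumed ground truth
    signal = S0_true * np.exp(-TE / T2_true)
    signal += np.random.default_rng(0).normal(0.0, 5.0, TE.size)   # add noise

    # Linearize: log(S) = log(S0) - TE / T2, then solve by least squares.
    A = np.vstack([np.ones_like(TE), -TE]).T
    log_s0, inv_t2 = np.linalg.lstsq(A, np.log(signal), rcond=None)[0]
    print(f"estimated T2 = {1.0 / inv_t2:.1f} ms (simulated truth: {T2_true} ms)")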

Parallelization

But MRI has a real problem, and that is its low signal-to-noise ratio (SNR), which is actually the flip side of not involving high-energy radiation as CT does. MRI carries no risk of X-ray-related radiation damage, but because of the low quantum energy in the RF range, the signal level is very low. There are two principal ways to increase the SNR: first, by increasing the magnetic field strength, and second, by improving the RF detection technology. Both avenues have been pursued, and so MRI moved from low-field systems (below 1T) to high-field (3T) and ultra-high-field systems (7T/9T), increasing SNR with field strength. In addition, parallel reception has been introduced, using an array of small detection coils, each receiving signal from only a portion of the patient. If the patient is the dominant noise source, the SNR can be significantly improved compared with a single-volume-coil approach.

Nowadays, parallel reception represents a key MR technology. It took some time for the vendors to equip their systems with multiple receivers, a considerable technical effort, but it also took the scientific community some time to really understand its potential. A full decade was necessary for it to really revolutionize MRI. Parallel reception is, therefore, an excellent example of new hardware challenging methodology. Local antennas, with their spatially focused reception sensitivity, are not only able to improve the SNR, but are also capable of encoding information locally, much faster than gradients can. In that way, parallel reception allows one to under-sample the k-space data, the data used to reconstruct an MR image, and to compensate for the resulting image artifacts by incorporating the coil sensitivity information into the reconstruction. This makes it possible to accelerate MRI by measuring less data per channel, but with more coils simultaneously, partly compensating the loss of SNR caused by the shortened total scanning time and improving clinical diagnostics and patient throughput at the same time. Parallel reception is a great and very powerful concept, introduced at the end of the last century, but improvements and refinements are still being worked on today.
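
The core of a SENSE-type reconstruction is a small linear inversion per group of aliased pixels. The sketch below shows the idea for an acceleration factor of two and a single pair of fold-over positions; the coil sensitivities and pixel values are assumed, and real reconstructions use measured complex sensitivity maps and noise statistics.

    # Minimal sketch of SENSE unfolding for acceleration factor R = 2.
    # Sensitivities and pixel values are assumed, illustrative numbers.
    import numpy as np

    # Sensitivities of four coils at the two positions that fold onto each other.
    S = np.array([[0.9, 0.2],
                  [0.6, 0.5],
                  [0.3, 0.8],
                  [0.1, 1.0]])
    rho_true = np.array([2.0, 5.0])   # true pixel values at the two positions
    aliased = S @ rho_true            # what each coil measures after under-sampling

    # Unfold: solve the small least-squares problem per aliased pixel group.
    rho_hat, *_ = np.linalg.lstsq(S, aliased, rcond=None)
    print("unfolded pixel values:", np.round(rho_hat, 3))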

Physicists are often attracted to symmetries. They try to find them in all structures, because symmetries often make theory, description and understanding a lot easier. The same happened in MRI. Inspired by the successes of the parallel-reception concept, and also being a little bit jealous of it, physicists saw and explored the symmetry between signal detection and signal excitation; they bridged the gap. In that way, parallel transmission was born, showing mathematically and schematically very close similarities to the sampling process. Driven by the analogy, one could translate optimal and uniform receive-coil combination into RF shimming, accelerated MRI (SENSE) into Transmit SENSE, and SNR optimization into SAR (specific absorption rate) management. This new technology turned out to be a door opener for clinical high-field MRI at 3T and beyond. At high field strengths, the RF wavelength inside the body approaches the body dimensions. RF wave propagation results in standing-wave phenomena, which substantially degrade the RF transmit homogeneity. As a consequence, the image contrast can become a function of space, which can make the life of radiologists hard: if an MRI contrast change results from the instrument’s performance and not from the patient’s anatomy or pathology, clinical diagnosis becomes difficult. Therefore, parallel transmit has already found its way into the clinic to maintain high-field image quality. However, compared with parallel reception, parallel transmission is even younger, and more open questions have to be addressed and problems to be solved. Ultra-high-field applications are definitely on the list, but applications at lower fields are conceivable as well, for instance to tailor the spatial region MRI is sensitive to.
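
At its mathematical core, RF shimming can be posed as a small least-squares problem: given the complex transmit field maps of the individual channels, find channel weights w that make the combined field as uniform as possible, i.e. minimize ||A w - 1||. The sketch below uses random complex numbers in place of measured B1+ maps and deliberately leaves out the SAR and power constraints that any real implementation must respect.

    # Minimal sketch of RF shimming as a least-squares problem.
    # The "field maps" are random complex numbers standing in for measured data.
    import numpy as np

    rng = np.random.default_rng(1)
    n_voxels, n_channels = 200, 8
    A = (rng.normal(size=(n_voxels, n_channels))
         + 1j * rng.normal(size=(n_voxels, n_channels)))   # per-channel field maps
    target = np.ones(n_voxels, dtype=complex)               # desired uniform excitation

    w, *_ = np.linalg.lstsq(A, target, rcond=None)          # complex shim weights
    nonuniformity = np.abs(A @ w - target)
    print(f"mean residual non-uniformity: {nonuniformity.mean():.3f}")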

However, there is more to parallelization. Parallel computing has hardly been noticed by users, although MRI systems need it to cope with the high data rates and the reconstruction workload. Parallel gradients have been proposed, employing more than three non-orthogonal, higher-order gradient fields, supporting new and very fancy ways of spatial encoding. This is still research and might help to overcome some currently existing limits. There is one component of an MRI system that has not been parallelized yet: the magnet; and actually, there are definitely good reasons for not doing that.

Future

MRI is becoming mature. The basic technical questions in terms of hardware and methodology have largely been solved, and MRI is already hitting some boundaries. The rate of field-strength increase is slowing down. Magnet prices explode at ultra-high fields, and a shortage of helium is anticipated. We are at the edge of current magnet technology. There are also limits with regard to imaging speed, because excessively fast gradient switching can provoke nerve stimulation. However, facing those restrictions does not mean that this is the end of research and development for this imaging modality.

MRI is already teaming up with other modalities, bridging the gap. The merger of PET (Positron Emission Tomography) and MRI is very promising. PET, with its capability to deliver metabolic information, combined with the highly resolved anatomical and functional information MRI provides, can give new diagnostic and therapeutic insights. MRI’s great soft-tissue contrast can help to guide interventions. MRI has entered the operating room, teaming up with classical and minimally invasive, catheter-based surgery and interventions inside or near the magnet. MRI-HIFU merges high-intensity focused ultrasound, used to destroy unwanted tissue, with the supervision of MRI, which finds and tracks the lesion and ensures therapy success by continuous temperature mapping. The combination of MRI and radiation therapy is along the same lines: MRI guides the actual therapy and can update the therapy plan as well.

But no matter what, MRI is getting mature and seems to be becoming an everyday commodity. People simply want to use MRI as a camera. They want to take great pictures of high diagnostic quality and do not want to be bothered with choosing the right aperture, focal length, exposure time, and so on; simplicity and workflow are becoming key features.

On the other hand, shorter procedure times are needed, simply for cost-efficiency reasons and to give more patients access to MRI. Thus, in the medical community, people are asking how much time is really needed to rule out potential causes of headache. Could we live with 95% confidence obtained in 10 minutes, or do we need 99% in 45 minutes? In any case, we need faster scanning, better logistics and workflow, but most importantly, we need clear knowledge of the real clinical needs, in order to scan the right part of the anatomy, the right biomarker, within the given time. Therefore, we have to keep this triangle of “clinical need-methodology-hardware” vital and alive.

Furthermore, we have to get the best out of the data, the data we have been able to sample in a given amount of time. We have to boost image quality by enforcing consistency among all sampled data, taking advantage of the redundancy in the data and using the available prior knowledge about the patient and the sampling process; and in doing so, we should also explore new sampling concepts such as compressed sensing.
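
To give a flavour of what compressed sensing means in practice, here is a deliberately small, one-dimensional toy sketch (the signal, the sampling density and the thresholds are all assumed): a sparse signal is recovered from randomly under-sampled Fourier data by alternating between data consistency and a sparsity-promoting soft-threshold.

    # Minimal toy sketch of compressed sensing: recover a sparse 1D signal from
    # randomly under-sampled Fourier samples by iterative soft-thresholding.
    # All parameters are assumed for illustration.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 256
    x_true = np.zeros(n)
    x_true[rng.choice(n, 8, replace=False)] = rng.normal(0.0, 1.0, 8)  # sparse signal

    mask = rng.random(n) < 0.4            # keep roughly 40% of the "k-space" samples
    y = np.fft.fft(x_true) * mask         # under-sampled measurements

    def soft(v, t):                       # soft-thresholding (sparsity prior)
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x, thr = np.zeros(n), 0.5
    for _ in range(100):
        k = np.fft.fft(x)
        k[mask] = y[mask]                 # enforce consistency with measured data
        x = soft(np.fft.ifft(k).real, thr)
        thr *= 0.95                       # gradually relax the threshold
    print("largest remaining error:", round(float(np.abs(x - x_true).max()), 4))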

As mentioned before, we have to deal with all these data, the images and data of an individual patient obtained in multiple sessions, or those of an entire population, because within these data there is a lot of correlation, which can be explored further using “big data” technology. Even if this approach seems rather brute-force, and even if some people complain that “looking only for correlations” could be the end of science, it can help to sharpen our senses and to find and investigate underlying laws and relationships in more detail.

In short, there is still so much to do!

Myself

I entered the field of MR in the mid-eighties, a decade after Lauterbur’s great idea. For me, it was like a dream coming true. After studying physics, I joined a very big governmental research institute in Berlin and got a job in a scientific instrumentation department, exposed from the very beginning to the duality between basic science and engineering, or methodology and hardware. In a very small team, together with Wolfgang Dreher, who later became a friend, we transformed an existing solid-state MR spectrometer into an MR microscope, adding gradients, logic and computers. We learned MRI from the very beginning. I joined a Ph.D. program and moved away from the proton mainstream, doing 19F MRI to map artificial blood substitutes, used them to measure the oxygen partial pressure, and finished my Ph.D. five years later.

I moved to Bremen as a post-doc and was infected with the “fast imaging” virus, from which I still have not fully recovered. David Norris passed it on to me. We studied all types of magnetization preparations and coherences in TSE and became close friends. And it was David who pushed me further, to apply for a vacancy at Philips Research.

So, I went to Hamburg, from academia into an industrial research laboratory, and I was actually really proud to become a member of such a prestigious lab (Philips Research!). Industrial research labs like Bell Labs, IBM Research, GE Research and Philips Research had a great reputation. I joined the MR group, found myself in a small MR hardware project as the only methodology person in it, and drove the hardware to its limits; great times, rewarding, dream-fulfilling. I focused on fast and ultra-fast MRI: echo-planar, spiral imaging and other efficient non-Cartesian sampling schemes. I designed multi-dimensional RF pulses for curved-slice imaging and navigators for motion sensing. I worked with Bernd Aldefeld and Jochen Keupp on continuously moving table imaging, a smooth alternative to virtually extend the scanner’s usable field of view; with Jürgen Rahmer on ultra-short echo-time imaging to visualize structures that cannot be seen with conventional MRI; with Holger Eggers, with whom I had the pleasure to work on water-fat imaging; with Mariya Doneva on compressed sensing to further accelerate parameter mapping; and with Tim Nielsen on alternative motion compensation (and this list is not complete). In many of these projects, I got in touch with the “real” industry.

Already as a very young Philips researcher, when I mastered the fastest MRI sequence (EPI) on a clinical scanner, I came into contact with the many bright people at Philips Best. Great people like Jan Groen, Giel Mens, Gerd van Ijperen and Miha Fuderer; people who were really making a product and not just a nice concept or a mere proof of principle. And the way in which they posed their questions was sometimes so different from mine that it helped me broaden my scope over the following years, while we ran numerous projects together. It also made me somewhat humble.

With every new technology we were able to master, we also tried to find new clinical applications, just to bridge the gap to the clinical needs, to solve problems and to be challenged from the clinical side as well. And indeed, we have been challenged: for example, cardiac MRI kept us busy for many years and was full of very nice technical problems. We invested in coronary MRA together with Kay Nehrke, by now surely a friend of mine, and many great Ph.D. students. We threw in all the advanced technology we had to offer (EPI, spiral, navigators, phased arrays, motion correction) and teamed up with all available clinical expertise, with people like Matthias Stuber and Rene Botnar (clinical science at Harvard Medical School), but even so we realized that we would not beat CT (yet), although this is not really necessary.

I’m very grateful to my scientific fate for giving me the chance to be involved from the very beginning in the two major technological breakthroughs in MR. One is parallel reception, on which we worked quite substantially in Hamburg with Peter Röschmann and Christoph Leussler: on concepts, coils, arrangements, data handling and also on scan acceleration. I had the pleasure to support two gifted young researchers from Zurich (Markus Weiger and Klaas Pruessmann), helping to form an axis between Hamburg and Zurich that supported early SENSE applications and technical research. The other breakthrough is parallel transmission, pioneered by the Hamburg group, where I had the pleasure to team up with Ulrich Katscher to finally get Transmit SENSE working, seeing options like RF shimming from the beginning onwards. Working with Ingmar Graesslin, Giel Mens and many others, we were able to bring parallel transmit into reality, finally reaching patients thanks to the great work of Paul Harvey, Johan van den Brink and many, many other, often unseen, great engineers and physicists at Philips Best.

I’m not able to mention all the people who influenced me and my work. I’m grateful to my colleagues in the first place, to Philips Research for the great conditions and also for the opportunity to supervise many PhD students, who are very smart people and a great source of inspiration, challenging me and our work substantially. I’m happy to be in touch with so many smart and ambitious people at Philips Best. But I’m also grateful to be part of a worldwide scientific community, a network full of engagement and voluntarism. Because what is science actually about? It is about the people, their curiosity, their ambition, their endurance. Science is a challenge, it is fun, but it is first of all hard work (as Edison put it, 99% perspiration and only 1% inspiration)! Science needs trust, honesty and freedom, and sometimes it also needs a stronghold of freedom (Praesidium Libertatis, the motto of Leiden University).

Leiden

Now, you might ask how Leiden came into the game. The answer is simple: Mark van Buchem! A visionary, enthusiastic and very smart person, he approached me during the build-up of the Gorter Center for high-field MRI and asked whether I would be willing to support this excellent group (with Andrew Webb, Thijs van Osch, Itamar Ronen and Maarten Versluis), the Radiology department (with Hildo Lamb, for instance) and also the LUMC image processing group (of Boudewijn Lelieveldt) with some expertise and small projects, getting the chance to learn about real clinical life, problems and projects.

The head of Philips Healthcare, the Director of Philips MR Clinical Science Ruud de Boer, the Hamburg Research management and, most importantly, the College van Bestuur of Leiden University, the Raad van Bestuur of the LUMC and the chair of the Radiology department, Hans Bloem, supported this part-time arrangement. Thank you very much. So, in 2012 I was appointed professor at the LUMC, occupying the chair of “Experimental Clinical MRI”, with a focus on learning more about today’s and tomorrow’s clinical needs in MRI while running projects to improve image quality, high-field RF transmit performance, ultra-short-TE imaging and water-fat imaging in a clinical environment.

However, my special focus is on fat, since the LUMC also wants to understand its role and its impact on health. Obesity is a serious threat in the western and also in the emerging countries, both for quality of life and for socio-economic reasons. The metabolic syndrome is completely new to mankind, and we lack understanding at several levels. Studying this is a multi-disciplinary undertaking, involving genetics, endocrinology, radiology, cardiology, data processing and much more. Imaging, and in particular MRI, is essential. MRI-based water-fat imaging is able to resolve the different fat compartments, because it is the location of the fat that matters. MRI has the potential to further distinguish between brown and white fat and to quantify the amount of fat locally. Indeed, there is a strong association between the amount of visceral, pericardial and intra-myocardial fat and traditional risk factors for type 2 diabetes and cardiovascular disease, an association that is not reflected in the body mass index. Population imaging, including MRI and other modalities, is potentially the way to understand the underlying mechanisms and their consequences.
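
The basic principle behind such water-fat separation can be sketched very compactly (an idealized two-point Dixon picture, with field inhomogeneity and relaxation neglected and the voxel values assumed): at the in-phase echo the water and fat signals add, at the opposed-phase echo they subtract, so simple sums and differences separate the two.

    # Minimal sketch of idealized two-point Dixon water-fat separation.
    # Voxel values are assumed; field inhomogeneity and relaxation are ignored.
    water_true, fat_true = 70.0, 30.0       # signal contributions in one voxel

    in_phase = water_true + fat_true        # echo where water and fat add
    opposed = water_true - fat_true         # echo where water and fat subtract

    water = 0.5 * (in_phase + opposed)
    fat = 0.5 * (in_phase - opposed)
    print(f"water = {water}, fat = {fat}, fat fraction = {fat / (water + fat):.2f}")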

There is a simple saying about the ingredients of success: “One needs the right time, the right place and the right people.” The timing is perfect. A new high-field MRI institute has been founded as a branch of an experienced radiology department, with many young, promising physicians and excellent basic scientists. And maybe I am worthy of being a part of them. Having said this, I have already answered the “right place and right people” question. I have never seen a place as open to other scientific groups and schools as this one, a place where collaboration is preferred over direct competition, where the universities of two or more cities team up, bridging the gap, to bundle their efforts while also gently maintaining their own scientific independence. So, I guess, apart from Hamburg, there is no better place right now. I’m looking forward to the future, to bridging the gap.

I have spoken.