
Liquid-crystal skyrmions swim like schools of fish

Physicists in the US have discovered that large clusters of skyrmions in a liquid crystal undergo collective motions, much like schools of fish. Hayley Sohn and colleagues at the University of Colorado, Boulder, first spotted the behaviour by accident, but quickly realized that they had discovered an intriguing new form of active matter. Their work could lead to the development of new types of displays with the potential to transform the ways in which humans and computers interact.

Soft matter systems like liquids, polymers and foams can sometimes exhibit coordinated life-like behaviour that resembles a school of fish escaping a predator or a cluster of living cells organizing itself into a biological structure. These active soft-matter systems have a range of technological applications and the challenge for scientists is how to fine-tune interactions between individual components – such as molecules in a liquid crystal – to get the desired behaviour.

The initial focus of Sohn and colleagues was not to develop methods of control, but rather to study the dynamics of large numbers of topological solitons, or skyrmions, within a liquid crystal. Skyrmions are particle-like excitations that propagate through the material.

Drastic change

Their latest insight came unexpectedly when the team left its experiment alone while on break and returned to find that it had changed drastically. Instead of being randomly oriented as they had expected, the skyrmions had adopted highly synchronized motions. With further experiments, the team found that the structures had developed a polar order within seconds, grouping into fish-like schools through their elastic interactions. These groups then moved along spontaneously chosen directions.

The observation is unprecedented in active matter systems because, unlike typical building blocks, skyrmions have no physical boundaries, chemical compositions, or density gradients. Furthermore, Sohn and colleagues discovered a versatile technique for controlling the behaviour of the skyrmion schools. This involves tuning an applied oscillating voltage, which changes the elastic interactions between skyrmions.

Sohn and colleagues believe that their discovery could lead to the development of versatile, reconfigurable active matter. If successful, such technology could be used for applications ranging from realistic models of biological systems to video games in which unexpected events occur without being explicitly programmed.

The researchers have also shown that the conditions under which the collective behaviour occurs are much like those found in liquid crystal displays. This, they believe, could bring about significant new advances in display technologies.

The research is described in Nature Communications.

Ultrasound could enable future treatments for motor neurone disease

Motor cortex targeting
(a) Functional MRI maps the hand control region of the right primary motor cortex in two patients. (b, c) Gadolinium enhancement in MR images shows successful blood–brain barrier permeabilization in the sonicated motor cortex area. (Courtesy: Nature Commun. 10.1038/s41467-019-12426-9)

In a pilot study on four patients with amyotrophic lateral sclerosis (ALS), Canadian researchers demonstrated how drug molecules that are ordinarily unable to enter the brain were able to pass through into the tissue by using focused ultrasound to temporarily disrupt the blood–brain barrier (BBB). While no treatments were tested, the study aimed to test the safety and effectiveness of the technique (Nature Commun. 10.1038/s41467-019-12426-9).

ALS, commonly known as motor neurone disease, is a degenerative disorder of the nervous system that causes widespread paralysis and eventually death. There is no cure and treatments are generally ineffective. Like many neurological diseases, such as Alzheimer’s or Parkinson’s, the development of new treatments for ALS is impeded by an inability to get potentially promising drug molecules into the brain. This is due to the existence of the BBB, a system of physical barriers and cellular mechanisms in the brain’s vascular system that prevent unwanted molecules from entering this sensitive environment.

Drugs that affect the brain, such as anaesthetics, tend to be small molecules, which can pass through the BBB. However, many new promising drugs are very large molecules, such as antibodies and gene therapies, that are kept at bay by the BBB.

The researchers, based at Sunnybrook Research Institute in Toronto, disrupted the BBB in patients using MRI-guided focused ultrasound along with microbubbles: small gas bubbles, around the same size as a red blood cell, coated with a lipid shell, which respond to ultrasound. Microbubbles can be safely injected intravenously and travel through the blood vessels. Once they reach the BBB, ultrasound is applied to make the bubbles expand and contract within the blood vessels, temporarily making them more permeable. This in turn allows drug molecules in the blood to pass through into the brain tissue. The team monitored the whole process using the MRI scanner.

Focused ultrasound is much less invasive than the most effective alternative for drug delivery: directly injecting drugs into brain tissue, which requires opening up the skull. The approach also enables the BBB to be opened in a variety of regions. The phenomenon was first discovered in rabbits around 20 years ago, and a number of early clinical trials have taken place in the last five years. The Sunnybrook group has previously performed similar trials to open up the BBB in patients with Alzheimer’s disease and aggressive brain tumours.

The team used a commercial ultrasound system containing 1024 transducers embedded in a helmet that is placed within an MRI scanner. To reduce the chance of adverse effects and ensure the sound pressures used were correct, the researchers monitored the therapy in real time by detecting the ultrasound emitted by the bubbles, and by monitoring the temperature inside the brain using the MRI scan. The study did not test the efficacy of drug treatments, but instead used a gadolinium-based contrast agent that showed up within the brain tissue on the MRI.

The patients reported no serious adverse side effects beyond headaches and mild pain. One participant showed slight structural changes on the MRI scan that were not associated with any symptoms and had disappeared in a scan a week later. The BBB closed within 24 hours.

While this study didn’t test any drugs and was not aiming to treat the patients, it demonstrated that the approach works and appears safe. The team now aims to move on to delivering drugs using this technique, opening up a path towards more effective treatments for ALS.

Self-regulating behaviour helps ants avoid traffic jams

Ants are particularly good at avoiding traffic jams and can go about their business even when they occupy more than 80% of the available space – twice the value that stymies human pedestrians or drivers. According to experiments performed by a team of researchers in France, the US and Australia, ant movement is best described by a two-phase flow function that is very different from existing statistical models of traffic engineering.

The work could be important for studying systems that contain large groups of interacting particles, such as those in molecular biology and statistical physics. It might even have implications for programming self-driving vehicles so that they work together cooperatively, like ants.

Led by Audrey Dussutour of Toulouse University, together with colleagues at Arizona State University and the University of Adelaide, the scientists studied the behaviour of European Argentine ants (Linepithema humile). This highly invasive species can form “supercolonies” containing several billion individuals and is the largest recorded society of multicellular organisms. Their experimental testbed consisted of a nest and a food source, connected by a 17-cm-long bridge with varying widths of 5, 10 or 20 mm.

35 nests and 170 experiments

In a programme of research that spanned 35 ant nests and 170 experiments, Dussutour and colleagues varied the ants’ density (measured in ants per unit surface area) by populating testbeds with between 400 and 25,600 ants, and filmed the results as the ants travelled from the nest to the food source. The researchers plotted the flow q of ants heading in two directions across the bridge – towards the food source and away from it – as a function of ant density k. They then analysed the relationship between q and k and fitted their data to traffic engineering models.

The team found that although all the models performed well overall, none of them predicted the behaviour of the ants at intermediate and high densities. The researchers therefore introduced a two-phase flow function, in which flow first increases linearly as the density of ants increases, then reaches a plateau and remains constant thereafter. This function better described the behaviour of the ants, and it is very different to the pattern seen in humans, where slowing flows of traffic eventually lead to jams.
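
To make the shape of that two-phase function concrete, here is a minimal sketch in Python; the speed and plateau density used below are illustrative assumptions, not the values fitted in the eLife paper:

    def two_phase_flow(k, v=1.0, k_c=8.0):
        # Flow q rises linearly with density k (ants moving at an average
        # speed v), then plateaus once the density exceeds a critical
        # value k_c. All parameter values here are illustrative.
        return v * min(k, k_c)

    for k in [2, 5, 8, 12, 18]:              # densities in ants per cm^2
        print(f"k = {k:2d} ants/cm^2  ->  q = {two_phase_flow(k):.1f}")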

Traffic flow does not slow down for ants

The team say the ants appear to be adjusting their behaviour to their circumstances. For example, the researchers observed that the insects increase their speed at intermediate densities to avoid congestion. Ants also refrain from joining the flow of traffic at high densities (of 18 ants/cm²), preferring to wait until it has thinned out. In both cases, jams are avoided and there is a steady, uninterrupted flow of traffic.

Ants’ self-regulating behaviour ensures that they forage for food efficiently, Dussutour and co-workers say. It is very different to the external rules applied to car traffic, such as stopping at red lights regardless of whether traffic flow is dense or not. Ants, it seems, establish such rules without the help of traffic control systems.

The research is detailed in eLife.

ASTRO showcase: Modus QA presents Clearview 3D gel dosimeter

John Miller, founder and co-owner of Modus Medical Devices, explains how the company’s Clearview radiochromic gel can be used as a dosimeter that provides accurate and precise 3D dosimetry measurements when combined with an optical CT scanner. This short video was filmed at ASTRO 2019.

Turning science to movie magic

In the movie Lucy (2014), the title character, played by Scarlett Johansson, ingests a lethal amount of a drug. But instead of killing her, the drug allows her to access 100% of her brain, making her superhuman with some incredible physical and mental capabilities including telepathy and telekinesis. It’s a plot device used in many movies – such as Limitless (2011 film and 2015 TV series) and Phenomenon (1996) – but one that sets many scientists’ teeth on edge.

“Anything that involves the myth that we only use 10% of our brains is overused and cringe-worthy,” says Kevin Grazier, a planetary physicist formerly at NASA’s Jet Propulsion Laboratory and Marshall Space Flight Center who provided advice for the sci-fi thriller Gravity (2013). “We use our whole brain!”

Scenes and concepts in movies that mangle the science are not only annoying but can also break the cinematic illusion that has been carefully created by a huge team of screenwriters, producers, actors, visual effects artists and directors (see “VFX in movies: from weightlessness to curly hair”, Nov 2019).

“Any time the audience is questioning the science that they are watching on the screen – or they are saying, ‘That’s not what a laboratory looks like’ – is a moment when they are not paying attention to the story,” says James Kakalios, a physicist at the University of Minnesota who provided science advice for DC superhero film Watchmen (2009).

This is where scientists can help. “It is very important for filmmakers to talk to scientists because we’re really good at figuring out what kind of things happening together could possibly make sense,” says Sean Carroll, a theoretical physicist at California Institute of Technology (Caltech), who has been advising filmmakers about science for around a decade. “[Scientists] care about making a world that to the audience makes some kind of sense, both in how people act and in how the laws of nature act.”

Still from Amazing Spider-Man
Lessons from the lab: science advisers helped with the set dressing in Amazing Spider-Man. (Courtesy: Columbia Pictures/Sony Pictures/Collection Christophel/Alamy Stock Photo)

While connecting filmmakers with scientists might seem like a logical pairing, it’s not necessarily a natural one. “When I first moved to Los Angeles about 10 years ago, I discovered that Hollywood people were afraid to talk to scientists because the only thing the scientists would ever do is tell them why they were wrong,” says Carroll. “You can’t do that. Even if it is true, it’s not helpful.”

To improve the link, the National Academy of Sciences in the US launched the Science and Entertainment Exchange in 2008. The programme connects screenwriters, producers and directors with scientists to create accurate and engaging stories. It challenges the advisers in its database to come up with a better idea when there’s something wrong with the science in a story.

“I could say, ‘I have no business being here,’ ” says Clifford Johnson, a physicist at the University of Southern California, and consultant for Marvel’s Thor: Ragnarok (2017) and the Avengers film series (2012–2019). “I say, ‘No, my job is to help [filmmakers] tell their story, and what I can do as a science adviser is to use the knowledge I have about the laws of physics in our real universe to help build rules for their universe.’ ”

Still from The Martian
Science advisers may give advice to filmmakers about anything scientific, such as how science is performed in The Martian (above); creating accurate physics models of a wormhole for Interstellar (below); and explaining how Dr Manhattan’s powers work in Watchmen (top of article). (Courtesy: 20th Century Fox/Genre/International Traders/Mid Atlantic/Kobal/Shutterstock)
Interstellar wormhole
(CC BY 3.0/James et al. 2015 Class. Quant. Grav. 10.1088/0264-9381/32/6/065001)

I’m going to have to science the shit out of this – The Martian

Science advisers can be called upon at any time during the pre- or post-production process of a film (although they are typically not paid for the contribution). They can be involved in anything from generating initial ideas about the story, working with the screenwriter writing the script, reviewing scripts for shooting a scene, or helping to troubleshoot problems with the story during filming.

“One of my favourite moments on set was when the actors needed to argue about something that had nothing to do with the script and I could pick anything that I wanted; so, I picked a cosmological debate about the beginning and end of the universe,” says Mika McKinnon, a field geophysicist and a science consultant for the sci-fi TV shows Stargate: Atlantis (2004–2009) and Stargate Universe (2009–2011). “The only way [the audience] would know what was going on is if you recognized the equation or were on set that day.”

The interaction between filmmakers and a science adviser could be a brief e-mail, an hour-long phone call, a long lunch meeting or multiple interactions that are a mix of these. The scientists may be contacted once or multiple times during the production process and these communications can sometimes extend for years. “A new screenwriter who doesn’t have a lot of films made may call you up for an informal chit-chat over coffee at a stage where they are still coming up with ideas and the script is not yet written,” says Carroll, who was a science adviser for the sci-fi action film TRON: Legacy (2010) and Marvel’s Thor (2011). “Whereas, if you are dealing with a giant studio – like Disney or Marvel – they know what movies they are going to make, and you generally talk to them when the process is well under way and they are looking to touch up some particular issues in the script or special effects or something like that.”

During his consultations for Thor, for example, Carroll was asked how characters would travel across the universe. But when he said you’d need a wormhole, the production company executive said they couldn’t use the word “wormhole” because it sounded “too nineties” and had been used in too many other movies during that period. Carroll suggested calling it an Einstein–Rosen bridge instead – an idea that stuck and was used in the film.

You’re the genius up here. I only drive the bus – Gravity

The science adviser might help with the plot or some dialogue, or even with the props that appear in the background of a scene. While working on Gravity, Grazier explains that one of his notes to the studio was that the two solar panels on the Hubble Space Telescope rotate, rather than expand. “Sometimes what you do is just adding a line here or there,” says Carroll. “Some of what you do is almost invisible.”

When the creators of The Amazing Spider-Man (2012) contacted Kakalios, meanwhile, they needed him to come up with an equation for longevity. Kakalios did one better and based his equation on the real Gompertz law, which describes human mortality and why people live to the age that they do. “So, I took the Gompertz equation and I added some ‘mathematical glitter’ and at the last minute they said the equation didn’t look complex enough and could I add some more,” says Kakalios. “So, I did and at that stage it was so late that they basically Photoshopped the paper that I sent to them into the character’s notebook.” It was a small detail that most moviegoers won’t have noticed – apart from Kakalios’ daughter, that is. “She said, ‘You’re the guy who sat next to me helping me with my high-school math; of course I know how you write a sigma,’” Kakalios recalls.

For most scientists, the opportunity to be a science adviser means that they get to use their expertise to problem solve in a different way. “I prefer to spend more time talking with filmmakers about the people who they portray as scientists than I do worrying about if they got this scientific detail right,” says Johnson. “Because I think it is way more important to talk about and to show accurately who does science, who can do science, and what their motivations are about doing it, than dwelling on little facts here and there.”

Johnson explains how the movie The Martian (2015) was unusual in that it showed a wide and varied view of who a scientist is. “You see someone who you could have a beer with, the more business-like director, the uber nerd and the awkward young scientist,” explains Johnson. “You see the full range, which is unusual in a film.”

Your ancestors called it magic, but you call it science – Thor

It’s not just directors and filmmakers who are benefiting from their growing relationship with scientists. One advantage of serving as a science adviser for the film industry is that it gives scientists a great opportunity to talk to the wider public about physics, using the big screen as a hook in outreach activities.

Take Kip Thorne – the Nobel-prize-winning theoretical physicist at Caltech, who served as executive producer and science adviser for the sci-fi film Interstellar (2014). Working with Hollywood, he says, gave him tremendous opportunities to be a science communicator. “I was able to inspire a very large number of people about science – the movie sold roughly 100 million tickets worldwide – and there is no other way that I could possibly have reached so many people with my message of the beauty and power of science,” Thorne explains. “Over the years since Interstellar’s release, a large number of young people, in nations around the world, have told me that this movie influenced them to become scientists.”

McKinnon agrees. “Through watching a show, you, the audience, learn how to think like a scientist or you learn how they see data,” she says.

And sometimes it’s even possible to go the other way too, with a passion for science communication landing you a Hollywood role. At least that’s what happened to Rhett Allain from Southeastern Louisiana University in the US. His love of applying physics to movies and TV shows led to his role as a blogger for WIRED magazine – where he frequently discusses the physics in films – and then to his position as science adviser for CBS’s reboot of the TV show MacGyver (2016–present). “My blog gives me a chance to add to what Hollywood has done.”

For Carroll, being a science adviser is also a chance to challenge his students to think differently. “I had a wonderful example of this for a movie called Inversion,” explains Carroll, referring to a sci-fi film that is yet to be released. “The idea is that at random times and places here on Earth, gravity reverses. So, it’s pushing things away from the Earth instead of pulling – and they invited me to help try to make that seem more realistic.”

The actor playing the lead character, a theoretical physicist, wanted to come to Caltech to meet some actual physicists and understand how they think. “Which I thought was wonderful,” says Carroll. “I gathered a group of my graduate students to meet with her and one of the producers of the movie.”

Initially, his students were hesitant to pursue the idea. “When my graduate students had the [gravity] scenario explained to them, their immediate reaction was, ‘No, that will not happen,’ ” Carroll explains. “So, what I said was, ‘Don’t think of this idea that gravity is reversing – that’s not a theory – think of it as data; think of it as the experiment has already been done and now, you need to try to come up with an explanation for it.’ ”

Sometimes it’s not about working with the laws of nature, it is about creating a set of rules that stay consistent throughout the movie. “If you are going to play that game – of imagining different worlds – then, those worlds have to make sense,” says Carroll. “They have to have some logical coherence to them; both in how people act and in how the laws of nature act.”

Still from Spider-Man 3
All in the detail: Accurately depicting science in films could mean using computer models to simulate the movement of sand (above); or knowing how the Hubble Space Telescope’s solar panels move (below). (Courtesy: Sony Pictures/AA Film Archive/Sportsphoto/Alamy Stock Photo)
Hubble Space Telescope
(Courtesy: NASA)

You do the math. You solve the problem – The Martian

Films, however, aren’t just a great communication and teaching tool for scientists. Sometimes the software that moviemakers use translates to the lab.

When Kakalios had the chance to talk with somebody who created the digital effects for the Sandman character in Spider-Man 3 (2007), he commented that the artists must have filmed actual sand because it looked so realistic. Instead, Kakalios found out that Hollywood was using the exact same discrete element method (DEM) software that he uses to run simulations in the lab.

Thorne, meanwhile, worked closely with the visual effects team at London-based visual-effects firm Double Negative to create and bring to life “Gargantua” – the black hole in Interstellar. While Thorne devised the set of equations, the physicists Oliver James and Eugénie von Tunzelmann from Double Negative developed new computer code called the Double Negative Gravitational Renderer. When the visual effects were complete, the collaborators realized that they had created new visualizing software that could be used for both Hollywood productions and scientific research.

McKinnon’s best educated guess – a binary pulsar system whose two stars produce a deadly burst of gamma rays approximately every 46 minutes – turned out to be less science fiction and more science fact. “It was close enough that three years later, scientists discovered this in real life,” says McKinnon.

Houston, I have a bad feeling about this mission – Gravity

Unfortunately, however, it’s not a perfect system and sometimes a scientist’s advice doesn’t make the cut. “One of the misconceptions people have is that science advisers have copyeditor-like discretion over the science,” says Grazier. “For the writer, we are just one more voice or set of notes; not everyone listens to the science advisers.”

Kakalios agrees. “Sometimes the decision comes down to: does Hollywood want to antagonize a million rabid fans or one physics professor from Minnesota,” he says. “I know the decision I’d make and I’m the physics professor from Minnesota!”

Johnson also thinks that the movie industry hasn’t yet learned the value a science adviser can add. “Writers and directors still think of us as a necessary evil to check a couple of things so that they aren’t glaringly wrong, and no-one knows what stage to call us in. It is changing in a positive direction but there is a long way to go.” According to Johnson, ideally a writer working on a science-heavy film should talk to a scientist at an early stage, when they can offer ideas that improve the story and ensure that the science isn’t just there for decoration.

As with most industries, however, it comes down to money. “Until they [filmmakers] realize that it makes sense monetarily to do it, change will be slow,” Johnson concludes. But with the highest-grossing film ever at the time of writing being Marvel’s Avengers: Endgame – part of a franchise that frequently uses science advisers – perhaps we’ll be seeing fewer scenes like Lucy’s brain myth.

Science advisers on good movie science

Tornado
Storming ahead: Twister followed advice from meteorologists on storms and storm-chasing. (Courtesy: Shutterstock/solarseven)

Because of my fascination with severe weather and natural disasters, two of my favourite science movies are Twister (1996) and Dante’s Peak (1997). And while doing my own research into these films, I discovered that both did in fact use science advisers. Meteorologists at the National Oceanic and Atmospheric Administration’s Severe Storms Laboratory taught the Twister crew about storm chasing, and volcanologists at the United States Geological Survey were technical consultants on Dante’s Peak.

But what about the science advisers I spoke to for this article? What are their favourite depictions of physics on screen?

“I loved the world created in Star Wars,” says Rhett Allain. “And things like the visual of the black hole in Interstellar.” As for Kevin Grazier, he cites 2010: The Year We Make Contact (1984) as a favourite, referring to the sequel of the 1968 classic 2001: A Space Odyssey. “What is interesting is that two of the most exciting scenes in that movie involve getting a relative velocity under the escape velocity and getting a relative velocity over the escape velocity,” Grazier says.

Most science advisers I spoke to, however, highlighted the 2015 Ridley Scott epic The Martian as an example of a movie that represents science well. “I think The Martian did a very good job of sticking pretty darn close to scientifically respectable ideas and, even better than that, it was really a window into doing science in a very high-pressure environment,” says Sean Carroll. “That is what I liked most of all; not the facts about the science being right or wrong, but some impression of how science gets done.”

Clifford Johnson has high praise too for The Martian, which starred Matt Damon struggling to survive alone on the red planet. “One of the things that The Martian did extremely well – which is rare to see – is showing the scientific process,” he explains. “There are entire scenes where we watch the lead character try and figure stuff out to improve his situation; he does experiments; he makes mistakes and tries again. You hardly ever see that in movies.”

“The other thing that was great about that movie was showing the collaboration between scientists,” Johnson continues. “You see the lead character working with his colleagues on Earth and you get to see a full spectrum of different people who can be scientists.”

Motions of the planets put new limit on graviton mass

The motions of the planets have been used to make the best estimate yet of the upper limit of the mass of the graviton – a hypothetical particle that is a quantum of the gravitational field. That is the claim of Leo Bernus at the Paris Observatory and colleagues, who used over a century’s worth of data in their calculations.

In theories that try to provide a quantum description of gravity, the graviton mediates the gravitational force between massive objects. It can be thought of as a gravitational version of the photon, which mediates the electromagnetic force between charged objects. A correct theory of quantum gravity has yet to be developed, but it is possible to test some aspects of nascent theories including their predictions of whether the graviton has a mass.

If gravitational fields have an infinite range – as Einstein’s general theory of relativity dictates – gravitons must be massless and travel at the speed of light. However, some theories of quantum gravity suggest that the graviton could have an extremely small mass. If this were true, it would limit the range of the gravitational force and impose a subluminal speed limit on the graviton.

Orbital deviations

Previous attempts to measure graviton mass have tracked the orbital paths of planets in the solar system and checked for any deviations from paths predicted by general relativity. Recent observations of Mars’ orbit made by Clifford Will at the University of Florida, for example, suggested that the graviton mass must be less than 10⁻²³ eV/c². In comparison, the upper limit on the mass of the lightest known massive particle – the neutrino – is about 1 eV/c².

In their study, however, Bernus’ team noted that Will’s equations did not include the possibility of the graviton having mass, which they say skewed his results towards a zero-mass result.

To address this oversight, Bernus and colleagues adapted previous theories to make the range of the gravitational field a finite and adjustable variable. They integrated this adapted theory into a model called INPOP17b, which predicts the motions of planets, moons and large asteroids in the solar system. Using the position of each body in 2000 as starting conditions, they ran the model backward to 1913 and forward to 2017. They then compared the predicted positions of the objects to their observed positions.

Their analysis gave them a 90% confidence that the range of the gravitational field cannot be any shorter than 1.8×10¹³ km. This corresponded to a graviton upper mass limit of 6.8×10⁻²³ eV/c² – heavier than Will’s result. As more and increasingly accurate data on the dynamics of the solar system are gathered, the team hope to constrain this value even further in the future.
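
For a rough consistency check, one can convert the quoted range into a mass by assuming the graviton’s range equals its Compton wavelength, lambda_g = h/(m_g·c) – a convention commonly used in such analyses, though not necessarily the exact definition adopted by the team:

    # Sketch: turn a minimum gravitational range into an upper graviton mass,
    # assuming range = Compton wavelength lambda_g = h/(m_g*c),
    # so that m_g*c^2 = h*c/lambda_g.
    hc = 1.2398e-6                 # h*c in eV*m
    lambda_g = 1.8e13 * 1e3        # 1.8e13 km expressed in metres
    m_g_c2 = hc / lambda_g         # upper mass limit in eV/c^2
    print(f"m_g < {m_g_c2:.1e} eV/c^2")   # ~6.9e-23 eV/c^2, close to the quoted limit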

The study is described in Physical Review Letters.

AI says engineers are white men in hardhats, the physics of fun, Shaun the Sheep teams up with the European Space Agency

It is Tomorrow’s Engineers Week 2019 here in the UK, which according to its organizers “provides a unique opportunity for engineers, employers, universities and schools to showcase how engineers working in all sectors are on a mission to make the world a better place”.

An important challenge for the engineering community is how to communicate exactly who is an engineer and what it is that they do. This was illustrated perfectly this week by an exercise undertaken by folks at the Royal Academy of Engineering, who used an artificial-intelligence machine-learning model called a generative adversarial network (GAN) to analyse more than 1100 images of engineers that are available online. The GAN then output images that it concluded were representative of engineers: the majority of these were of white men wearing hard hats.

This stereotype was backed up by an online search of images of engineers, where 63% of the images on the first page of results were people wearing hard hats. To address this stereotype, the Royal Academy has a website called This is Engineering, which includes biographies of a diverse bunch of engineers as well as videos.

Who could resist a preprint called “The physics of fun: quantifying human engagement into playful activities”? The paper is by David Reguera and colleagues at the University of Barcelona, who have analysed the behaviour of millions of video-game players.

They have discovered a scaling law (what else) that describes the dynamics of how people engage with the games. Furthermore, they have identified a phase transition that occurs when a person decides to stop playing a game. Rather than applying only to video games, the team believes it has characterized a “general and profound behaviour of how humans become engaged in challenging activities with intrinsic rewards”.

For those looking to get a head start on Christmas gifts, how about some space-related ovine apparel for that special person in your life? The European Space Agency (ESA) has teamed up with UK-based Aardman Animations and film distributor StudioCanal to produce a range of merchandise to tie in with the Shaun the Sheep movie, Farmageddon.

The film, which was released in October, features an alien called Lu-La who crash-lands near Shaun’s home at Mossy Bottom Farm. Shaun and the gang then help the alien to safely get back home. Earlier this year, to prepare for his space adventure, Shaun even flew onboard an Airbus A310 aircraft that simulates the weightlessness that astronauts experience in space by undergoing a series of “parabolas”. The collection – branded with ESA and Farmageddon logos and icons – includes T-shirts, hoodies, mugs and bags. With a mug costing €15.90 and a tote bag setting you back €19.90, the design seems not the only thing that is astronomical.

Crash and burn

Poignant, haunting, deep, poetic, existential, intimate – I have read a combination of some or all of those words in a number of reviews, fawning over the latest blockbuster science-fiction film, Ad Astra. Featuring seasoned Hollywood star Brad Pitt as an astronaut, the film tells the tale of his epic quest across the solar system, to save our planet from a deadly physical force, while also trying to solve the mystery surrounding his long-lost father. But despite being the prime audience for such a film, I sadly found it lacking in almost every sense.

From towering space platforms (“antennae”) that extend into the outer atmosphere of Earth, to a Moon-buggy chase on the lunar surface, all wrapped up in a slightly ominous space-thriller veneer, the film trailer promised everything and more. But as the first 20 minutes or so of Ad Astra played out, I was filled with an ever-growing sense of dread, that this was not the film to follow in the footsteps of recent fantastic offerings such as The Martian, Interstellar, Arrival and First Man. To properly explain the issues I had with the film, I must reveal most of the plot, so if you haven’t already watched it, major spoilers ahead.

Set simply in the “near future”, cool, calm and always-composed Major Roy McBride (Pitt) is an astronaut, and son of renowned astronaut Clifford McBride (Tommy Lee Jones), leader of the “Lima Project”. Launched some 26 years prior and deployed in orbit around Neptune, the mission aimed to find intelligent life elsewhere in the universe. Unfortunately, all contact with the mission was lost 16 years ago. In the present, McBride junior works on one of the lofty antennae many dizzying miles above the planet. Within the first few minutes of the film, an unexplained, immense “energy surge” knocks Roy and others off the tower. An injured Roy wakes in hospital to discover that these deadly surges have swept across the planet, somehow knocking planes out of the sky and killing thousands.

Our hero is invited to a top-secret meeting where it is revealed that his father may still be alive and somehow responsible for said surges, which are due to “cosmic ray bursts” via “antimatter”. Roy’s mission, should he choose to accept it, is to travel to a military base on Mars to try and establish contact with his father.

Roy must take a commercial flight to the Moon, and then travel onwards to Mars via a ship, Cepheus, located at a base on the far side of the Moon, across a war zone, complete with gun-toting space pirates. The journey to the Red Planet itself is not without drama, thanks to an absurd encounter with an abandoned biomedical research vessel overrun with angry, man-eating primates.

Once on Mars, Roy reads out a prepared message for his father. When his first stoic message, penned by top brass at the US Space Command, goes unanswered, Roy goes off-script and makes an emotional appeal to his father – and seemingly gets a response. Unfortunately for Roy, his few moments of emotional exposition see him banned from the rest of the mission.

At this point, Roy is shown some classified footage, which reveals that Clifford’s crew, keen to return to Earth, staged a mutiny, causing him to turn off their life-support systems and kill them. Roy also finds out that the Cepheus crew has now been tasked with travelling to Pluto to destroy whatever remains of the Lima Project station with a nuclear payload (because you can blow up anything, including antimatter). Deciding that this burden must be his own, Roy reaches the ship just as it is about to take off, scales the outside as it fires up, opens a hatch and climbs into this now-aloft rocket – easy. A fight ensues and Roy is the sole survivor. He then makes the decision to single-handedly fly the ship to Neptune.

Once at Neptune, Roy parks Cepheus, hops into a shuttle and zips through Neptune’s rings to get to the Lima space station. Once inside, Roy plants the nuclear bomb before finally meeting his lonesome father, who explains that the surges are a result of the station’s malfunctioning antimatter power source.

It’s at this point that one of the film’s more meaningful scenes plays out, as Roy attempts to reconnect with Clifford, who is shown to be colder and more disassociated than ever, following his long isolation. Roy tries to convince father dearest that they must destroy the station and return to Earth. Initially successful, the tethered pair leave the space station, only for them to have a rather awkward, slow-motion tussle in space. A reluctant Roy unclips his father at his request, and watches him float away. He then uses a piece from the station as a shield and boogie-board, and pretty much surfs his way back up through Neptune’s rings, using his spacesuit’s thrusters to jettison himself back to his ship (which hasn’t floated away). Without enough fuel to return to Earth, he relies on the “shock wave” from the nuclear bomb, which destroys the station (antimatter and all), to make it all the way back to Earth. Roy has managed to snag all of the Project Lima data, which supposedly shows that we are alone in the universe (“We’re all that we’ve got,” he dreamily muses), inspiring him to reconnect with his fellow humans and his estranged wife. Phew.

As the beginning of the film revealed, Project Lima was situated by Neptune to get beyond the Sun’s heliosphere, which apparently inhibits our ability to detect alien life. In reality, the heliosphere extends far beyond Pluto, before the effects of solar wind are no longer felt. More importantly, our techniques for detecting far-flung exoplanets, and any possible signatures of life on them, do not involve visually imaging them – instead, we look for markers such as wobble in a parent star to detect a planet. We also look for certain biomarkers in a planet’s spectra for signs of life. The Lima Project just doesn’t make sense right from the start, so its eventual “failure” does not have the impact it should.

Another major scientific issue for me is the travel to Neptune. First, the ship Cepheus is mainly used to travel from the Moon to Mars, but is magically capable of making the much more distant journey to Neptune – and back. But the main issue is that of distance. The film makes a point of mentioning that it takes Roy 19 days to travel from the Moon to Mars, but only 79 days from Mars to Neptune. Even assuming that in this near future we have managed to build “antimatter engines” (a very far-off scientific feat), if it takes Roy 19 days to travel the 54.6 million kilometres from Earth to Mars, it ought to take him 1531 days (about four years) to travel the 4.4 billion kilometres that lie between Mars and Neptune. But somehow that return journey takes him a mere five and a half months.
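
For what it’s worth, the arithmetic behind that estimate is just a constant-speed scaling – a rough sketch that ignores orbital mechanics, much as the film does:

    # Naive constant-speed scaling of the film's quoted travel times.
    earth_mars_km = 54.6e6        # quoted Earth-Mars distance
    mars_neptune_km = 4.4e9       # quoted Mars-Neptune distance
    days_to_mars = 19

    speed = earth_mars_km / days_to_mars       # ~2.9 million km per day
    days_to_neptune = mars_neptune_km / speed  # ~1531 days
    print(f"{days_to_neptune:.0f} days, or about {days_to_neptune / 365:.1f} years")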

I could have forgiven all of these massive blunders, though, if the heart of the story – the fractured relationship between a father and son – was properly fleshed out. “I do what I do because of my dad,” says Roy at one point, but his words seem hollow. In the end, I think Ad Astra’s problems lie in the fact that director, producer and writer James Gray does not like the idea of intelligent aliens and meeting them someday, as he revealed in an interview with Digital Spy. His message is that humans only have one another to fall back on – but his science-fearing cosmic quest was unnecessary to put forth that simple idea.

Despite its many faults, Ad Astra does have some positives, mainly in the fantastic cinematography and special effects. The feel of the film is simultaneously futuristic and bleak, a perfect setting for a dystopia. The various locations of the Moon, Mars and Neptune all look stunning, especially as the film is shot in 35 mm, making it perfect for the big screen. Go watch Ad Astra if you fancy two hours and 20 minutes of a beautifully shot but confusing perfume commercial – after all, it’s in space, where no-one can hear you scream.

  • Directed, co-produced and written by James Gray and Ethan Gross; 2019; 20th Century Fox; 124 minutes

Fabiola Gianotti to remain CERN boss until 2025

The Italian particle physicist Fabiola Gianotti is to continue as head of the CERN particle-physics lab near Geneva until 2025. The move marks the first time in CERN’s history that a director general has been appointed for a full second term. The announcement was made at the end of the 195th session of the CERN council on 6 November with Gianotti beginning her second term on 1 January 2021.

Gianotti has been at CERN since 1994 and from 2009 to 2013 led the ATLAS experiment. On 1 January 2016 she became the first woman to lead the lab, taking over from the German physicist Rolf-Dieter Heuer. “During [Gianotti’s] first term, she excelled in leading our diverse and international scientific organisation, becoming a role model, especially for women in science”, says Ursula Bassler, president of the CERN Council. “I’m delighted to see Fabiola Gianotti re-appointed for a second term of office. With her at the helm, CERN will continue to benefit from her strong leadership and experience.”

The move now means that Gianotti will oversee the completion of the high-luminosity upgrade to the lab’s Large Hadron Collider (LHC), which is set to come online in 2026. This will see the collider’s luminosity boosted by a factor of 10 over the existing machine. “I am deeply grateful to the CERN council for their renewed trust. It is a great privilege and a huge responsibility,” says Gianotti. “The following years will be crucial for laying the foundations of CERN’s future projects.”

Indeed, in May 2020 European particle physicists are set to publish an update to their future strategy, in which it is hoped that a clear direction will be made for which collider could succeed the LHC. Lyn Evans, who masterminded the LHC’s construction, told Physics World that Gianotti has been “very good” at managing the lab as director general, adding that one of her main roles will now be “laying the foundation for the long-term future of CERN with input from the European strategy group”.

Analysis: Gianotti will lead CERN at an important crossroads for the lab

Many physicists have welcomed the announcement that the Italian particle physicist Fabiola Gianotti will complete a second term as head of the CERN particle-physics lab. “Congratulations to Fabiola Gianotti on a historic second term as [head] of CERN,” tweeted Nigel Lockyer, director of Fermilab in the US.

The decision to stick with Gianotti for another five years makes sense. The Large Hadron Collider (LHC) is currently shut down until January 2021 and then it will continue at 13 TeV for three years. From 2024 to 2026, it will be turned off again as the components for the high-luminosity upgrade (HL-LHC) are installed. This means that even the upgraded machine will not come online by the time that Gianotti has finished her second term. Continuity at the top of the lab will be vital.

However, Gianotti will certainly have her work cut out making sure the upgrade goes smoothly – after all there are sometimes more problems to solve during a shutdown than when it is operating routinely. But perhaps Gianotti’s biggest challenge will be planning for the lab’s future beyond the HL-LHC.

For the past decade, CERN has been at the forefront of the so-called “energy frontier”, but there is a real prospect that Asia will now take that crown in the form of the International Linear Collider (ILC), currently being pursued by Japan, as well as the Circular Electron Positron Collider – a huge 100 km machine that is being designed by Chinese physicists.

CERN has its own plans to build the Compact Linear Collider, which would operate at higher energies than the ILC, but its design is not as mature. In an interview with Physics World earlier this year, Gianotti indicated that if Japan went ahead with the ILC then CERN would pursue a 100 km circumference proton-proton collider “complementary to the ILC”. But given the huge costs of construction — around £5bn for the tunnel alone — would it ever see the light of day?

Over a decade ago, Fermilab in the US, which operated the 1 TeV Tevatron collider, faced a similar issue when the energy frontier shifted to Europe and the LHC. It opted to move away from smashing protons together and instead used them to produce neutrinos (dubbed the “intensity frontier”). The lab is currently building what will be the world’s most powerful accelerator-based neutrino experiment – the Deep Underground Neutrino Experiment.

If the energy frontier moves to Asia in the mid-2030s, then CERN may need to follow Fermilab’s example and look at new avenues of research or, perhaps, develop new collider technologies.

EU’s forestation targets demand dietary rethink

Restoring and expanding Europe’s forests will be a vital part of the EU’s efforts to limit atmospheric CO2, but reforestation targets are probably unattainable without profound changes to the region’s food system. Heera Lee, at Karlsruhe Institute of Technology in Germany, and colleagues in the UK and Romania, modelled hundreds of scenarios in which they varied parameters such as food demand, land use and crop yields. The researchers found that, even with significant increases in the productivity of farmland, the majority of simulations forecast success only when meat consumption was cut drastically.

As human-caused CO2 emissions continue to rise, the large-scale removal of carbon from the atmosphere becomes ever more necessary to keep global temperatures to within 1.5°C of the pre-industrial average. This is sure to take the deployment of many complementary techniques, but the simplest of them is to plant more trees – billions upon billions of them.

The problem – particularly in a densely populated region like Europe – is that trees take up space, and much of that space is already used to grow crops for human consumption, animal feedstocks and bioenergy. To investigate how these competing demands might be met in the 28 countries of the EU, Lee and colleagues simulated Europe’s land use with an integrated assessment model.

The assessment platform that the researchers used models how land use responds to a range of interacting variables, including biophysical conditions like soil type and rainfall, and economic factors such as food demand and prices. Since the aim was to test outcomes that could meet the warming target set down by the Paris Agreement, the researchers ran the simulation assuming a temperature increase of 1.5°C. One factor that they kept constant throughout was the net proportion of food imported from outside the EU, so that they avoided scenarios in which European nations met their forestation targets by outsourcing food production.

After running more than 300 unique scenarios for each of three climate models, Lee and colleagues found that current dietary choices – meaning present-day levels of beef consumption in particular – were compatible with adequate levels of forest expansion only as long as agricultural yields increased significantly. But even then, with yield improvements of 75%, these model outcomes implied an overall decrease in daily calorie intake relative to today’s European average. Without any meat at all in the European diet, the most ambitious target for increased forest coverage could be achieved with yield improvements of 15% and no decrease in daily calorie consumption.

There are good reasons to believe that technological changes, such as crop breeding or better land management, will deliver significant increases in agricultural efficiency over the next few decades, but probably not to the degree required to sustain current diets.

“Historically, the effects of technology on crop yields have been enormous, delivering 3–4-fold increases over the last 40–50 years for most crops,” says Lee, “but we don’t think we can achieve an overall yield improvement of 75% everywhere in Europe. It is a hypothetical maximum value that differs a lot depending on the crop types and the region.”

Yield increases might also come about as a result of the very process that the afforestation effort is designed to prevent. In Europe, climate change is expected to improve growing conditions in northern latitudes while harming them in the south. “The effects on crop yields across Europe as a whole are broadly positive,” says Lee, “but I want to highlight that there could be winners and losers.”

Assuming that cutting down on meat does turn out to be necessary, persuading consumers to change their habits will not be easy. Lee suggests that raising awareness of the environmental consequences of food choices would be the most straightforward way, since this factor already seems to be at least partly behind the increased prevalence of vegetarian and vegan diets. Increasing the cost of meat directly (through taxes) or indirectly (through land-use regulations) could work, but would surely be unpopular. “The alternative to a stick,” says Lee, “is a carrot: plant-based foods could be subsidized to make them relatively cheaper than meat.”

Lee and colleagues reported their findings in Environmental Research Letters.

Fusion propulsion for interstellar travel, why ice is slippery, fireworks lit the fuse on science funding

This episode of the Physics World Weekly podcast features an interview with Richard Dinan, who has made a remarkable career change from reality television star to fusion entrepreneur. Hamish Johnston visited Dinan at his start-up company Pulsar Fusion to find out why he is fascinated with nuclear fusion and chat about his plans to develop a fusion-based thruster for space travel.

Also this week, Margaret Harris talks about a breakthrough in our understanding of why ice is slippery and Matin Durrani explains how the development of fireworks played a key role in the early days of modern science.

Frequency combs shape the future of light

This year marks the 20th anniversary of the first time an optical-frequency comb was used to measure the atomic hydrogen 1S-2S optical transition frequency, which was achieved at the Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. Menlo Systems, which was founded soon afterwards as a spin-off from MPQ, has been commercializing and pioneering the technology ever since.

Today, optical frequency combs (OFCs) are routinely employed in applications as diverse as time and frequency metrology, spectroscopy, telecommunications, and fundamental physics. The German company’s fibre-based systems, and its proprietary “figure 9” laser mode-locking technology, have set the precedent for the most stable, reliable, robust, and compact optical frequency combs available on the market today.

An optical frequency comb exploits laser light that comprises up to 10⁶ equidistant, phase-stable frequencies to measure other unknown frequencies with exquisite precision, and with absolute traceability when compared against a radiofrequency standard. The most common and versatile approach to create an OFC is to stabilize an ultrafast mode-locked laser, in which pulses of light bounce back and forth in an optical cavity. The frequency spectrum of the resulting pulse train is a series of very sharp peaks that are evenly spaced in frequency, like the teeth of a comb.

Through the so-called self-referencing technique, the offset of the entire comb – known as the carrier-envelope offset frequency – is fixed at a well-defined value. When the spacing of the comb teeth is referenced to a known frequency, such as the radiofrequency generated by a caesium atomic clock or a hydrogen maser, the absolute frequency of a light source can be accurately measured by interfering it with the nearest comb tooth and measuring the resulting beat note. The device thus provides a way of making very accurate spectroscopic measurements of atomic and molecular transitions, and offers a versatile and unique way of comparing atomic clocks.
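
In other words, the frequency of the n-th comb tooth is f_n = f_ceo + n × f_rep, where f_ceo is the carrier-envelope offset frequency and f_rep the repetition rate, both of which are radiofrequencies. The minimal sketch below shows how an unknown laser frequency can be recovered from a beat-note measurement; the numerical values and mode number are invented for illustration (in practice the mode number is pinned down with a coarse wavelength measurement):

    # Comb tooth frequencies: f_n = f_ceo + n * f_rep (both locked to an RF standard).
    f_ceo = 20e6          # carrier-envelope offset frequency: 20 MHz (example value)
    f_rep = 250e6         # repetition rate: 250 MHz (example value)
    n = 1_140_000         # mode number of the nearest tooth (example value)
    f_beat = 3.2e6        # measured beat note between laser and tooth n: 3.2 MHz (example value)

    f_tooth = f_ceo + n * f_rep
    f_laser = f_tooth + f_beat      # the sign of the beat is determined experimentally
    print(f"Unknown laser frequency: {f_laser / 1e12:.6f} THz")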

Nobel connections

The OFC was invented by Theodor Hänsch, who together with Jan Hall was awarded the 2005 Nobel prize in physics for its development. Hänsch, together with two of his students, Ronald Holzwarth and Michael Mei, and Alex Cable, founder and president of Thorlabs, set up Menlo Systems in 2001 to establish the technology as a turn-key device that could be used for a range of applications. The company soon received orders for OFC systems from two laboratories in Austria and Italy, and today all major metrology institutes around the globe own one or more Menlo Systems’ OFCs.

The company has continued to lead the way in terms of innovation, quality and performance, with its products used in ground-breaking research as well as in emerging industrial applications. Its flagship FC1500-ULNplus, the world’s most precise optical frequency comb, is now a key technology in the development of optical clocks – which are expected one day to replace the caesium atomic clock as the current standard for defining the second.

“Menlo has recently made relevant leaps that would have seemed impossible a few years ago for researchers in quantum optics, atomic and molecular physics,” says Michele Giunta, project leader at Menlo and a doctoral candidate in Hänsch’s group at the Max-Planck-Institut für Quantenoptik. “We have developed and conceived frequency combs synthesizing optical frequencies that mimic the sub-hertz linewidth of our ORS ultrastable laser, and it is even ready to support future sub-mHz linewidth lasers. In this way, every frequency generated by the OFC is phase-coherent with the optical reference and exhibits the same linewidth of the reference with a negligible additive noise.”

This technology is exploited in the FC1500-ULNplus, allowing the optical coherence of the reference to be transferred to all lasers locked to the comb, even for the narrowest linewidth laser demonstrated so far. “This product can be used to compare two or more different optical clocks, having the comb keep pace with the best one while at the same time avoiding hindering the comparison against the others,” explains Giunta. “It is also used for precision spectroscopy of hydrogen – which is a long-standing research effort led by Theodor Hänsch – to test quantum electrodynamics, making it possible to determine fundamental constants such as the Rydberg constant or the proton charge radius (Science 358 79).”

The role of the optical reference system, which comprises a continuous-wave (CW) laser locked to a stabilized high-finesse optical cavity, is to allow the comb to reduce the noise of each optical tooth down to the sub-hertz level. “By doing this we control the two degrees of freedom of the comb with very high bandwidth and unrivalled coherence,” explains Giunta. “In fact, we have recently demonstrated and reported the lowest noise synthesis for both optical and microwave carriers using such a system.”

Record breaker

In 2016, in a collaboration with academic partners at SYRTE, the National Metrology Institute of France – Yann Le Coq and Giorgio Santarelli, who is now at LP2N in Bordeaux – Menlo demonstrated that this technology can be used to synthesize microwaves with the highest spectral purity achieved to date.

“Menlo’s technology allows to transfer the frequency stability of the reference laser to the timing of the comb laser pulse train, and hence to a microwave frequency that is detected as the pulse repetition rate,” says Giunta. “The sub-hertz optical linewidth of the reference laser is translated into a microhertz linewidth in the microwave carrier, which is typically a harmonic of the pulse repetition rate.”
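As a rough back-of-envelope illustration (the numbers are assumed for the example, not Menlo specifications), ideal optical-to-microwave frequency division preserves fractional frequency stability, so the absolute fluctuations of a sub-hertz optical reference shrink by the division ratio when carried down to a repetition-rate harmonic in the gigahertz range.

# Back-of-envelope sketch: ideal optical-to-microwave frequency division
# preserves *fractional* frequency stability, so absolute frequency
# fluctuations shrink by the division ratio. All values are assumed.

f_optical = 194e12   # optical reference near 1542 nm (Hz), assumed
f_microwave = 10e9   # microwave carrier, a harmonic of the repetition rate (Hz)
df_optical = 0.5     # assumed sub-hertz fluctuation of the optical reference (Hz)

ratio = f_optical / f_microwave
df_microwave = df_optical / ratio

print(f"Division ratio: {ratio:.0f}")
print(f"{df_optical} Hz at the optical carrier -> "
      f"{df_microwave * 1e6:.0f} microhertz at {f_microwave / 1e9:.0f} GHz")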

This advance in ultralow-noise microwave generation could have a direct impact on applications such as high-speed data transmission, high-stability atomic clocks, synchronization of Very Long Baseline Interferometry (VLBI), radio astronomy and high-accuracy radar for surveillance and navigation systems.

“In Doppler radar and moving traffic indicator systems, for example, oscillator phase noise causes large stationary clutter to spread through the Doppler spectrum,” explains Giunta. “Consequently, the performance of radar systems is strongly affected by the phase noise characteristics of the microwave oscillator. Low close-in phase noise, at offset frequencies of 1 Hz to 100 kHz, is of critical importance for high-fidelity radar systems.”

Emerging applications

Other high-end applications of frequency comb technology include telecommunications, high-speed precision distance measurements (such as co-ordinating the positioning of satellite swarms) and the calibration of astronomical spectrographs, adds Richard Zeltner, head of business development at Menlo.

“We work very closely with research and metrology institutes to find out where different fields are heading and how we can serve them,” he says. “As today’s research can turn into tomorrow’s products, we are striving to develop reliable technology that fulfils the needs of upcoming high-end applications.”

Zeltner also points out that Menlo continues to make its established products more reliable and user-friendly. “Following more than 15 years of development we recently succeeded in reducing the footprint and improving the ease of use of an OFC,” he says. “The SmartComb is a simple and fully autonomous comb system that, together with our ORS ultra-stable laser, provides end-users with a rack-mountable solution for synthesizing sub-hertz linewidth laser light and extremely pure microwaves.” The SmartComb can easily be used by non-experts, which makes it accessible to a broader clientele.

But the company is aiming even higher, adds Zeltner. Since precise timekeeping and synchronization are both crucial for global navigation satellite systems (GNSS), there are ongoing efforts to qualify optical-clock technology for space applications. “As OFCs are an integral part of any optical clock, Menlo is undertaking R&D efforts in this direction as well,” he says.

A SmartComb was recently operated successfully onboard a high-altitude research aircraft, while a prototype comb was tested under micro-gravity conditions on a sounding rocket that was part of the DLR Texus programme. “We are not quite at a readiness level for in-orbit verification satellite missions or the ISS, but we are making continuous progress towards that goal,” says Zeltner.

Looking to the future

Several other potential applications are also appearing on the horizon; comb-assisted coherent control of multiple lasers is one. “This could be important for future ion- or neutral-atom-based quantum computers,” says Giunta. “These emerging technologies could prove useful in a multitude of areas and, again, the OFC will be of central importance for such an information technology revolution.”

OFCs are Menlo’s star products, with more than 300 systems currently in operation around the world, but its product portfolio also includes femtosecond lasers, ultrastable CW lasers and THz technology. The company today employs more than 100 people and has offices in Germany, the US and China.

Further reading

Menlo Systems has published a number of scientific papers on photonic microwave generation, the FC1500-ULNplus, and applications in the space arena:

M Giunta et al. 2019 “Real-time phase tracking for wide-band optical frequency measurements at the 20th decimal place” Nat. Photon.

E Oelker et al. 2019 “Demonstration of 4.8 × 10−17 stability at 1 s for two independent optical clocks” Nat. Photon. 13 714

X Xie et al. 2016 “Photonic microwave signals with zeptosecond-level absolute timing noise” Nat. Photon. 11 44

J W Zobel et al. 2019 “Comparison of Optical Frequency Comb and Sapphire Loaded Cavity Microwave Oscillators” IEEE Photonics Technol. Lett. 31 1323

M Lezius et al. 2016 “Space-borne frequency comb metrology” Optica 3 1381

The post Frequency combs shape the future of light appeared first on Physics World.

Electrons passing over nanophotonic materials could create synchrotron-like light

Vacuum fluctuations just a few nanometres from the surface of a material can cause a passing beam of relativistic electrons to emit X-rays and other high-frequency electromagnetic radiation — according to calculations done by scientists in the US, Israel and Singapore. If confirmed experimentally, the effect could be used to create compact and tunable sources of X-rays and even gamma rays.

Compact sources of high-quality electromagnetic radiation at X-ray and higher frequencies are difficult to make because electrons in materials cannot react quickly enough to electromagnetic field oscillations at these high frequencies. Instead, production of such radiation relies on strong external fields that accelerate beams of high-energy electrons in devices such as synchrotrons and free-electron lasers. While this approach has been extremely successful, it requires huge and expensive magnets or lasers, and can only be done at a handful of dedicated sites.

Now, Nicholas Rivera and colleagues at the Massachusetts Institute of Technology, Technion and Nanyang Technological University in Singapore, have proposed a new way of generating high-frequency light that does not require strong external fields. It involves a two-photon process whereby a relativistic free electron passing a few nanometres from the surface of a material spontaneously emits a high-energy photon and a polariton. The latter is a quasiparticle that forms as a result of the interaction and mixing of light with dipoles in a medium. This spontaneous emission is driven by vacuum fluctuations, a well-known phenomenon that arises because the vacuum is not empty space but has a non-zero energy density of the electromagnetic field.

Famous effects

“The most famous effects arising from vacuum fluctuations are the Casimir and van der Waals forces between neutral objects, the latter of which explains the ability of geckos to walk on ceilings”, adds Rivera. These so-called vacuum forces are especially strong in the nanoscale vicinity of nanophotonic structures.

Rivera and colleagues calculated the strength of this effect in optical materials such as graphene; thin films of gold and silver; and silicon carbide. In these materials polaritons are emitted at infrared or visible frequencies. Using doped graphene increases the strength of the van der Waals-type interaction, extracting more emission from the electron. The calculated photon emission is more broadband than in synchrotrons, spanning from soft ultraviolet to hard X-ray frequencies, explains Ido Kaminer of the Technion. Furthermore, the team calculated that the total emitted power is comparable to the power that equal-energy electrons would emit as synchrotron light induced by a 1 T magnetic field, a striking theoretical prediction.

The concept could be used to develop compact and tunable sources of high-frequency light, based on vacuum fluctuations near and inside nanophotonic materials. But what material will provide the desirable radiation characteristics?

Higher coherence and shorter pulses

Kaminer says: “We expect certain periodic patterns, as is now often done in metasurfaces, to result in higher coherence and shorter pulses”. Gigaelectronvolt electron beams could be delivered by a conventional linear accelerator, or perhaps in the future by a more compact laser wakefield accelerator. The framework could be extended to include more complex electron-beam configurations such as moving dipoles or bunched electrons – the latter is used in free-electron lasers.

Ultimately, by using the tools of nanophotonics, controlling high-frequency light emission may lead to creation of synthetic active nonlinearities at X-ray and even gamma-ray frequencies. Such nonlinearities could be used, for example, to create entangled pairs of X-ray photons via parametric down conversion.

Full details of the research are reported in Nature Physics, where the team also makes several suggestions for how the effect could be studied in the lab.

The post Electrons passing over nanophotonic materials could create synchrotron-like light appeared first on Physics World.

A 2030 UK energy plan

At this year’s Labour party conference, held just outside London in Croydon in October, a motion was adopted that called on the party to work towards “a path to net zero carbon emissions by 2030”. Labour then asked a group of independent energy-industry experts to identify a pathway to decarbonize the UK energy system by 2030. The resulting report, which was published in late October, is very detailed and quite radical. It identifies four overarching goals to transform the UK’s energy supply and use: reducing energy waste in buildings and industry; decarbonizing heat; boosting renewable and low-carbon electricity generation; and balancing the UK’s supply and demand.

Thirty recommendations were made to meet those goals. They include installing eight million heat pumps, as well as upgrading every home in the UK with energy-saving measures such as insulation and double glazing, focusing first on damp homes and areas with fuel poverty. The report also calls for the installation of 7000 offshore and 2000 onshore wind turbines, as well as solar panels covering an area of 22 000 football pitches, so tripling the UK’s current solar capacity.

“Delivering these thirty recommendations would make the UK a pioneer. No other industrialized country has plans of a similar scale,” the report notes. “The scope and pace of change would bring challenges, but also first-mover advantages and would avoid costly high-carbon lock-in for the country’s infrastructure.”

Keeping on track

Specific recommendations for early action include a vast expansion of offshore wind to 52 GW while onshore wind would increase to 30 GW and solar energy to 35 GW – all contributing to the 137 GW boost in renewable capacity. The report also calls for an urgent UK-wide programme to upgrade existing buildings to “significantly reduce energy wastage and a shift to low-carbon heat”. All new buildings would have to be net zero-carbon.

The plan is a maximalist programme of renewable expansion and energy-efficiency upgrades in all sectors. On the demand side, it aims to reduce the need for energy across the UK by a minimum of 20% for heat and a minimum of 11% for electricity, relative to current levels. On the supply side, offshore wind would supply 172 TWh by 2030, while onshore wind would contribute 69 TWh and photovoltaic solar 37 TWh. But there is also 63 TWh from nuclear — with 9 GW assumed to be in place by 2030 — as well as 32 TWh from gas, with 40 GW of power plants in use.
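As a quick consistency check (my own arithmetic, not a figure from the report), those capacities and outputs imply plausible average capacity factors for each technology:

# Rough consistency check (not from the report): implied capacity factors from
# the quoted 2030 capacities (GW) and annual outputs (TWh).

HOURS_PER_YEAR = 8760

targets = {
    "offshore wind": (52, 172),   # (capacity in GW, annual output in TWh)
    "onshore wind":  (30, 69),
    "solar PV":      (35, 37),
}

for tech, (gw, twh) in targets.items():
    capacity_factor = (twh * 1e3) / (gw * HOURS_PER_YEAR)   # GWh / (GW x h)
    print(f"{tech}: implied capacity factor of about {capacity_factor:.0%}")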

For the longer term, there would be significant investment in research and development for marine energy — up to 3 GW of tidal — and renewable or low-carbon hydrogen for heating and energy storage, along with carbon capture and storage for some heavy industries (2.5 GW). The aim is that by the late 2020s “these emerging technologies can be deployed, alongside current technologies such as nuclear, to the appropriate scale”.

That could be read as a hint of support for small modular reactors to keep nuclear at its current level, and also for fossil-gas steam methane reforming for hydrogen production. But the report also mentions the electrolysis “power to gas” route: using green power to make hydrogen. Interestingly, it sees solar providing about 6% of UK heating, with biomass contributing 5%. Yet it recommends not expanding solid-biomass use for large-scale electricity generation. So, no more Drax-type imported wood-pellet plants.

Based on the proposed programme, the report claims that the UK could be on track to deliver a 77% reduction in energy emissions by 2030 compared to 2010 levels. It adds that, looking further ahead, zero-carbon electricity “could potentially be anticipated as early as 2034-2040, and zero-carbon heating [from] 2036-2040”.

Social benefits

The report says that there would also be substantial social benefits from the plan. By 2030, its recommended investment in the energy sector would lead to a net benefit to the economy of £800bn and create 850 000 new skilled jobs in the green industry. “The UK would build a unique skill and knowledge base supporting the kind of transition that many other countries will need to go through, providing a huge opportunity for the UK to demonstrate industrial leadership,” the report says.

The report adds that upgrading the housing stock has the potential to end the fuel poverty currently affecting 2.5 million UK households. By 2030, these measures could also mean 565 000 fewer cases of asthma by helping to alleviate damp. Replacing fossil fuels with renewable energy could also improve air quality, resulting in 6200 fewer respiratory-related deaths each year by 2030. Overall, the report says, the benefits to public health could potentially save the UK’s National Health Service £400m each year.

The proposals received a clean bill of health as technically credible from a range of academics and experts in the field, and the sense of urgency was welcomed by Greenpeace. The detailed plan certainly does look interesting. However, there are some issues. It doesn’t back district heating networks (DHNs), at least not yet. That is a change from traditional Labour stances, which have seen urban DHNs — along with combined heat and power plants — as important elements.

The report says the deployment of DHNs would be constrained by the proposed building retrofit programme, which will result in a “drop in heat demand by on average 25% which will further reduce the already marginal returns for DHN operators”. Yet DHNs are a good, flexible infrastructure investment, able to use any heat source that comes along. And it has been claimed that for high rises, for example, plugging on to a DHN can offer lower-cost carbon savings than often tricky-to-install and potentially risky retrofitted insulation.

Nuclear retention

The report assumes that, in its 90% low-carbon mix for 2030, nuclear capacity stays at the current level. But it also says it is “entirely possible to meet the 90% target without any new nuclear capacity”, though that would be “more challenging” due to the loss of low-carbon baseload and the increased use of variable power. So, the report notes, more grid balancing would be needed via storage, interconnection, demand management or fossil-fuel back-up. Though it adds, “the system will also benefit from cheaper generation technology such as wind & solar”.

The retention of nuclear is controversial. For example, far from helping to balance variable renewables, having nuclear on the grid may make the balancing problem harder to deal with, since it is inflexible. By contrast, it can be argued that an increase in variable renewable capacity could reduce the balancing problem. There would be more green power available more often to meet peaks, thus reducing balancing needs, and also an increased amount of surplus power at times, expanding the potential for power-to-gas electrolytic conversion to hydrogen. That hydrogen could be stored and used to generate power again when there was a lull in renewable power availability.

This is a good report that faces up to many of the issues, nuclear apart. Yet the new set of proposals avoids detailed programme costing. That will be up to the Labour party to deal with if it adopts this plan. Rebecca Long Bailey, Labour’s shadow business and energy secretary, welcomed the report, saying that it is “a major contribution to Labour’s plans to kickstart a Green Industrial Revolution”. Given that an election manifesto is now imminent, we will have to wait and see what is included in it.

The post A 2030 UK energy plan appeared first on Physics World.

Female chemists hit by ‘significant disadvantage’ when publishing their research

Gender biases exist at “each step” of the publication process in chemistry publishing. That is according to a 30-page report, Is Publishing in the Chemical Sciences Gender Biased?, released on 5 November by the Royal Society of Chemistry (RSC). The report, which examines the diversity of authors, referees and editors of RSC journals, finds that while these biases appear minor in isolation, their combined effect puts women at a significant disadvantage when publishing their research.

The RSC publishes more than 40 peer-reviewed journals, corresponding to around 35 000 articles each year that, while focussed on chemistry, also cover fields such as biology, materials and physics. The study looked at the gender of authors in over 717 000 manuscript submissions between 2014 and 2018, as well as over 141 000 citations between RSC journals from 2011 to 2018. The gender of authors was assigned from their names using the same method as the UK Intellectual Property Office used in its report on gender in UK patenting.

Call to action

The analysis found that women are less likely than men to submit their work to journals that have a high impact factor and are more likely to have an article rejected without review. The report notes that women are also less likely to hold positions towards the end of the author list, in particular that of corresponding author. Indeed, while around 36% of RSC authors are female, only 24% of submissions have a woman as corresponding author.

When it comes to peer review, the report finds that women are under-represented as reviewers but are more likely to be chosen to review articles that have female corresponding authors. The report also states that women cite fewer research papers than men overall, and men are less likely than women to cite papers authored by women. Indeed, only 18% of cited papers have a corresponding author who is a woman.

To tackle such biases, the report offers four areas for action. This includes publishing an annual analysis of authors, reviewers and editorial decision makers in each subdiscipline as well as recruiting reviewers, journal board members and associate editors to match the current gender balance in chemistry — with a target of 36% being women by 2022. The report also calls for more editorial training to eliminate biases and for the RSC to collaborate with other publishers to boost diversity and inclusion in the industry.

“We were surprised by some of the findings, which included a number of cases where women said they felt less confident submitting to a journal because they feel they might not meet the criteria for publication, while men may be more likely to take the risk,” says RSC publishing director Emma Wilson. “While these issues don’t just apply to the chemical sciences, as an organisation there is absolutely no point telling others they need to change unless you’re willing to do so yourself. In analyzing our journal peer review processes, committing to increase female representation within the publication process and annually reporting on our progress toward gender equality, we are aiming to raise the bar.”

The post Female chemists hit by ‘significant disadvantage’ when publishing their research appeared first on Physics World.

Douglas Trumbull: a mutual appreciation between scientists and moviemakers

Douglas Trumbull
Parallel thinking: Douglas Trumbull turns abstract ideas into visual form – just as a scientist can. (Courtesy: Prod DB/Trumbull Studios/Alamy Stock Photo)

Douglas Trumbull has spent more than 50 years at the technological cutting edge of moviemaking – from his iconic special effects for 2001: A Space Odyssey (1968), The Andromeda Strain (1971), Close Encounters of the Third Kind (1977), Star Trek: the Motion Picture (1979), Blade Runner (1982) and The Tree of Life (2011), to his directorial work on the cult classics Silent Running (1972) and Brainstorm (1983). Now 77, the moviemaking legend admits he is “a complete outlier and weirdo relative to the entrenched motion-picture industry”. Indeed, he says he feels “much better understood by scientists and mathematicians than by studio executives”.

Trumbull’s career began on the short film To the Moon and Beyond (1964), which transports the viewer from Earth out to the entire universe before zooming back down to the atomic scale. Recorded at 18 frames per second using a fish-eye wide-angle lens on 70 mm film – a technique dubbed Cinerama 360° – the movie was projected onto a domed exhibit at the New York World’s Fair in 1964. Impressed by the special effects, director Stanley Kubrick hired Trumbull to work on 2001.

Since the 1980s Trumbull has been based in rural Massachusetts, where his company Trumbull Studios is experimenting with new ways of making and showing films. These include a prototype 70-seat “pod” featuring advanced digital-projection technology as well as a slightly curved, torus-shaped cinema screen. “The work that I’m doing is predicated upon a belief that there’s an intrinsically, vitally important link between the medium itself and the movie experience you can deliver to audiences,” Trumbull says. “Kubrick was very conscious of this, he talked to me about it a lot. That has stayed with me forever.”

Trumbull’s entire career has been a learning curve. Or, as he puts it, “a hybrid of science, technology, and the drama and art form of movies”. It has also given him a unique perspective on the worlds of filmmaking and science, and the way that practitioners in one area can learn from those in the other. He cites, for example, his work on Terrence Malick’s epic The Tree of Life, which includes a 17-minute “creation” sequence showing the birth of stars and evolution of life on Earth.

“Terry is very scientific and well studied,” Trumbull says, pointing to Malick’s ability to understand, say, the fluid dynamics of two galaxies colliding. “He would go to a supercomputing lab at a major university and see if they had something like that. And sure enough, there were experimental movies made by weighting each star with a certain gravitational pull and having them interact in a scientifically valid way.”

Unfortunately, those simulations were, says Trumbull, “weirdly underwhelming, because they were only as good as the mathematical algorithms”, which prompted him and Malick to try something new. “By using real fluids in a real liquid, or real gases and explosive lights – and filming that with high-speed cameras at a thousand frames a second – we would find much more intuitively natural-looking effects than anything we could create with a computer.”

That ability to turn abstract thoughts into visual form is a skill that filmmakers share with scientists. For instance, Trumbull is about to start making a film about the engineer and inventor Nikola Tesla. “One of the aspects of Tesla’s nature was that he saw everything in his mind long before he manifested it in reality,” says Trumbull. “He could see things so vividly that, when he was developing some idea, he didn’t draw or build anything until his mind had completed the project – pre-visualized it. And this enabled him to understand more completely how one magnetic field would interact with another to create the Tesla coil, for example.”

At the same time, Tesla’s story provides a cautionary note about the limits of visualization, Trumbull warns. “In the later years of his life Tesla made some significant mistakes by believing that what was imagined in his mind was always accurate. He believed he could scale up a Tesla coil to the Wardenclyffe Tower [an experimental wireless station in New York] and transmit energy around the world, which I don’t think was ever going to work.”

One intriguing collision between the make-believe world of movies and real life is Blade Runner. Trumbull’s visual effects helped give the film its famously futuristic feel, but the story’s setting – Los Angeles, November 2019 – is no longer the distant future; it’s now. While not everything predicted in that film – flying cars, human-like robots and 3D holographic billboards – has been realized, “a lot of Blade Runner has come true but in a different form”, Trumbull says, pointing to the growing impact of artificial intelligence (AI). “AI is one of the most compelling topics of discussion in our society – and it’s evolved over time in such a way you don’t notice.”

Significantly, Blade Runner predicted a shift in our relationship with technology. “The resistance exerted by the AI beings – against the limit in their lifespan – is very much like 2001,” says Trumbull, pointing to the fictional crew’s decision in that movie to shut down the HAL computer. “The dying of HAL was really tragic. The fact that the audience could empathize with HAL more than they could empathize with the human characters in the movie is really telling. I think we’re going to see more and more of that. You’re going to feel bad when your vacuum cleaner dies.”

The post Douglas Trumbull: a mutual appreciation between scientists and moviemakers appeared first on Physics World.

Can conventional X-ray tubes deliver FLASH dose rates?

The researchers
Magdalena Bazalova-Carter and Nolan Esplen with the X-ray tube. (Courtesy: Magdalena Bazalova-Carter)

The curative potential of radiation therapy is limited by normal tissue toxicity, which restricts the dose that can be delivered to a tumour without harming nearby non-target tissue. Recently, interest in ultrahigh dose rate radiotherapy has been rekindled, following preclinical studies showing increased normal tissue tolerance at high dose rates. This approach – known as FLASH radiotherapy – employs dose rates exceeding 40 Gy/s and can improve the therapeutic ratio by increasing the differential response between normal and tumour tissues.

The expectation is that FLASH could one day provide tumour ablation in a single sub-second treatment, while substantially reducing radiation-induced side-effects. The underlying mechanism, however, is not yet understood and requires further research. And to date, FLASH dose rates typically require specialized electron sources or substantial modifications to clinical linacs.

“Access to FLASH beamlines is limited and the progress on understanding of the FLASH mechanism is quite slow,” explains Magdalena Bazalova-Carter from the University of Victoria’s XCITE Lab. To remove this constraint, Bazalova-Carter and PhD student Nolan Esplen are investigating the feasibility of using a conventional high-powered X-ray tube for FLASH radiotherapy (Med. Phys. 10.1002/mp.13858).

“Our results will hopefully inspire researchers without access to high-power electron sources or proton beamlines to perform FLASH in vitro, and possibly some limited in vivo experiments, with a standard X-ray tube,” says Bazalova-Carter.

The researchers used Monte Carlo (MC) modelling to evaluate the maximum dose rates achievable by two conventional X-ray tubes: the 3 kW MXR-160/22 (which was being validated in the XCITE lab) and the 6 kW MXR-165, which benefits from a short distance from the focal spot to the tube surface. For both tubes operating at maximum power, they simulated the output of an unfiltered 160 kV beam and calculated the dose deposited in a water phantom placed against the tube surface. They then converted the MC-calculated dose to dose rate.
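Converting a Monte Carlo dose tally into an absolute dose rate is essentially a matter of scaling by the number of electrons striking the anode each second at full tube power. The sketch below illustrates that scaling; it is not the authors’ code, and the dose-per-electron figure is an assumed placeholder rather than a value from the paper.

# Hedged sketch (not the authors' code): scaling a Monte Carlo dose, scored
# per source electron, into a dose rate at full tube power. The dose-per-
# electron value is an assumed placeholder, not a number from the paper.

ELECTRON_CHARGE = 1.602e-19   # C

tube_power = 3000.0           # W, e.g. the 3 kW MXR-160/22
tube_voltage = 160e3          # V, the unfiltered 160 kV beam

beam_current = tube_power / tube_voltage             # A (about 19 mA)
electrons_per_second = beam_current / ELECTRON_CHARGE

dose_per_electron = 1.0e-15   # Gy per source electron at the phantom surface (assumed)
dose_rate = dose_per_electron * electrons_per_second  # Gy/s

print(f"Beam current: {beam_current * 1e3:.1f} mA")
print(f"Surface dose rate: {dose_rate:.0f} Gy/s")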

The simulations revealed that both X-ray tubes were FLASH-capable, with maximum phantom surface dose-rates of 114.3 and 160.0 Gy/s, for the MXR-160/22 and MXR-165, respectively. Dose non-uniformity due to the heel effect – an inherent directional variation in the X-ray intensity emitted by the anode – was seen in both cases. For a 1 cm diameter region-of-interest within the high-dose region, dose rates were 110.6 Gy/s for the MXR-160/22 and 151.9 Gy/s for the MXR-165.

Plotting dose rate versus depth revealed a rapid fall-off for both 160 kV X-ray beams. At 2 mm depth, for example, dose rates decreased to 23% and 28% of that at the surface, for the MXR-160/22 and MXR-165, respectively. The dose rate remained FLASH-capable at depths of up to 1.4 and 2.1 mm, for the MXR-160/22 and MXR-165, respectively.

To validate their MC models, Bazalova-Carter and Esplen measured the dose in a plastic water phantom irradiated with a 120 kV beam from the MXR-160/22. They placed Gafchromic EBT3 films at 15 and 18 mm depth in the phantom and compared the measured 2D dose profiles with those from MC simulations of the 120 kV beam.

Dose distributions
Experimental and simulated 2D dose distributions in the solid water phantom (a). Comparison of dose profiles at the two depths (b). (Courtesy: Med. Phys. 10.1002/mp.13858)

In the region not affected by the heel effect, the simulations agreed well with the film measurements. The mean x-profile differences between experiments and simulations were 1.5% and 3.2% at 15 and 18 mm depth, respectively; the mean y-profile differences were 1.5% and 3.5% at the same depths. Agreement in the heel-effect region was poorer, however, with a mean difference of up to 17.8% along the x-profile.

This validation lends confidence to the simulations, which indicate that conventional X-ray tubes can deliver FLASH dose rates. The researchers suggest that these particular tubes could be suitable for FLASH skin irradiations, in vitro experiments or testing the dose-rate dependence of small-field dosimeters.

They are now working to further tailor the X-ray tubes for FLASH applications. “We are currently building a shutter mechanism that will be inset in the X-ray tube, which will further increase the dose rate,” Bazalova-Carter tells Physics World. “We are also designing experiments to test cell survival for FLASH irradiations with and without gold nanoparticles.”

The post Can conventional X-ray tubes deliver FLASH dose rates? appeared first on Physics World.

Wearable MEG scanner used with children for the first time

The human brain undergoes significant functional and structural changes during the first decades of life, as the fundamental building blocks of human cognition are established. However, relatively little is known about maturation of brain function during these critical times. Non-invasive imaging techniques can provide information on brain structure and function, but brain scanners tend to be optimized for adult head-sizes. Traditional fixed scanners also require patients to stay completely still, which can be highly challenging for children.

A UK research collaboration aims to solve these problems by creating a wearable magnetoencephalography (MEG) system that allows natural movement during scanning. They have now used the wearable MEG for the first time in a study with young children (Nature Commun. 10.1038/s41467-019-12486-x).

The researchers, from the University of Nottingham, the University of Oxford and University College London, developed a lightweight ‘bike helmet’ style MEG scanner and used it to measure brain activity in children performing everyday activities. As well as enabling studies of neurodevelopment in childhood, this system should allow investigation of neurological and mental health conditions in children, such as epilepsy and autism, for example.

MEG measures the small magnetic fields generated at the scalp by neural current flow, allowing direct imaging of brain activity with high spatiotemporal precision. Traditional MEG systems use an array of cryogenically-cooled sensors in a one-size-fits-all helmet. Such systems are bulky and highly sensitive to any head movement.

To address these issues, the team is using optically pumped magnetometers (OPMs) to measure the magnetic fields generated by the brain. These small, lightweight sensors can be positioned on a 500 g helmet that can adapt to any head shape or size. The OPMs can also be placed far closer to the head than conventional sensors, increasing their sensitivity. The researchers also employed an array of electromagnetic coils to null the residual static magnetic field inside the magnetically shielded room, allowing individuals to be scanned whilst they move freely.

“The initial prototype scanner was a 3D printed helmet that was bespoke – in other words only one person could use it. It was very heavy and quite scary to look at,” explains PhD researcher Ryan Hill who led this latest study. “Here, we wanted to adapt it for use with children, which meant we had to design something much lighter and more comfortable but that still allowed good enough contact with the quantum sensors to pick up signals from the brain.”

The researchers designed and built the new bike-helmet style scanner and used it to successfully analyse the brain activity of a two-year-old (typically the hardest age to scan without sedation) and a five-year-old watching TV whilst their hands were being stroked by their mother. The children were able to move around and act naturally throughout.

To show that the MEG system is equally applicable to older children, the researchers also used it with a larger helmet to scan a teenager playing a computer game. Finally, they used the new scanner to examine brain activity in an adult learning to play a sequence of chords on a ukulele. Despite the substantial head and arm movement required to complete this task, clear electrophysiological responses were observed.

“This study is a hugely important step towards getting MEG closer to being used in a clinical setting, showing it has real potential for use in children,” says Matthew Brookes, who leads the MEG research at the University of Nottingham. “The challenge now is to expand this further, realising the theoretical benefits such as high sensitivity and spatial resolution, and refining the system design and fabrication, taking it away from the laboratory and towards a commercial product.”

The researchers conclude that their study demonstrates that the OPM-based MEG system can generate high quality data, even in a 2-year-old child, and can be used to measure brain activity during naturalistic motor paradigms. “OPM-MEG, with generic helmet design, paves the way for a new approach to neurodevelopmental research,” they write.

The post Wearable MEG scanner used with children for the first time appeared first on Physics World.

Air-quality regulations shown to lower traces of airborne transition metals

When taking a deep breath you draw a range of gases into your lungs, from oxygen and nitrogen to carbon dioxide and argon, along with traces of water vapour. But that same breath could also contain microscopic amounts of copper, iron, zinc and even chromium. Despite making up a tiny fraction of the pollutants in the air, transition metals have some of the most damaging impacts on our health.

A study has now assessed the abundance of various airborne transition metals in urban areas across the US. While overall particulate pollution has decreased over the past two decades, particularly in urban areas, some cities have nevertheless seen a rise in the amount of airborne transition metals. By studying the trends, researchers are beginning to pinpoint the likely sources of various metals and how their emissions can be better controlled.

Diluting concentrations

Clean air is a staggeringly good investment. Since 1990, the US has spent an estimated $65bn on implementing the 1990 Clean Air Act but gained $2 trillion in benefits. According to the Environmental Protection Agency, this year alone the act has prevented around 230 000 early deaths, avoided 120 000 emergency room visits, and stopped 5.4 million sick-days in schools and 17 million sick-days at work.

Particulate pollution is responsible for some of the worst health and economic impacts of air pollution, with transition metals believed to be more damaging than other compounds. That is because the metals act as catalysts and help to produce oxidants, which can lead to oxidative stress. “Oxidative stress has been linked with the genesis and progression of many different diseases – it’s why there is so much research and marketing for foods that contain antioxidants,” says Christopher Hennigan, an environmental engineer at the University of Maryland, Baltimore County.

Hennigan and colleagues analysed seven different transition metals over the period 2001 to 2016, across ten different urban areas — Atlanta, Baltimore, Chicago, Dallas, Denver, Los Angeles, New York City, Seattle, St Louis and Phoenix. They found that around a decade ago concentrations of nickel and vanadium in port cities were around five times higher than in non-port cities, but that the difference has now all but disappeared. “The reductions in port cities were most likely from regulations on marine fuel sulphur content,” explains Hennigan.

The strong downward trend in vanadium across all urban areas clearly matched the introduction of diesel-fuel regulations in 2006. Copper, meanwhile, has stayed stubbornly constant in most areas. “Our results suggest that vehicle brake-lining dust is a major source of copper,” says Hennigan. The team also found higher concentrations of iron in western cities (by around a factor of two) than in cities in the east, most likely because soil and dust are major sources of iron and prevailing winds cross more land and carry more dust to western cities.

One puzzle, however, was chromium, which increased in cities in the east and midwest, with a distinct spike in 2013. “We don’t have a good explanation for this which indicates a gap in our understanding of chromium sources and their magnitude,” says Hennigan.

The findings confirm how beneficial air-quality legislation has been for the US. They also make a strong case for continuing to improve air quality, with Hennigan believing there are still big gains to be made. “There is a strong body of scientific research that shows that a transition to more sustainable energy sources will have co-benefits in air quality,” he says.

The research is published in Environmental Research Letters.

The post Air-quality regulations shown to lower traces of airborne transition metals appeared first on Physics World.

Daniel Radcliffe: VFX tricks and wizardry

Daniel Radcliffe on the set of Horns
Fantastic beast: Daniel Radcliffe (right) on the set of Horns with director Alexandre Aja. This film used prosthetics and practical effects as well as VFX. (Courtesy: Red Granite/Mandalay/Kobal/Shutterstock)

Jess Wade: You have been in a bunch of films that use VFX in the most progressive and creative ways. What was it like starting your acting career with the extraordinary VFX in the Harry Potter films [2001–2011]?

Daniel Radcliffe: For some of the experienced actors on Potter, it was their first time working with VFX on that kind of scale. It was different for us kids. Telling us that “the dragon is this tennis ball on the end of the stick” is a little different from giving an older actor that instruction – we’d never known anything different. And we were all kids, so using our imagination was something that we were doing a lot anyway.

JW: Has VFX changed how you act?

DR: I don’t think so – it’s always been a big part of my career. I enjoy the challenge of it. I think I’m weirdly good at following numbered cues now. I remember when they shot all the audience reactions during the Tri-Wizard Tournament [in the fourth film, Harry Potter and the Goblet of Fire (2005)], and there would basically be a bunch of the cast and background artists on a big stand – sometimes on a green screen, depending on what the backdrop was. Assistant directors would hang big numbers around the studio and just say, for example, “1” so everyone would turn to the same eye line at the same time.

JW: The Harry Potter films ended eight years ago, and you’ve done some really exciting things with VFX since then. Has it changed a lot?

DR: Potter came at a time when people were leaning heavily toward visual effects and away from “practical” make-up or special effects. Even though, of course, we had plenty of them too. In the last couple of years, we’ve reached a nice balance – where big franchises like Star Wars and Mad Max use a lot of practical stuff in creature effects and stunt work. People see the value of having practical, on-set effects, but VFX are so good. It can also make stunt work safer because you don’t have to put a human being through what you can get VFX to do.

But certainly, VFX is improving at an extraordinary rate. If you were to look at the difference between the first and last Potter films [in 2001 and 2011] – they get exponentially better over time.

JW: How does working with all that VFX compare to stage acting?

DR: I think that’s the joy of my job – I’ll do some films where there’s almost no VFX whatsoever, then I’ll do films like Swiss Army Man [2016] where it’s a crazy mix of VFX and old-school practical stuff such as camera tricks. There was one scene in that film where my character gets punched in the mouth, then swallows the hand that punches him… and punches himself in the stomach to make the hand that’s in his mouth get forced back out. I wondered “how are we going to do that?”. There was no VFX involved – it was entirely clever camera angles and a bit of make-up on the arm to make it look like it was covered in spit. It’s wonderful to be able to flit between those things – the very low-fi and the highly sophisticated ways of solving problems on film.

JW: Do you ever get involved with VFX? Do you go and see what they’re doing?

DR: The closest you get on set is when the film’s big enough to do previs [previsualization] sequences – like an animated storyboard that no-one else ever sees. For example, when there was a big quidditch sequence on Potter, they’d have that all mapped out on a visual storyboard first, and we’d try and stick to that when we filmed. But the majority of the time, the VFX is in post-production, when the actors aren’t around.

JW: But sometimes you go in to do that funny thing – what’s it called – ADR?

DR: Yeah, ADR – additional dialogue recording. At that point you might see some sequences with half-finished VFX – and that’s always cool; it’s always fun to see it in a primitive phase. For someone who is interested in how films get put together it’s kind of fascinating. In this rough cut of the film there will be shots like, if you did a driving sequence on a green screen, they’ll just show the shot on a green screen with a little caption saying “VFX needed”.

When films started using huge sets that were just entirely blue screen and VFX, I think actors were a bit whiney about it – there’s something about being on a bright blue or green screen that can drive you slightly insane. At first it was something to be remarked upon, but now it is so much part of the industry – I don’t think anyone sees it as a novel thing anymore.

JW: What’s your favourite example of VFX that you’ve worked with?

DR: That’s really hard. There are some amazing sequences in Potter – there is some really beautiful stuff. The Hall of Prophecy in [the fifth film, Harry Potter and the] Order of the Phoenix [2007] was almost entirely green screen if I remember rightly.

And then in Horns [2013], when my on-screen brother took some hallucinogenic drugs and had this really visual trip – that’s a really good mix of practical prosthetics, VFX and tricks the designers built into the sets.

There’s also the other side of VFX, which is less glamorous but even more useful. Like driving sequences – when you’re filming in a place where you can’t shut down roads, you have to do it on green screens. Then there’s patching up a prosthetic. Sometimes things look fantastic when they’ve been put on at 9 a.m., but when you’ve been wearing it for 10 or 11 hours, visual effects can be helpful for polishing up that stuff.

JW: What has been the most ridiculous thing that you had to work with?

DR: None of it feels too ridiculous at the time. The hippogriff [a magical creature that’s part eagle, part horse] in [the third film, Harry Potter and the] Prisoner of Azkaban [2004] – the reality of the hippogriff and the flight of it was quite funny. If you imagine a limbless, headless bucking bronco…

JW: [descends into laughter] Like…a mechanical thing?

DR: Yeah, a mechanical bucking bronco on hydraulics. Just a grey torso with no texture, filmed on a blue screen and a green screen with a motion control camera.

JW: [can’t stop laughing] But you were all kids! I imagine when one 14-year-old starts laughing, everyone starts laughing.

DR: Sure, there would be an element of that. Thankfully, for the hippogriff sequence I was on my own at the start – so I’d got used to it. Of course, it also feels slightly strange when you mark it through for the first time if you’re acting alongside something like a tennis ball, but you get used to it.

JW: Is it weird to watch yourself after you’ve been VFX-d?

DR: It’s not weird so much as it is cool! It’s satisfying and really fascinating to see the finished product all put together, after having seen it at its most basic stages.

JW: Have you had experience with any cool VFX technologies?

DR: On Potter there was something called cyber-scanning. You’d stand in the middle of around 30 cameras and a computer would make a 3D map of you. And you know, as a kid, I had to be very still for a long time. They also had to keep doing it for every film because us kids were growing up.

JW: What did they use that for?

DR: If there’s a scene where you’re being thrown around in a crazy way – or you’re falling from a broom or something – and they didn’t want to do it with a stunt man. They use the cyber scan to recreate a digital version of you.

JW: It’s kind of cool but also intimidating. I think I’d hate to have 30 cameras pointing at me from all different angles.

DR: Yeah, for sure, it’s weird. You don’t just sit there either – you sometimes have to make expressions. There will be six or seven “first do a neutral face, then do smiling, then smiling with teeth, then surprised, then scared…” – so you have to make slightly caricatured versions of facial expressions. It’s one of the weirder parts of my job – but I enjoy all of those parts of my job!

JW: Does it feel like there’s a movement in the film industry to go back to more old-school techniques, away from VFX?

DR: Maybe a little bit. If you go to one of J J Abrams’ sets for the new Star Wars films there are lots of practical prosthetics, make-up effects and creatures – it’s really cool. It’s one of the things people love about the films that he has made.

The directors of Swiss Army Man, Daniel Kwan and Daniel Scheinert, love doing stuff practically. There are sequences in the film where we’re attacked by a bear, and there is no safe or practical way of doing that really, and we didn’t have the money that The Revenant [2015] had to do a bear attack. But Dan Kwan has a VFX/animation background and knew how to film things to make the VFX easy – there are tricks.

People used to say they didn’t want movies to look like video games – but video games look incredible at this point in time, so it’s not really a valid criticism anyway anymore. I don’t think we’ll ever get to a point where we completely do away with human actors and have entirely VFX movies – though there is a place for those movies right now, and they’re awesome.

You see how people respond to films like Mad Max: Fury Road [2015], which had a lot of practical stunts, the crazy cars – that was all real. But it was coupled with a tonne of VFX – removing wires, stunt harnesses. I think the industry has got to a point where we realize the value of both and find a compromise between the two.

JW: When you think about your career – of course you think about acting, but increasingly producing and directing – do you see yourself getting more involved with VFX?

DR: Depending on what level of VFX is in the film, VFX teams work very closely with the director. I think it’s really important to work with people you get on with and who understand the vision of the film. I cannot overstate how important that relationship is – the VFX team can really bail you out of stuff. On Guns Akimbo [2019] there was a lot of VFX, and we had a very chill, cool VFX co-ordinator called Tony [Kock] – and whenever there was a problem on set we’d say, “Hey Tony, can you fix that?” and he’d be like, “Yeah, that’s fine.”

JW: When you find someone like that do you not just want to ask them a tonne of questions about the technical parts of it?

DR: I do, but it’s like when I ask you about physics – I can only understand so much.

JW: Talking of physics, it’s not often we have a film star in Physics World. If you played a physicist who would you be?

DR: I will reverse the question: who would you cast me as?

JW: Paul Dirac would be great. Remember we read that great book about him [Graham Farmelo’s The Strangest Man]. But I want to know more about whether you like physics?

DR: I was always excited by space but there was way too much maths in it for me to ever feel truly at home. I’m interested in it now though – absolutely. You know I always watch science shows and listen to podcasts. I guess I’d say I’m an enthusiast but I’m not informed. Maybe I got it from my teachers at school and my tutors on set. Even though I wasn’t great, they got me interested. But I think pretty much across the board, every subject I didn’t think I was good at when I was at school, I’m fascinated by now. I’m fascinated by mathematics. I don’t understand anything about mathematics, but I love hearing people talk about it. It blows my mind.

The post Daniel Radcliffe: VFX tricks and wizardry appeared first on Physics World.

Why fireworks are so important to science

Fireworks
Spectacular connection: Fireworks have long linked science with the public and the state. (Courtesy: Jan Bogumil Plersz (1732–1817))

Fireworks are essential to many of today’s celebrations – from national holidays and sporting events to musical concerts and the gatherings held on Bonfire Night (5 November) in Britain each year. Once upon a time, though, fireworks were serious scientific business. Designing the rocket and preparing the propellant and coloured fire required, after all, a detailed knowledge of chemistry and physics (see “Whizz-bang science” by Pierre Thebault, December 2018). Mounting effective firework displays required other skills too, including architecture, artillery, ballistics and even poetry.

Fireworks, it turns out, also played a critical role in the complex and evolving relations between science, the public and the state. That, at least, is the intriguing argument in a book by Simon Werrett, a science historian at University College London, entitled Fireworks: Pyrotechnic Arts and Sciences in European History (University of Chicago Press 2010). Previous histories of fireworks had ignored this connection. As Werrett put it, the true history of fireworks has “gone up in smoke”. His book brings it back.

Up in smoke

Fireworks originated in China, where by the 12th century they were routinely used in public spectacles. Werrett, though, focuses on the European story, which started in around the 14th century, when gunners began to develop a new genre of spectacle – “artificial fireworks” – for a general audience. The spectacles were called “artificial” because they were specially crafted for non-military purposes, and “fireworks” because they used gunpowder to produce fiery effects. The people who made the fireworks, meanwhile, were known as “artificers” and worked in spaces called “laboratories” (a name also used by alchemists) well before the modern scientific use of the term.

The first grand firework display over the Thames took place in 1613. Indeed, in his novel New Atlantis, published in 1627, one of the crucial tasks that the philosopher and statesman Francis Bacon assigned the scientists in his utopian world was to produce fireworks. Fireworks were on the way to becoming an important undertaking of nations, not so much because they demonstrated knowledge of military capital such as explosives and rockets, but because they symbolized power and authority.

“In a world without electric light,” Werrett writes, “fire was a powerful medium, a source of light and heat whose divine and magical connotations were strong”. Indeed, the ability to control, tame and exploit fire in spectacular and artistic displays seemed to demonstrate an ability to bring the divine and celestial down to Earth and under human control.

By the end of the 17th century, fireworks had become an important element in public displays and extravaganzas in several European states. Monarchs gave resources to those who could manufacture them and stage their displays, and supported the institutions where they worked. Fireworks makers were encouraged to invent new and more dramatic effects, fostering a culture of innovation. Power and prestige came to those who could successfully innovate.

One of Werrett’s unusual stories involves the quest to create green fireworks. While artificers could produce most colours, green was difficult and in the early 18th century the ability to produce it became the subject of quests at Imperial courts – rather like the modern hunt for blue light-emitting diodes. Scientists at the St Petersburg Academy of Sciences eventually succeeded, and for a time were able to keep their knowledge a trade secret. The Russians, typically, attributed the discovery of green fireworks to Peter the Great himself. But the key breakthrough occurred at the St Petersburg Academy, when its scientists began treating fireworks as based on a chemical rather than a mechanical process.

In the 17th and 18th centuries, Werrett writes, Britain, France, Italy, Russia and other nations sought to outdo each other in the grandeur and scale of the fireworks displays they staged, with the manufacture of fireworks serving to promote science. How precisely this occurred depended on local conditions. At the time of the restoration of the monarchy in England, for instance, fireworks were sometimes associated with Catholic plotting and religious zeal, provoking a counter-reaction – but English philosophers and natural scientists also debated the significance of fireworks for understanding nature. In Russia fireworks appealed chiefly to the Imperial Court’s thirst for spectacle, which fostered its support for the country’s first generation of Western-style scientists.

“With no scientific tradition in Russia,” Werrett writes, “academicians found that experimental lectures failed to interest the Russian nobility, whose support was critical to the survival of the academy. Simultaneously, academicians learned that the design or ‘invention’ of allegorical fireworks could improve their fortunes as spectacles appealing to the Russian court.” Werrett’s book opens, for instance, with a description of a firework display intended to symbolize the incremental but inexorable growth of the power and prosperity of the Russian state.

In the 1750s, seeking to exploit competition amongst its academicians, the St Petersburg Academy commissioned two of its prominent scientists – Mikhail Lomonosov and Jacob Stählin – to work separately on fireworks displays, with the intention of choosing whichever design was better. Lomonosov was offended when Stählin’s was chosen, and announced that he was giving up firework-making. Fireworks were not only an important activity of the young academy, but also elevated its position and prestige, as well as that of Russian science itself.

The critical point

The lesson I draw from Werrett’s book is that producing fireworks was not a hobby or side occupation that scientists tacked on to their “real” work. Scientists who produced fireworks were simply carrying on the practice of science, not trying to promote themselves or curry favour. A modern-day equivalent would be researchers consulting on governmental projects. Such activity is not only an integral part of the work of science, but it also bolsters the confidence of legislators and the public in science and their awareness of its value.

Science today needs more fireworks.

The post Why fireworks are so important to science appeared first on Physics World.

Clingfish inspires suction cups for underwater robots

By mimicking how a tiny fish clings to rocks and other objects, researchers in California have made suction cups that adhere to rough surfaces in air and underwater. The team also showed how a robotic arm fitted with such a suction device can manipulate delicate objects such as a strawberry and a raw egg. They hope that their design could be used by deep-sea remotely operated vehicles (ROVs) for the recovery of fragile archaeological specimens and brittle marine samples.

Suction cups work well on smooth surfaces such as car windscreens, where the pressure difference they rely on can be maintained for a very long time. Rough surfaces are much more challenging because creating an effective seal is difficult.

Evolution has solved this problem for sea creatures that use natural suction cups to cling onto rugged rocks both above and beneath the waves. Trying to mimic these natural structures has long been an active area of research in soft robotics — however, scientists have only recently begun investigating the passive mechanism by which the northern clingfish avoids being tossed about by intertidal surges.

Michael Tolley’s soft robotics group at the University of California, San Diego, began to look at clingfish adhesion when PhD student and ROV pilot Jessica Sandoval shared her frustrations with gathering objects underwater.

Softness built in

“[ROVs] have rigid manipulators that don’t have much fine tune control,” said Tolley. “It started us thinking, can we do manipulation underwater with some softness built into the system to delicately handle things?”

Tolley struck up a collaboration with Dimitri Deheyn, a marine biologist at the Scripps Institution of Oceanography in nearby La Jolla. With Deheyn’s guidance, Sandoval dissected clingfish specimens from the Scripps’ extensive fish collection, and some fresh specimens collected along the San Diego coastline. They examined the structure of the clingfish’s suction cup using various optical techniques, identifying four core features likely to be involved in the adhesive strategies employed by the clingfish.

The team then studied how these features — slits, a soft sealing layer, microfibrils and the shape of the cup — impacted adhesion. They fabricated 25 mm-diameter silicone suction cups with different combinations of these features and tested them in different scenarios.

Secret to suction

The tests involved applying a relatively small force to attach a cup onto surfaces of varying roughness, both underwater and in the air. Then the force needed to remove the cup was recorded.
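
For context, the force any suction cup can resist is bounded above by the pressure differential acting over its area. The short Python sketch below estimates that limit for a 25 mm cup; the assumption of a full one-atmosphere differential is purely illustrative and is not a figure from the study.

    import math

    def ideal_pull_off_force(diameter_m, delta_p_pa):
        """Upper bound on a suction cup's holding force: pressure differential x cup area."""
        area = math.pi * (diameter_m / 2.0) ** 2
        return delta_p_pa * area

    # Hypothetical numbers: a 25 mm cup and a full 1 atm pressure differential
    force = ideal_pull_off_force(0.025, 101325.0)
    print(f"Ideal pull-off force: {force:.1f} N")  # roughly 50 N; real cups reach only a fraction of this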

“The commercial suction cup always did better on flat surfaces, but with any sort of roughness our prototypes did much better. It’s an exciting start, but we’ve not yet reached the actual performance of the clingfish suction disc,” explains Tolley.

Different combinations of shape and slits performed better in water and in air, but the soft sealing layer proved essential for adhesive performance on rough surfaces both below and above water.

Another feature thought to help clingfish maintain a seal on challenging rough surfaces is the dense bed of microfibrils or “micropapillae” — tiny soft protuberances that line the cup perimeter. The team mimicked these micropapillae by adding silicone micropillars to their cups.

Somewhat surprised

“We were somewhat surprised the microstructures weren’t an improvement … on the soft sealing layer,” said Tolley. “But we looked only at structure and material properties, while the clingfish has other features, like mucus secretion that could affect papillae adhesion.”

The team followed up these investigations by examining the impact of curved surfaces and analysing how slits in the suction cup enabled it to conform to concave surfaces.

The researchers then turned their attention to demonstrating what a suction cup could do by attaching it to the end of a robotic arm and handling a variety of delicate fresh foods, such as tomatoes and strawberries, in air. Tests were also done underwater, where the arm picked up several objects including a crab and a knobbly vase. Finally, with her pilot hat on, Sandoval used an ROV arm with a suction-cup attachment to handle a raw egg without breaking it.

High pressure

Nicola Pugno at Italy’s University of Trento, who was not involved in the study, praised the team’s extensive investigations into suction cup performance in different scenarios. Pugno adds that he is intrigued to see how the suction cup, which relies on establishing a pressure differential for suction, would perform when subjected to the high pressures ROVs experience on the ocean floor.

The team is keen to perform further underwater tests and plans to study live clingfish to find out how suction cups are actively altered according to the environment.

“I see these types of adhesive components as being a very specific piece of the puzzle that fits into a larger soft robotic system,” said Tolley, who hopes to combine adhesion with other work in his team on pneumatics and smart muscles, to create robots with greater utility.

The research is described in Bioinspiration & Biomimetics.

The post Clingfish inspires suction cups for underwater robots appeared first on Physics World.

Gd-loaded nanoparticles plus monochromatic X-rays can destroy tumours

Energy dependence of tumour spheroid destruction.
Tumour spheroids irradiated with 50.0, 50.25 or 50.4 keV X-rays for 20 min. Fluorescence images reveal the energy dependence of spheroid destruction in spheroids containing Gd-loaded nanoparticles. (Courtesy: CC BY 4.0/Sci. Rep. 10.1038/s41598-019-49978-1)

The combination of gadolinium-loaded nanoparticles and monochromatic X-rays completely destroyed tumour spheroids within three days after 20 to 60 minutes of irradiation in a laboratory setting in Japan. The technique, which selectively amplifies the effect of radiation delivered to a tumour site, could eventually pave the way for a new type of cancer radiotherapy, according to researchers from Kyoto University’s Institute for Integrated Cell-Material Sciences (Sci. Rep. 10.1038/s41598-019-49978-1).

The research was conducted at Kyoto University and SPring-8, the largest third-generation synchrotron radiation facility in the world. The facility produces synchrotron radiation that can be formed into narrow, powerful monochromatic X-ray beams. These beams can be tuned precisely to target the K-shell of high-Z atoms such as gadolinium, ejecting inner K-shell electrons (K-edge activation) and triggering a series of events that releases Auger electrons. This approach, called photon activation therapy, has been shown to enhance DNA damage that can kill cells.
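
The energy selection at the heart of the technique comes down to whether a photon can eject a gadolinium K-shell electron, whose tabulated binding energy is about 50.2 keV. A minimal sketch of that comparison for the beam energies used in the study:

    # Illustrative check of which beam energies lie above the Gd K-edge (tabulated at ~50.24 keV)
    GD_K_EDGE_KEV = 50.24

    for energy_kev in (50.0, 50.25, 50.4):
        above = energy_kev >= GD_K_EDGE_KEV
        print(f"{energy_kev:.2f} keV: {'above' if above else 'below'} the K-edge -> "
              f"K-shell photoionization {'possible' if above else 'suppressed'}")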

Fuyuhiko Tamanoi.

Principal investigator Fuyuhiko Tamanoi and colleagues hypothesized that nanoparticles loaded with high-Z atoms located close to the nuclei of cancer cells could improve this photon activation therapy. They selected gadolinium as the high-Z material because it can generate Auger electrons and cause DNA damage. After loading gadolinium into mesoporous silica nanoparticles, they added the nanoparticles into a culture medium of human ovarian cancer cells and confirmed that the particles could enter the cells without causing toxicity.

The researchers next prepared tumour spheroids from ovarian cancer cells that express green fluorescent protein. Fluorescence imaging confirmed that the nanoparticles were uniformly distributed in the spheroids. They then irradiated tumour spheroids containing varying amounts of gadolinium – as well as gadolinium-free controls – with monochromatic X-rays at 50.0, 50.25 and 50.4 keV.

Tumour spheroids containing 50 ng of gadolinium-loaded nanoparticles and irradiated with 50.25 keV X-rays broke up into pieces 72 hours after a 10 minute exposure; after a 60 minute exposure, these spheroids were completely destroyed. Spheroids irradiated with 50.4 keV X-rays showed slightly lower levels of destruction, while 50.0 keV X-rays caused almost no spheroid damage.

Tumour spheroids containing 10 or 20 ng of nanoparticles were only partially destroyed, and there was no damage at all when the nanoparticles did not contain gadolinium.

“Destruction of the tumour spheroids was exposure time dependent and was also dependent on the amount of gadolinium loaded to spheroids,” the researchers write. “The dramatic difference between the effect of 50.25 and 50.0 keV X-rays is consistent with the idea that the Auger electrons are exerting cellular effect.”

As for the slight difference in destruction efficiency between 50.25 and 50.4 keV, noting that similar levels of energy are likely to be absorbed, the researchers speculate that the energy release processes may have differing degrees of efficiency and/or that the energies of electrons released from the inner shell differ. “It is also interesting that tumour spheroids were broken into pieces after irradiation, which may suggest that the treatment has some effect on cell adhesion,” they write.

The researchers are hopeful that a compact X-ray generator capable of producing monochromatic X-ray beams in a clinical treatment facility will be developed for experimental and clinical use. They are now planning studies using animal model systems, Tamanoi tells Physics World. After this research is successfully completed, the nanoparticles will need to be approved for use in human clinical trials.

“My guess is that it will take more than five years to be able to use this technology in a clinic,” Tamanoi says. “But I would like to emphasize that our work opens up a possibility to develop a new type of radiation therapy. This could have a major impact on how radiation therapy is carried out.”

The post Gd-loaded nanoparticles plus monochromatic X-rays can destroy tumours appeared first on Physics World.

Plants receive nitrogen boost in hotter climes

Scientists in the US have shown that plant growth under extreme-warming conditions could be boosted thanks to more nitrogen in the soil. While plant growth is limited by the low level of nitrogen in the soil under modest warming conditions, the study shows that this is not the case at higher temperatures, due to a surge in the activity of soil microbes that boosts the nitrogen supply. The researchers add, however, that this increase could be curtailed by the greater amount of carbon dioxide in the atmosphere.

Previous studies have shown that elevated carbon dioxide can boost plant growth, whilst increased temperature may have the opposite effect. But few studies have looked at the combined effects of increased carbon dioxide and temperature. Current projections suggest that both atmospheric carbon-dioxide levels and average temperature will increase over the coming decades. To understand what kind of impact this will have on ecosystems, Genevieve Noyce from the Smithsonian Environmental Research Center, and colleagues manipulated growing conditions at Kirkpatrick Marsh in Chesapeake Bay — a tidal marsh environment on the east coast of the USA.

Shoots and leaves

Using infra-red heaters, soil-heating pins and carbon-dioxide chambers, the researchers carefully controlled the conditions on several different plots and measured both root and shoot growth of sedge plants over two growing seasons. The ratio of root-to-shoot growth indicates how much nitrogen is available to the plant, as shoots put on more weight when nitrogen is readily available.

Under modest warming conditions — 1.7 °C above present day — they found that root growth outpaced shoots, indicating that plant demand for nitrogen outstripped supply. But under more extreme warming (5.1 °C above present day) shoots outpaced roots indicating that there was surplus nitrogen available in the soil. “Microbes generally become more active under warmer conditions, so as the soil warms up, the rate of microbial mineralization increases, which leads to more plant-available nitrogen being added to the soil,” says Noyce.
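
Turning the root-to-shoot comparison into a number is straightforward. The sketch below, with invented biomass figures purely for illustration, applies the rule of thumb described above: when root growth outpaces shoot growth, nitrogen demand is likely outstripping supply.

    # Toy illustration with made-up growth increments (grams); not data from the study
    def nitrogen_status(root_growth, shoot_growth):
        ratio = root_growth / shoot_growth
        if ratio > 1.0:
            return ratio, "root growth outpaced shoots -> nitrogen demand likely exceeds supply"
        return ratio, "shoot growth outpaced roots -> surplus nitrogen likely available"

    for label, root, shoot in [("+1.7 C plot", 12.0, 9.0), ("+5.1 C plot", 8.0, 11.0)]:
        ratio, verdict = nitrogen_status(root, shoot)
        print(f"{label}: root/shoot = {ratio:.2f} -> {verdict}")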

However, when elevated levels of carbon dioxide were added to the warming treatment, the response changed, with a swing back towards greater root growth. “As the temperature rises, soil nitrogen supply increases, but as carbon dioxide rises, plant demand for nitrogen also increases, so the net result is going to be the balance between the two,” says Noyce, whose findings are published in Proceedings of the National Academy of Sciences. “It is likely that our results apply to other unmanaged ecosystems including grasslands and forests, provided the soils contain adequate soil organic matter to be broken down by microbes to yield plant-available nitrogen.”

This response to climate change demonstrates the complex interaction between plants, which increase growth even at low levels of warming, and soil microbes that don’t increase their activity until it is significantly warmer. In recent decades rising atmospheric carbon dioxide has boosted plant growth and helped trap more carbon in land sinks. Levels of nitrogen in the soil have usually been the limiting factor to growth. But as temperatures continue to rise microbial activity looks set to boost nitrogen supply, such that it is no longer a limiting factor for plant growth.

The post Plants receive nitrogen boost in hotter climes appeared first on Physics World.

Ice-water interface goes viscous

The liquid film that develops as an object glides across ice is as viscous as oil and much thinner than expected, say a team of researchers who have developed a way of probing the ice-water interface much more precisely than was previously possible.

Ice and snow have exceptionally low friction coefficients, making them good for skiing, skating and sledging, but dangerous for drivers on icy winter roads. Although these materials have been studied for more than 150 years, scientists still do not understand why they are so slippery.

Unanswered questions

Some have attributed the low friction coefficients to the formation of a thin layer of liquid water between the ice and the sliding object – caused, paradoxically, by frictional heating slightly melting the ice. However, this hypothesis raises many unanswered questions, says Lydéric Bocquet of the Physics Laboratory at the Ecole Normale Supérieure (ENS) in Paris. Water is a bad lubricant compared to oil, and the thickness and properties of the proposed interfacial water layer have not been measured. Indeed, its very existence has been under debate.

Now, however, a team led by Bocquet and his ENS colleague Alessandro Siria have used a new instrument – dubbed a stroke-probe tribometer – to measure the properties of this interfacial water layer. Their work shows that the liquid film does indeed exist, but that it is just a few hundred nanometres to a micron thick, and its viscoelastic properties resemble those of polymers or polyelectrolytes rather than simple water.

Tuning fork technique

Bocquet, Siria and colleagues studied “interfacial water” using a modified double-mode tuning-fork atomic force microscope (TF-AFM). The instrument they developed comprises a millimetre-sized probe ball glued to a macroscopic tuning fork. Although the fork is very similar to a piano tuning fork, it can be excited at a very low frequency, typically several hundred hertz. The system can be accurately modelled as a stiff mass-spring resonator with a quality factor of around 2500.

Ice-gliding experiment
Schematic of the experimental setup. (Courtesy: L Bocquet)

When the researchers bring the vibrating ball at the end of the fork into contact with the surface of a centimetre-sized block of ice (using a piezo element with an integrated motion sensor of nanometric resolution), the ball strokes laterally across the ice with a fixed amplitude and velocity. The resonant frequency of the system then changes, and so does its quality factor.

The ENS team use the frequency offset to measure the elastic properties of the contact surface, and the change in quality factor to evaluate the dissipation processes occurring there. The two measurements together give the layer’s interfacial viscosity.
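
For readers curious how two such readings become mechanical quantities, the sketch below applies the standard small-perturbation relations for a mass-spring resonator: the interaction stiffness follows from the relative frequency shift, and the added damping from the change in 1/Q. All numbers are placeholders rather than values from the ENS experiment, and converting the damping into a viscosity would additionally require a model of the contact geometry.

    import math

    # Placeholder resonator parameters (not the ENS values)
    k0 = 2.0e4      # resonator stiffness, N/m
    f0 = 500.0      # unperturbed resonance frequency, Hz
    Q0 = 2500.0     # unperturbed quality factor

    # Placeholder readings once the ball is stroking the ice
    f_contact = 500.8     # shifted resonance frequency, Hz
    Q_contact = 1800.0    # degraded quality factor

    # Small-perturbation harmonic-oscillator relations
    k_int = 2.0 * k0 * (f_contact - f0) / f0                    # interfacial stiffness, N/m
    omega0 = 2.0 * math.pi * f0
    gamma_int = (k0 / omega0) * (1.0 / Q_contact - 1.0 / Q0)    # added damping, N*s/m

    print(f"Interaction stiffness ~ {k_int:.0f} N/m")
    print(f"Added damping coefficient ~ {gamma_int:.2e} N*s/m")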

“Listening” to forces

The researchers say the instrument allows them to “listen” to the forces between the probe and the ice with remarkable precision. Indeed, despite being centimetres in size, the instrument’s sensitivity is such that it is possible to probe ice contact and friction properties at the nanometre scale. “The system allows us to access several vibration frequencies, offering us the possibility to simultaneously probe the tribology of the contact (‘how it rubs’) by moving the ball in a lateral direction and its rheology (‘how it flows’) by moving the ball in a perpendicular direction,” Bocquet explains.

The experiments confirm the super-slippery nature of the interfacial ice, but they also – for the first time – confirm that friction generates a film of liquid water when the probe ball is set in motion. This film is, however, much thinner than previous theoretical calculations have suggested, and it is also as viscous as oil, with a viscosity of up to hundreds of mPa·s – two orders of magnitude larger than the viscosity of water. The researchers also showed that the film’s viscosity depends on the shear velocity – a behaviour known as shear thinning.
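
Shear thinning is often described with a power-law model in which the apparent viscosity falls as the shear rate rises. The snippet below illustrates that generic behaviour; the prefactor and exponent are placeholders, not a fit to the ENS data.

    # Generic power-law (Ostwald-de Waele) model: eta = K * shear_rate**(n - 1), with n < 1 for shear thinning
    def apparent_viscosity(shear_rate, K=0.5, n=0.5):
        """Apparent viscosity in Pa*s for consistency K and flow index n."""
        return K * shear_rate ** (n - 1.0)

    for rate in (1.0, 10.0, 100.0, 1000.0):   # shear rates in 1/s, placeholder values
        print(f"shear rate {rate:7.1f} 1/s -> viscosity {apparent_viscosity(rate) * 1e3:6.1f} mPa·s")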

Crushed ice and water state

According to Bocquet, one interpretation for this unexpected behaviour is that surface ice does not completely transform into liquid water when an object glides across it. Instead, it may enter a mixed “granité-like” (crushed ice and water) state. This mixed film, he suggests, lubricates the contact between the solid ice and the ball and prevents any direct contact between the two surfaces.

Separate experiments by the ENS team show that making the probe hydrophobic reduces friction even further by modifying the interfacial viscosity. This “waxing” process is practised empirically by skiers, but the reason why it made skis glide better was not previously understood.

Towards a new theory for interfacial ice

The team’s result means that existing theoretical descriptions for interfacial ice need an overhaul, Bocquet tells Physics World. A new theory would provide a better understanding of sliding on ice, which would come in useful in developing winter sports equipment or self-healing, ultra-low-friction lubricants for industrial applications. It might also help, conversely, to find ways of increasing friction, which is essential to avoid slipping on icy roads.

Angelos Michaelides of University College London, UK, who was not involved in the research, says that the ENS study is very exciting. “I am not aware of such a nice and elegant set of measurements on the friction of the quasi-liquid layer and think it is an extremely interesting new perspective on this age-old story,” he comments.

The research is described in Physical Review X.

The post Ice-water interface goes viscous appeared first on Physics World.

Voyager 2 spacecraft goes interstellar as it leaves the solar bubble

The spacecraft Voyager 2 left the heliosphere and travelled into interstellar space over the course of a day in November 2018, according to a suite of papers published today by scientists working on the mission.

The spacecraft was launched in 1977 along with its twin Voyager 1, which crossed over into interstellar space seven years ago. Scientists analysing data from Voyager 2 have found both similarities and differences compared with the crossing of Voyager 1.

The Sun is surrounded by a huge bubble called the heliosphere that is inflated by the supersonic solar wind of charged particles emitted by the Sun. The edge of this bubble is called the heliopause, which is where the outgoing solar wind is halted by the interstellar wind of charged particles.

Different crossings

Both Voyager missions crossed the heliopause on the windward side of the bubble but at different locations. Voyager 1 left the northern hemisphere of the heliosphere and Voyager 2 left the southern hemisphere, at locations separated by about 160 au (1 au is the distance from Earth to the Sun).

Voyager 1’s departure point was about 122 au from the Sun, while Voyager 2 exited at 119 au from the Sun. According to scientists working on the Voyager 2 mission, these slightly different distances could be a result of the exit events occurring at different times in the 11-year solar cycle. This cycle involves changes in the intensity of the solar wind that could make the size of the heliosphere fluctuate.

One big difference between the two spacecraft is that all five instruments onboard Voyager 2 are still functioning, whereas the plasma instrument that measures the solar (and then interstellar) wind was damaged on Voyager 1 in 1980. This meant that Voyager 1 was unable to measure the transition from the hot, low-density solar wind to the cold, high-density interstellar wind.

Thinner and smoother

Analysis of the Voyager 2 data suggests that the heliopause it encountered was thinner and smoother than the boundary crossed by Voyager 1. Indeed, Voyager 2 made the crossing in less than one day. The Voyager 2 data also suggest that the interstellar medium that the spacecraft first encountered is hotter than had been expected.

Voyager 2 also discovered a region between the heliopause and interstellar space where the solar and interstellar winds interact. This layer was not detected by Voyager 1.

Both spacecraft found little change in the direction and magnitude of magnetic fields across the heliopause. This is surprising because scientists had expected an abrupt transition between solar and interstellar magnetic fields to occur at the interface.

Gaining a better picture of the heliopause and heliosphere could provide important clues about how life emerged on Earth – and how it could emerge on exoplanets orbiting distant stars that would also be surrounded by bubbles. That is because the heliosphere shields Earth from many cosmic rays impinging on it – radiation that is harmful to life.

The Voyager 2 papers appear in Nature Astronomy.

The post Voyager 2 spacecraft goes interstellar as it leaves the solar bubble appeared first on Physics World.

VFX in movies: from weightlessness to curly hair

Still from Gravity
(Courtesy: © Warner Bros. Image supplied by Framestore)

It’s almost ironic that, decades after failing to attend many of his own undergraduate physics lectures, Tim Webber found himself teaching his colleagues the physics they needed to do their job. As chief creative officer at London-based Framestore – one of the world’s leading visual effects (VFX) studios – he’d worked on blockbusters such as Harry Potter and the Goblet of Fire (2005) and The Dark Knight (2008). But it was his Oscar-winning work leading the visual effects on the Alfonso Cuarón movie Gravity (2013) that forced him to share his physics insights.

Gravity featured Sandra Bullock and George Clooney as space-shuttle astronauts fighting for their lives in a zero-gravity environment after their craft gets hit by space debris. According to Webber, the problem was that animators spend years developing the skill of creating virtual beings that don’t just look good, but also move in a way that suggests they have weight. “Suddenly,” he says, “they had to animate things that didn’t have weight, but still had mass.” It was a concept that Webber’s team struggled to get their heads around. “So I got them into a room and gave them physics lectures.”

His tutorials paid off. Webber – plus his colleagues Chris Lawrence, Dave Shirk and Neil Corbould – won the 2014 Academy Award for Best Visual Effects for their work on the movie. But then Webber has always had a creative bent. As an undergraduate at the University of Oxford in the early 1980s, he’d spend more time in arts studios than physics labs. Indeed, he’s one of many similarly inclined people who use their training in physics, engineering and maths in the VFX industry. And no wonder. When it comes to recreating a believable world on screen, physics is everything. “Maths and physics feature very heavily,” says Webber’s Framestore colleague Eugénie von Tunzelmann.

Before working at Framestore, von Tunzelmann – an engineer and computer scientist by training – was a visual-effects designer at another London VFX firm called Double Negative. While there, she worked on Christopher Nolan’s epic sci-fi movie Interstellar (2014) and ended up co-authoring a scientific research paper about Gargantua – the black hole that’s the focus of the film (Class. Quant. Grav. 32 065001). She wrote the paper with Paul Franklin, who had co-founded Double Negative, and another colleague from the firm, Oliver James. The trio had collaborated with Caltech physicist Kip Thorne (the fourth author on the paper) to create as realistic a simulation of a supermassive black hole as possible. The simulation won plaudits from physicists and Hollywood critics alike – and led to Franklin sharing the VFX Oscar in 2015.

From humble beginnings

Things have certainly come a long way since the iconic – but less-than-realistic – VFX of King Kong (1933) or Jason and the Argonauts (1963). The transition to computer-generated imagery (CGI) in films such as TRON (1982) and Jurassic Park (1993) was a game-changer, but there was still plenty that was unrealistic about the way light behaved, or creatures moved. This, though, is an industry that never stands still. “New techniques are constantly being developed,” says Sheila Wickens, who originally studied computer visualization and is now a VFX supervisor at Double Negative. “It is very much a continually evolving industry.”

These days, the industry has embraced what is known as “physically based rendering” whereby physics is “hard-wired” into the CGI. Industry-standard software now includes physics-based phenomena, such as accurately computed paths for rays of light. “The complex maths used in ray-tracing is in part based on maths developed for nuclear physics decades ago,” says Mike Seymour, a VFX producer and researcher at the University of Sydney whose background is in mathematics and computer science.

Other phenomena captured by today’s CGI include life-like specular reflection, which means that materials such as cotton and cardboard – which in the past did not reflect light in CGI scenes – are now modelled more realistically. A similar thing has happened with the inclusion of Fresnel reflection so that image-makers can account for the fact that the amount and wavelengths of light reflected depend on the angle at which the light hits a surface. Indeed, it’s no longer acceptable to make things up, or break the laws of physics, says Andrew Whitehurst, VFX supervisor at Double Negative, who won an Oscar in 2016 for his work on Alex Garland’s artificial-intelligence-focused thriller Ex Machina.

“When I began in the industry a little over 20 years ago, we cheated at almost everything,” Whitehurst admits. “Now, surfaces are more accurately simulated, with reflectometer research being implemented into code that describes the behaviour of a variety of materials. Our metals now behave like metals and, by default, obey the laws of energy conservation. Fire and water are generally simulated using implementations of the Navier–Stokes equations: the tools we use in VFX are not dissimilar to those used by researchers needing to compute fluid simulations.”
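
The Fresnel behaviour mentioned above is a good example of physics baked directly into rendering code. A widely used shortcut is Schlick’s approximation, sketched below for an air–glass-like interface; the refractive indices and angles are illustrative and not taken from any studio’s pipeline.

    import math

    def schlick_fresnel(cos_theta, n1=1.0, n2=1.5):
        """Schlick's approximation to the Fresnel reflectance at an interface between
        refractive indices n1 and n2, given the cosine of the angle of incidence."""
        r0 = ((n1 - n2) / (n1 + n2)) ** 2            # reflectance at normal incidence
        return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

    for angle_deg in (0, 30, 60, 80, 89):
        r = schlick_fresnel(math.cos(math.radians(angle_deg)))
        print(f"incidence {angle_deg:2d} deg -> reflectance {r:.3f}")   # climbs towards 1 at grazing angles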

In fact, Whitehurst says, it’s hard to see how many things can be made any more realistic. “We can blow anything up we want, we can make anything fall down that we want, we can flood anything we want, and we can make things as hairy as we would like.”

Much of this accuracy comes out of deliberate research programmes, either in academia or within the studios themselves. The fact that filmmakers can now accurately model curly hair, for example, owes a debt to researchers at the renowned US animation studio Pixar, who, in the early 2010s, developed a physics-based model for the way it moves. Their model is described in a Pixar technical memo (12-03a) entitled “Artistic simulation of curly hair” by a team led by Hayley Iben, a software engineer who originally did a PhD at the University of California, Berkeley, on modelling how cracks grow in mud, glass and ceramics.

Still from Brave
Character creation: Physics-based models have helped to make the curly hair of characters like Merida in Brave hugely realistic and also led to supremely life-like animals, such as in the 2019 remake of The Lion King. (Courtesy: ©Walt Disney Co. Courtesy Everett Collection/Alamy Stock Photo)
Still from The Lion King 2019
(Courtesy: Disney Enterprises, Inc./The Hollywood Archive/PictureLux/Alamy Stock Photo)

Modelling the movement of curly hair for animations, it turns out, is best done by representing hair as a system of masses on springs. The technique Iben and her team developed was used to great effect in the animation of the curly-haired hero Merida of Brave (2012), and later in films such as Finding Dory (2016) and The Incredibles 2 (2018). Admittedly, it’s more accurate to model hair as infinitesimally thin elastic rods, but this, the Pixar group says, straightens out the hair too much when in motion. Increase the stiffness to avoid this, and the hair takes on an unrealistic wiriness as the character moves their head.
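
The masses-on-springs idea is easy to sketch in code. The toy strand below – a damped spring chain pinned at one end and advanced with a semi-implicit Euler step – is only meant to show the structure of such a model; it is not Pixar’s production simulator, and every parameter value is invented.

    import numpy as np

    # Toy hair strand: point masses joined by springs, hanging under gravity
    n, rest_len = 10, 0.02                # number of masses, rest length between them (m)
    mass, k, damping = 0.001, 50.0, 0.02  # kg, N/m, N*s/m (all made-up values)
    g = np.array([0.0, -9.81])
    pos = np.array([[i * rest_len, 0.0] for i in range(n)])   # start stretched out horizontally
    vel = np.zeros_like(pos)
    dt = 1e-4

    for _ in range(20000):                # simulate two seconds
        force = np.tile(mass * g, (n, 1))
        for i in range(n - 1):            # spring between mass i and mass i+1
            d = pos[i + 1] - pos[i]
            length = np.linalg.norm(d)
            f = k * (length - rest_len) * d / length
            force[i] += f
            force[i + 1] -= f
        force -= damping * vel            # crude velocity damping
        vel += dt * force / mass
        pos += dt * vel
        vel[0] = 0.0
        pos[0] = [0.0, 0.0]               # pin the root of the strand
    print("strand tip position:", pos[-1])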

Such compromises are important. After all, it’s the movie director who gets the final say in whether a visual effect works. “Being able to make something 100% real is actually just a stepping stone to making something cinematic,” says Seymour at the University of Sydney. “It is often critical to be able to create something real, and then depart from it in a believable way.”

Sometimes the departure doesn’t even have to be believable. Back at Framestore in London, von Tunzelmann recalls being asked to create fantasy fire where the flames curled in spirals. “There was no software that can do that, so we wrote a new fluids solver that measured the curl of the field and exaggerated it,” she explains. Everyone in the VFX industry, it seems, wants someone with a background in physics or engineering on their side (see box below).
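
For a flavour of what “measuring the curl of the field and exaggerating it” can mean in practice, the sketch below applies the classic vorticity-confinement trick from the computer-graphics literature to a 2D velocity field. It is a generic illustration rather than Framestore’s solver, and the grid, gain and example field are made up.

    import numpy as np

    def vorticity_confinement(vx, vy, gain=0.5, dx=1.0):
        """Nudge a 2D velocity field towards its own swirls: compute the vorticity,
        then push velocity along the direction of increasing vorticity magnitude."""
        w = np.gradient(vy, dx, axis=1) - np.gradient(vx, dx, axis=0)   # z-component of the curl
        mag = np.abs(w)
        gx = np.gradient(mag, dx, axis=1)
        gy = np.gradient(mag, dx, axis=0)
        norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-12
        nx, ny = gx / norm, gy / norm                                   # unit vector towards strong vorticity
        # Confinement nudge, proportional to N x omega
        return vx + gain * dx * ny * w, vy - gain * dx * nx * w

    # Example: amplify the swirls in a random velocity field
    rng = np.random.default_rng(0)
    vx, vy = rng.standard_normal((2, 64, 64))
    vx_boosted, vy_boosted = vorticity_confinement(vx, vy)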

The same conflict between aesthetics and reality occurred in Interstellar. Whitehurst, who helped develop some of the VFX techniques used in the film, suggests the movie was both realistic and not. The team had to rewrite the ray-tracing software to account for the intense curvature of space around a black hole – and also had to dial down Gargantua’s brightness for the audience. “You can see exciting detail in it, and you probably wouldn’t be able to [in reality] because it would be so staggeringly bright,” Whitehurst points out. “Even in things where we are trying to play by the rules, we are going to bend them here and there.”

That human factor

For all the progress in making movie animations look as realistic as possible, one challenge still looms large: how best to represent human beings. Getting human features to look right for movie-goers is about more than just simple physics. We can make a human face that is photographically perfect – the issues of light transport through skin and modelling how wrinkles work have largely been solved. The difficulty is in the subtleties of how a face changes from moment to moment. The problem, Framestore’s Webber reckons, is that evolution has trained the human eye to analyse faces and work out if someone is lying or telling the truth, is healthy or unhealthy.

“Being able to recreate faces and fool the human eye is exceptionally tricky,” he says. And the truth is that we don’t even know what it is we see in a face that tells us something is awry: it’s a subconscious reaction. That means VFX designers don’t know how to fix a wrong-looking face – and just can’t generate one from scratch, let alone know how they might recreate a particular emotion that you want that character’s face to portray at that moment. “The last thing you want is a character saying ‘I love you’ when the eyes are saying the opposite,” Webber says.

And if you want to make life for a visual-effects designer even harder, try asking them to put a human face underwater, Seymour suggests. “The skin and mass of the face moves with gravity mitigated by buoyancy,” he says. “If they then quickly move a limb underwater near their face, the current produced by the simulated flesh of their hand needs to inform a water simulation that will affect their hair, their face-flesh simulation and any tiny bubbles of air in the water. These multiple things all interact and have to be simulated together.”

For now, animators compromise by combining CGI with motion capture, whereby an actor does their performance with dozens of dots glued to their face so that the image-processing software can track all the muscle movements from the recorded scene. VFX designers then use this information to create a virtual character who might need larger-than-life qualities (quite literally, in the case of the Incredible Hulk). Finally, they overlay some of the original footage to re-introduce facial movements. “This brings back subtleties that you just can’t animate by hand,” Webber says.

Still from The Dark Knight
Visual challenge: Animating humans is really tricky to do well – even when starting from live-action footage, such as in The Dark Knight (above) – and especially if, like in Harry Potter and the Goblet of Fire, they’re underwater (below). (Courtesy: © Warner Bros. Image supplied by Framestore)
Still from Harry Potter and the Goblet of Fire
(Courtesy: © Warner Bros. Image supplied by Framestore)

It turns out that our eye is more forgiving when it comes to CGI representations of animals. That’s why we have seen a slew of movies led by computer-generated “live-action” animals, from Paddington (2014) to the recent remakes of Disney’s The Jungle Book (2016) and The Lion King (2019). Framestore has recently been working on Disney’s upcoming remake of Lady and the Tramp. Due out later this month, it mixes footage of real and CGI dogs – and the VFX are so realistic that many in-house animators can’t tell which is which, according to Webber. “People in the company have asked why a particular shot is in our showreel when it’s a real dog, and they have to be told – and convinced – that it isn’t!”

Some things in movies, however, will never be truly realistic. Directors in particular want their monsters to move quickly because that’s more exciting. However, the laws of physics dictate that massive creatures move slowly – think how lumbering an elephant is compared to a horse – and our subconscious knows it. So when we watch a giant monster scurry across the screen, it can feel wrong – as if the creature has no mass.

“If Godzilla or a Transformer were actually to try to move at the speed they do in the movies, they would likely tear themselves apart, as F = ma last time I checked,” Whitehurst says. “This is a fight that I always have, and that everyone always has. But ultimately a director wants something exciting, and a Pacific Rim robot moving in something that looks like ultra-ultra-slow motion doesn’t cut it.” It’s a point echoed by Sheila Wickens, who studied computer visualization and animation at Bournemouth University and is now VFX supervisor on the BBC’s flagship Doctor Who series. “We usually start out trying to make something scientifically correct – and then we end up with whatever looks good in the shot,” she says.

That fight between directors wanting visual excitement and animators wanting visual accuracy is what made working on Gravity so special for Webber. He says the film was the highlight of his career to date – but also “by far the most challenging movie” he’s worked on. “All we were filming was the actors’ faces, everything else was made within the computer,” he says. The team had to write computer simulations of what would happen in microgravity when one character is towing another on a jet pack, and the result became a plot point. “We found that they bounced around in a very chaotic and uncontrollable way,” Webber says. “It’s literally down to F = ma, but Alfonso, who was working with us, really loved it and folded that into the script.”

That was quite a moment, Webber says. Suddenly, all his physics lectures – given and received – had been worth it.

Paths to success in the visual-effects industry

On the set of Gravity
Physics in action: Tim Webber of Framestore (far right) on the set of Gravity. (Courtesy: Framestore)

If you’re a physicist who wants to work in the visual-effects (VFX) industry, what opportunities are available and what skills do you need – beyond a willingness to lecture your colleagues about the finer points of F = ma?

Yen Yau, a Birmingham-based project manager who trains newcomers in the world of film

Having worked on careers publications for ScreenSkills – the industry-led skills body for the UK’s screen-based creative industries – Yau says there is huge diversity in the paths people can take. “Certainly, physics is going to be more important in some roles, but there are numerous routes in for all types of backgrounds and experiences of applicants.”

Eugénie von Tunzelmann, a visual-effects designer at London VFX firm Framestore

While most workers in the VFX industry don’t have a background in science, technology, engineering and mathematics (STEM) subjects, she says there is a need for people with skills in those areas. Anyone working in a job that involves programming a software plugin that, say, defines how light bounces off a surface will almost inevitably have a background in physics or optics. If you were trying to model fluid flow, “you’d need to have an understanding of thermodynamics”, she says.

Andrew Whitehurst, Oscar-winning VFX designer

Describing himself as “an artist with an interest in physics and engineering”, Whitehurst says that many people in the VFX industry are physicists or engineers with a creative itch that won’t go away. “I work with people with physics doctorates or engineering doctorates and art-school dropouts but we all meet in the middle. I have no formal background in physics, but I have a reasonable passing knowledge of a lot of physics and engineering principles. I need to know why camera lenses do what they do, for example, so that we can mimic their behaviour.” But scientists working in VFX will have to learn how to compromise, he notes. “We use a lot of science and engineering, but we are not in the business of scientific visualization. I am an enormous respecter of science, but if I can make a more beautiful picture that tells the story better, I’m going to do it.”

Tim Webber, physicist who is now chief creative officer at Framestore in London

Even if you don’t need to understand the maths hidden in the software that the VFX industry use, Webber feels it helps to understand the principles of what the equations are doing, pointing to his experience early in his career working on a 1996 Channel 4 TV mini-series dramatizing Gulliver’s Travels and starring Ted Danson. “There are lots of small people and big people and we had to work out the angles, where to put the camera, so that the perspective would match what it would match in the other scale. I was using bits of paper and rulers and protractors and calculators.”
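
The bookkeeping Webber describes amounts to matching angular sizes: an object reads as the same height on camera as long as its real height divided by its distance from the lens is unchanged. A minimal sketch with invented numbers (not the production’s values):

    # Forced perspective: make a 1.8 m actor read as a 15 cm character standing 1 m away
    actor_height = 1.8        # m, the real actor
    target_height = 0.15      # m, how tall the character should appear to be
    target_distance = 1.0     # m, where the miniature character "stands" in the scene

    # Keep height/distance (the angular size) the same for the real actor and the target
    scale = actor_height / target_height
    actor_distance = target_distance * scale
    print(f"Angular sizes match when the actor stands {actor_distance:.0f} m from the camera "
          f"(a scale factor of {scale:.0f}x).")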

The post VFX in movies: from weightlessness to curly hair appeared first on Physics World.

UK research network to advance radiotherapy developments

A £56 million research network announced today by Cancer Research UK will transform the UK into a global hub for radiotherapy research. The network – Cancer Research UK RadNet – will accelerate the development of advanced radiotherapy techniques, including FLASH radiotherapy, MR-Linac treatments, proton therapy, stereotactic radiotherapy and artificial intelligence.

RadNet will unite seven centres of excellence across the country. The University of Manchester, the University of Cambridge and the CRUK City of London Centre (a partnership between UCL, Queen Mary University of London, King’s College London and the Francis Crick Institute) will receive funding for infrastructure and research programmes, including the formation of new research groups. The Universities of Glasgow, Leeds and Oxford and the Institute of Cancer Research, London/Royal Marsden will receive funding for infrastructure.

“Radiotherapy is a cornerstone of cancer medicine, with around three in 10 patients receiving it as part of their primary treatment,” says Michelle Mitchell, chief executive of Cancer Research UK. “The launch of our network marks a new era of radiotherapy research in the UK. Scientists will combine advances in our understanding of cancer biology with cutting-edge technology to make this treatment more precise and effective than ever before”.

The Cancer Research UK RadNet aims to improve cancer survival by optimizing and personalizing radiotherapy. The centres will develop new techniques for delivering radiotherapy and investigate new radiotherapy–drug combinations, with a focus on reducing long-term side effects and improving patients’ quality-of-life. Projects will include innovative research into:

  • FLASH radiotherapy, in which pulses of high-dose radiation are delivered in a fraction of a second. Early research suggests that FLASH has the potential to cause less damage to healthy tissue near the tumour than traditional radiotherapy.
  • Proton therapy. The Christie NHS Foundation Trust in Manchester is the first NHS hospital to provide high-energy proton therapy; the second centre will open at University College London Hospitals NHS Foundation Trust next year. RadNet will support researchers across the country to optimize this new technology.
  • Overcoming hypoxia. Hypoxic tumours are far less susceptible to radiotherapy. Scientists will develop better ways to identify hypoxic tumours and new treatments to oxygenate them, making radiotherapy much more effective.
  • Cancer recurrence. Researchers will investigate why some cancers come back after radiotherapy by studying the role of cancer stem cells. These cells are remarkably resistant to radiation, and just a few remaining after treatment can cause a recurrence. For some patients, targeting stem cells could be the key to unlocking radiotherapy’s full potential.
  • Drug development. Scientists will develop and test drugs, including immunotherapies, for use in combination with radiotherapy. They will also study how tumours can repair DNA damage caused by radiotherapy and use the latest gene-editing technology to develop drugs that interfere with this process.
  • Artificial intelligence. RadNet researchers will use AI to design personalized treatment plans based on data from patients’ scans. This could improve radiotherapy accuracy and provide treatment options for patients whose tumours were previously too risky to target with radiation.

“I’ve seen first-hand how successful radiotherapy can be for patients that I treat, but it’s been frustrating to see the UK lagging behind other countries when it comes to prioritizing research into this vital treatment,” says Adrian Crellin, Cancer Research UK trustee and former vice-president of the Royal College of Radiologists. “Cancer Research UK’s investment will overhaul radiotherapy research in the UK to bring the next generation of treatments to patients sooner.”

The post UK research network to advance radiotherapy developments appeared first on Physics World.

ASTRO showcase: RaySearch highlights machine learning innovations

In this short video, filmed at ASTRO 2019, Frederik Löfman of RaySearch Laboratories explains how machine learning can improve consistency and efficiency in clinical practice.

The post ASTRO showcase: RaySearch highlights machine learning innovations appeared first on Physics World.

Magic-angle graphene reveals a host of new states

Last year, researchers at MIT led by Pablo Jarillo-Herrero observed superconductivity in a pair of graphene layers engineered to be slightly misaligned. Now, a team at ICFO in Barcelona, Spain, says it has seen a host of additional correlated states in the same “magic angle” system, providing a much more detailed view of how twisted bilayer graphene behaves and opening up new ways of studying strongly correlated physics.

According to Dmitri Efetov, the study’s lead author, magic-angle twisted bilayer graphene represents a simple system in which to investigate novel phenomena that arise due to interactions between electrons in a material. The electron density in this platform can be tuned by applying an electric field, which allows the strength of the electron-electron interactions to be varied. It also allows the material to be tuned between different phases – for example, between the superconductor and the correlated state. Being able to do this could shed light on the underlying mechanisms at play in superconductors – especially high-temperature ones based on cuprates, for which a fundamental understanding is still lacking.

To create their testbed, Efetov and colleagues followed the “tear and stack” method previously developed by Emanuel Tutuc and colleagues at the University of Texas. The researchers stacked two sheets of atom-thick carbon (graphene) on top of each other with a small angular misalignment. When the misalignment reached an angle of exactly 1.1° — the theoretically predicted “magic angle” — the MIT researchers found that the material became a superconductor (that is, able to conduct electricity without resistance) at 1.7 K. The effect disappears at slightly larger or smaller twisting angles.

Fundamentally new approach to device engineering

The MIT result kick-started a flurry of activity in “twistronics”. In this fundamentally new approach to device engineering, the weak coupling between different layers of 2D materials (such as graphene) can be used to manipulate the materials’ electronic properties in ways that are impossible with more conventional structures, simply by varying the angle between the layers.

Xiaobo Lu and Dmitri Efetov
Xiaobo Lu (left) and Dmitri Efetov (right) manipulating the experimental setup. (Courtesy: ICFO)

The crystal structure of a single layer of graphene can be described as a simple repeating arrangement of carbon atoms; the repeating motif is known as its unit cell. When two graphene layers are stacked on top of each other at a slight angle, they form a moiré pattern or “superlattice” in which the unit cell expands to a huge extent, as if the 2D crystal was artificially stretched 100 times in all directions. This stretching dramatically changes the material’s interactions and properties, and simply varying the angle between 2D material layers changes its electronic band structure. At small twists, the moiré graphene superlattices can even be switched from fully insulating to superconducting, as Jarillo-Herrero’s team discovered.
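
For a sense of the numbers, the period of the moiré superlattice formed by two identical lattices twisted through a small angle θ is given by the standard estimate λ = a/(2 sin(θ/2)), where a is the lattice constant. A quick back-of-the-envelope sketch using the textbook graphene lattice constant:

    import math

    a = 0.246e-9                 # graphene lattice constant, m
    theta = math.radians(1.1)    # the "magic" twist angle

    # Standard moire-period estimate for two identical twisted lattices
    moire_period = a / (2.0 * math.sin(theta / 2.0))

    # Rough atom count in one moire cell: area ratio x 2 atoms per graphene cell x 2 layers
    atoms = (moire_period / a) ** 2 * 2 * 2
    print(f"moire period ~ {moire_period * 1e9:.1f} nm, "
          f"containing roughly {atoms:.0f} carbon atoms per cell")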

Improving material homogeneity

In the new work, Xiaobo Lu, a postdoctoral researcher in Efetov’s group, improved the structural homogeneity of the twisted bilayer graphene by mechanically cleaning it to remove trapped impurities and release local strain between the layers.

When he subsequently changed the charge carrier density within a device made from the material by applying a varying voltage to it, he observed that the device could be tuned from behaving as a Chern insulator (a state where the material’s electron bands are all either filled or all empty, and for which the filled bands have a net total Berry curvature or Chern number) to a superconductor. It could also be made to form an exotic magnetic state in which magnetism arises because of orbital motion of the electron rather than (as in typical ferromagnets) the electron spin. Such a state, Lu says, has never been seen before.

Competition between many novel states

Lu explains that magic-angle bilayer graphene seems to be competing between many novel states. “By tuning the carrier density within the lowest two flat moiré bands, it alternately shows correlated states and superconductivity, together with exotic magnetism and band topology.”

The researchers say that the different states they observed are very sensitive to the quality of device. However, they do not fully understand why the material behaves this way. “For the time being, we only know that all the correlated states come from the electron-electron interaction,” Lu says. “Their ground states and the interaction mechanisms between these quantum phases remains a mystery for now.”

Another “astounding” finding, according to Lu, is that the device enters a superconducting state at the lowest carrier densities ever reported for any superconductor. This result may have implications for applications such as quantum sensing, since it makes the material more sensitive to most kinds of radiation. The team have already tried to integrate magic-angle bilayer graphene into single-photon detectors to make devices that might be employed in quantum imaging, bio-photonics and encryption systems, to name just three examples.

They were also able to increase the superconducting transition temperature of the material to above 3 K, a value twice that previously reported for magic-angle graphene devices.

Emergent quantum effects

“I think this a very interesting experiment,” comments Jarillo-Herrero, who was not involved in the ICFO team’s work. “The authors have found an interesting set of correlated insulator and superconducting states, some of which had not been seen before. This shows that the phenomenology of magic-angle graphene devices is even richer than previously thought.”

“While the origin of the new states and the differences with the results obtained by other groups remains to be understood, I believe this work will generate great interest and more experimental and theoretical work on this very exciting subject.”

Efetov’s team includes scientists from the University of Texas at Austin, the National Institute for Materials Science in Tsukuba and the Chinese Academy of Sciences in Beijing, and Efetov says they will now be focusing on investigating the superconducting mechanism in twisted bilayer graphene. “We will also be developing entirely new experimental techniques to study these emergent quantum effects in twisted low dimensional quantum materials, including graphene,” he adds.

The research is detailed in Nature.

The post Magic-angle graphene reveals a host of new states appeared first on Physics World.
