Science Journalism

The big game

The problem
MODERN SPORTING EVENTS have grown into megaprojects. Tournaments like the FIFA World Cup or the Universiade are huge investments that host international teams, are watched worldwide, and require vast administrative machinery.
With such huge costs, and equally huge potential economic benefits, the organization of such games is taken very seriously. Planning is already well underway for the 2015 Pan American Games to be hosted by Toronto.
However, with so many people involved and with so much at stake, creating an efficient framework for communication amongst the network of coordinating bodies can be a daunting task.

The researcher
Milena Parent is an expert in sports administration at the University of Ottawa’s School of Human Kinetics. She specializes in strategic management and organization theory for large-scale sporting events.

The project
By chronicling and understanding the coordination network that existed for organizing the 2010 Vancouver Olympic Games, Parent can develop broad network theories for the management of large-scale sporting events that can then be used by future organizers.
The city of Vancouver began planning for the 2010 Olympic Games nine years before the opening ceremonies. A total of 97 separate federal, provincial, and municipal departments were involved in the planning, and those were just the governmental bodies.
The coordination network of stakeholders included sponsors, organizing committees, community groups, governmental departments, the media, and delegations of athletes. Each stakeholder had its own interests, and each was needed for the sporting event to be a success.

The key
Traditional theory presents the organizational network as a wheel, with the organizing committee as the hub and the stakeholders as spokes, but Parent found a strikingly different picture. She discovered that control of the planning process lay with the local communities, the “people on the ground,” who played a far more pivotal role than officials assumed.
In practice, there wasn’t one centralized hub, but rather groups that formed multiple hubs of organization. None of the hubs was well connected to the entire coordination network; instead, each had strong ties to a handful of stakeholders. Stakeholders formed strong local contacts with each other, but these local networks were relatively independent, with only weak links between them. According to Parent, organizers who bridged two or more of these local networks held some of the strongest positions in the planning process, since they acted as the main lines of communication between the fractured groups.
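Parent’s bridging organizers echo a textbook idea from network science: nodes that connect otherwise separate clusters score highest on “betweenness centrality,” a measure of how often a node sits on the shortest paths between all the others. Here is a minimal sketch in Python using the networkx library; the stakeholder names and links are invented for illustration, not drawn from Parent’s data.

```python
# A toy coordination network: two tightly knit local clusters
# joined only through a single bridging organizer.
# All names and links are hypothetical, not Parent's data.
import networkx as nx

G = nx.Graph()
# Local cluster 1: government stakeholders, densely connected.
G.add_edges_from([("city", "transit"), ("city", "police"),
                  ("transit", "police")])
# Local cluster 2: sport stakeholders, densely connected.
G.add_edges_from([("athletes", "sponsors"), ("athletes", "media"),
                  ("sponsors", "media")])
# One organizer bridges the two clusters with weak links.
G.add_edges_from([("organizer", "city"), ("organizer", "athletes")])

# Betweenness centrality counts how often a node lies on shortest
# paths between other pairs; bridging nodes dominate the ranking.
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{node:10s} {score:.2f}")
# "organizer" tops the list: all communication between the two
# local networks must pass through it.
```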

Tsunami simulations

The problem
IN THE YEAR 1700, a megathrust earthquake (that’s science talk for scary-big-earthquake) occurred along the Cascadia fault in the Pacific Ocean. The fault runs along the coast from Vancouver Island down to northern California. The earthquake triggered a tsunami that flooded inland as far as the mouth of the Fraser River and travelled all the way across the Pacific Ocean to strike the coast of Japan.

The researcher
Engineering professor Ioan Nistor is fascinated by such tsunamis. After the 2004 Southeast Asia earthquake, he and his collaborators were the first research team in the tsunami-affected areas of Thailand and Indonesia. While in the field, they inspected damage to buildings and structures. Back in his University of Ottawa lab, Nistor measures the force of surge impacts on models. He then compares what he saw in the field to laboratory and numerical models in order to gain a better understanding of the effects of tsunami bores—the fast-moving walls of water that occur once tsunamis break near shore.

The project
Nistor ruminates over the various scenarios that could result from a modern-day earthquake along the Cascadia fault. By simulating earthquakes at various points along the fault and the propagation of the waves towards shore, he is able to predict the resulting tsunami’s height and speed as it crashes inland. Nistor uses these values to estimate the strength of disastrous forces to which coastal buildings would be subject.

The key
Even though the major Canadian cities on the West Coast are located on inland waterways, simulations show that they would not be spared from devastation in the event of another Cascadia tsunami. Although a tsunami slows as it approaches through shallow waters, Nistor still expects surges up to 25 metres high.
Current building codes in Canada do not explicitly provide special design guidelines for structures located on tsunami-vulnerable shores. They do not account for the initial surge forces, the sweeping drag force, the increase in hydrostatic pressure, or the buoyancy force that can float a building off its foundation. By properly quantifying these extreme loads on structures during inundation, new design guidelines can be recommended for structures in tsunami-prone zones, guidelines that could save countless lives in the event of a disaster.
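The loads listed above all have standard textbook formulas, which gives a feel for the kind of numbers involved. A rough sketch in Python; the bore depth, speed, wall size, and drag coefficient are illustrative assumptions, not values from Nistor’s simulations.

```python
# Back-of-the-envelope tsunami loads on a building wall.
# All inputs below are illustrative assumptions, not Nistor's data.
RHO = 1025.0  # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def drag_force(velocity, area, cd=2.0):
    """Sweeping drag on a flat wall: F = 0.5 * rho * Cd * A * v^2."""
    return 0.5 * RHO * cd * area * velocity ** 2

def hydrostatic_force(depth, width):
    """Resultant hydrostatic force on a wall: F = 0.5 * rho * g * h^2 * w."""
    return 0.5 * RHO * G * depth ** 2 * width

def buoyancy_force(submerged_volume):
    """Uplift that can float a building off its foundation: F = rho * g * V."""
    return RHO * G * submerged_volume

# A 10 m wide wall struck by a 5 m deep bore moving at 10 m/s:
print(f"drag:        {drag_force(10.0, 5.0 * 10.0) / 1e6:.1f} MN")
print(f"hydrostatic: {hydrostatic_force(5.0, 10.0) / 1e6:.1f} MN")
print(f"buoyancy:    {buoyancy_force(5.0 * 10.0 * 10.0) / 1e6:.1f} MN")
# Even this crude estimate yields forces in the meganewton range,
# which is why explicit design guidelines matter.
```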

Not your grandma’s network

The problem

THE WORLD IS more interconnected than ever before. Social networks, the global economy, the Internet, and even delivery routes can all seem like a jumbled mess. Nowadays, it is common to see complex nets of relationships everywhere we look.
The simplest network we can imagine is life as an employee on a production line: the neighbour on our left passes us some widget, we add our component, and we pass it on to the neighbour on our right. It isn’t a web at all; it’s just a chain.
Now imagine we work in a more complicated factory, where we can get different widgets from multiple neighbours. In fact, even coworkers far from our workstation can toss us widgets. To make matters worse, the foreman lets us wander off and work at any point on the production line we want! What a disaster. We’d be doing a random walk on a random network while receiving random input to deal with.
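That scenario, a random walk on a random graph, is easy to simulate. A minimal sketch; the network size, connection probability, and walk length are arbitrary choices for illustration.

```python
# The wandering worker: a random walk on a random network.
# Graph size, edge probability, and walk length are arbitrary.
import random
import networkx as nx

# An Erdos-Renyi random graph: 50 workstations, each pair of
# stations connected with probability 0.2.
G = nx.gnp_random_graph(50, 0.2, seed=1)

def random_walk(graph, start, steps):
    """Hop from workstation to neighbouring workstation at random."""
    node, path = start, [start]
    for _ in range(steps):
        node = random.choice(list(graph.neighbors(node)))
        path.append(node)
    return path

# Two walks from neighbouring starting stations; Kaimanovich asks
# whether such nearby starts lead to similar or divergent output.
random.seed(1)
u, v = next(iter(G.edges))
print(random_walk(G, u, 10))
print(random_walk(G, v, 10))
```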

The researcher
Vadim Kaimanovich, a professor in the Department of Mathematics and Statistics, creates mathematical methods that can predict the nature of complex networks. His goal is to understand when the chaotic evolution of random systems can lead to stable and predictable output.

The project
Kaimanovich uses the analogy of a production line to ask: if we start the production line at a slightly different workstation—one that is close but not exactly the same—will we get a similar widget or something completely different? If the widget doesn’t change, the production is stable. If it’s different, the production diverges.
Scientists have noted many systems that seem as complicated as our crazy production line yet nonetheless produce stable output. Until now, however, there were no mathematically rigorous proofs that such stable solutions exist.

The key
Using a mathematically precise measure to decipher which widgets are similar and which are different, Kaimanovich demonstrated that certain sorts of abstract “random production lines” must have groups of workstations that give stable solutions for the same kind of random input. Not surprisingly, this is the first proof of its kind.

Fishy neurons

The problem
PARKINSON’S DISEASE (PD) deteriorates a patient’s central nervous system and debilitates motor skills. Doctors don’t know the cause of 90 per cent of PD cases, but they better understand the source of the other 10 per cent: heredity and genetics are the culprits in this type, called early-onset PD.
Surprisingly, the genes associated with PD are found in all kinds of life forms, including mice, yeast, and zebrafish. These genes play an important role in dopaminergic neurons, the special cells that control body motion and make dopamine, a chemical indispensable for transmitting signals between neurons.

The researcher
Marc Ekker, a biology professor at the U of O, works in the Centre for Advanced Research in Environmental Genomics to better understand the genetics of PD. Ekker genetically alters zebrafish, whose genes are simpler than those of humans but can still be associated with the disease, in order to further study the causes of PD.

The project
Since zebrafish are transparent, Ekker is able to genetically alter their neurons to fluoresce, enabling him to watch the destruction and regeneration of the dopaminergic neurons in the fishes’ brains while they are alive. He can destroy individual neurons with a laser blast or with poison, or he can genetically block the gene altogether, making it inactive for the fish’s entire life—essentially giving the zebrafish PD.

The key
Ekker looks at the genetically altered neurons in the brain and studies what they are doing to the fishes’ motion. Fish larvae whose dopaminergic neurons are destroyed have very limited motor skills, and young fish without dopaminergic neurons will not respond with evasive motion when gently poked. Ekker’s zebrafish share the same symptoms as PD patients. Zebrafish, however, can regenerate the neurons. We can’t.
They can do this because of stem cells. Stem cells are different from common cells because they aren’t committed to becoming any one type such as a blood cell or a neuron. While humans have only a limited number of stem cells, zebrafish make stem cells throughout their entire life. The fish can draw on their bank of stem cells to replace the neurons.
Lucky fish.

Mercury munching microbes (om nom nom)

The problem
THE MAJORITY OF MERCURY in the atmosphere is generated by human industries like coal combustion and gold mining. The rest comes from natural sources such as volcanoes and forest fires. Either way, when mercury is dispersed into the atmosphere, it is carried poleward where it is oxidized and becomes heavier, falling into sensitive Arctic regions as a toxic contaminant.
Mercury binds to proteins and accumulates in organisms: mercury compounds from the environment enter the Arctic’s food web when they are soaked up by the tiny microbes that form the ecosystem’s foundation. Since there’s nowhere for it to go, mercury is passed from prey to predator. Eventually, high levels of mercury accumulate at the top of the food chain—that’s us, friend.

The researcher
Alexandre Poulain, a professor at the U of O, studies how microbes alter the mobility and the toxicity of metals and metalloids in the environment. He focuses on aquatic systems in polar regions and ventures out into the Arctic to bring samples home for analysis in his lab.

The project
Anaerobic microbes, bacteria that don’t use oxygen, alter the nature of the mercury. Some make it more toxic by turning ordinary mercury into very harmful methylmercury. Others, by contrast, break down methylmercury, converting it into a gas that vents out of the ecosystem. Poulain compares the rates of these two processes by analyzing the RNA (ribonucleic acid) transcripts of the genes that control whether microbes create toxic methylmercury or detoxify their environment.

The key
Upon sensing mercury in their environment, certain northern microbes activate genes naturally encoded in their DNA. Poulain can determine which of these genes are active in biomass samples from polar regions and can even tell exactly which genes are needed to defend against the toxic nature of mercury.
Poulain’s goal is to bridge global-scale environmental science and microscopic biology. The reduction of Arctic mercury by tiny microbes plays a major role in regulating the toxicity of the Far North and could possibly be used in integrated approaches to environmental management.

Tomorrow’s butterflies

The problem
WORLDWIDE SHIFTS IN land use and global climate change are transforming the environment at a concerning pace. Only recently have scientists become aware of just how significant the impact of our actions has been. Average global temperatures have risen sharply over the past few decades, and natural habitats continue to be lost to conversion into agriculturally cultivated land.
Intuitively, it is clear that such intense environmental changes will have repercussions that increase extinction rates, but the world’s ecosystems are complicated, and predicting how species diversity responds to climate change is no easy matter. Improving conservation and recovering endangered species requires accurate predictions of future shifts in biodiversity.

The researcher
Jeremy Kerr’s lab, the Canadian Facility for Ecoinformatics Research, is located in the Biosciences complex on campus. There he researches changes in biodiversity across entire continents rather than in any one, local ecosystem. This means that he deals with enormous amounts of information, requiring him to be on the forefront of ecoinformatics, the science of information in ecology.

The project
In order to test whether he can accurately predict future changes in biodiversity over larger areas, Kerr pretended to go back in time. He used a macroecological computer model to predict gradients in butterfly diversity over the entire 20th century. By comparing the predicted richness in butterfly species to actual historical records of 139 species, Kerr was able to judge the predictive power of his model.
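The logic of the test, known as hindcasting, is simple to sketch: train a model on one slice of history, then grade its predictions against records it never saw. The toy version below uses placeholder data and an off-the-shelf regression model; Kerr’s actual macroecological model is far richer.

```python
# Hindcast validation, sketched with placeholder data: fit a
# diversity model on historical predictors, then test predictions
# against held-out records. Not Kerr's actual model or data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder grid cells with four predictors: climate, elevation,
# land cover, and human population density.
n_cells = 500
X = rng.random((n_cells, 4))
# Placeholder "observed" species richness: structure plus noise.
y = 30 * X[:, 0] - 10 * X[:, 3] + 5 * rng.standard_normal(n_cells)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Predictive power is judged on records the model never saw,
# mirroring Kerr's comparison against historical butterfly surveys.
print(f"R^2 on held-out records: {model.score(X_test, y_test):.2f}")
```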

The key
Starting from the year 1900 and inputting historical data sets on climate, elevation, land cover, and human population density, Kerr was able to accurately simulate how butterfly diversity changed across Canada throughout the 20th century. In northerly areas, butterfly diversity increased while at lower latitudes it decreased. This observation suggests that macroecological theory can indeed forecast where species will be found well into the future.
The ability to predict how species diversity will respond to climate change could improve conservation planning in the 21st century.

Goldfish on Prozac

The problem
WHEN YOU THINK of pollution, what jumps to mind? Heavy metals, BP oil spill, carbon tax? What about the words antibiotics, the pill, nicotine, or Prozac? These so-called pharmaceutical pollutants are seeping out of our medicine cabinets and into our rivers and lakes.
Drugs are only partially metabolized in your body; the rest is flushed down the toilet. To make matters worse, traditional sewage treatment plants fail to cleanse the water of these chemicals, allowing them to flow right into rivers and lakes.
Last year Canadians filled 483 million prescriptions (that’s 14 prescriptions per person and doesn’t count the large amounts of antibiotics given to livestock).
So what happens when all the fish in the pond are on Prozac?

The researcher
Vance Trudeau is a neuroendocrinologist at the U of O and the Centre for Advanced Research in Environmental Genomics. He studies how hormones control brain function and how, in turn, the brain regulates sexual development.

The project
Fluoxetine, sold under the trade name Prozac, can be found in the brain and liver tissues of wild fish and, just as in people, it increases fishes’ serotonin levels. To understand how the drug upsets sex hormone levels in wild fish populations, Trudeau studies normal goldfish, whose food intake, seasonal growth rates, and reproduction have previously been well studied.

The key
When Trudeau’s research group studied female goldfish injected with fluoxetine, they found that multiple genes in the brain were affected, causing a decrease in estrogen levels in the blood. Some of these genes are known to have an impact on the reproductive and social behaviour of fish. To make matters worse, fluoxetine affects the secretion of growth hormones, causing the fish to feed less and to become underweight.
To simulate the levels of Prozac detected in the environment, the team ran another test in which fluoxetine was added directly to the tanks of male goldfish. Trudeau’s team then added potent female sex pheromones to the water. This should have stimulated the healthy, normal males to release their sperm and fertilize the eggs. However, the male goldfish that had been exposed to the fluoxetine completely failed to release their sperm.
Poor goldfish.

Building biological barcodes

The problem
MEDICAL TESTS REQUIRED to diagnose diseases must be performed at specialized centres, causing long wait times and high costs. In addition, current analytical tools are limited to looking at only a handful of the biomolecules that signal the onset of diseases such as cancer.

The researcher
Michel Godin, an assistant professor in the Department of Physics at the U of O, dreams of making disease testing as easy as scanning a barcode. Godin is part of the Interdisciplinary Nanophysics Centre labs where he mixes physics, chemistry, and biology to engineer hand-held microfluidic devices for the health sciences.

The project
Microfluidic devices are the computer chips of the chemistry world. Medical lab technicians search for biomolecules associated with disease—also called biomarkers—the way you would do math on an abacus: one by one. Godin wants to design a device that can take less than a drop of blood, purify it, and identify the presence of hundreds of biomarkers within seconds. That kind of speed would resolve the earlier inconveniences of wait time and would also allow better statistics for analysis. The device would be smaller than your cell phone and potentially cheap enough to be used in developing countries. Bigger isn’t always better—at least when you’re talking about microfluidic devices.

The key
But how would Godin’s device tell the hundreds of biomarkers apart? Some microfluidic devices integrate ultra-sensitive detectors that push biomarkers through nanoscopic tunnels (or nanopores) capable of detecting single molecules as they pass. However, detecting molecules and telling them apart are two very different problems. While a nanopore might be able to detect biomarkers, it can’t distinguish between those that signal disease and perfectly normal biomolecules. To identify them, Godin wants to create a DNA scaffold—a long chain of single-stranded DNA that would attach specific biomarkers to unique spots along the chain. By threading the DNA through the nanopore, Godin could read which biomarkers are present in the blood—exactly like scanning a barcode.
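The barcode analogy can be made concrete with a toy decoder: if each biomarker binds at a known position along the scaffold, the positions of blockade events read at the pore reveal which biomarkers were in the sample. The positions, names, and readings below are invented for illustration; Godin’s device remains a design goal.

```python
# Toy DNA-barcode readout: biomarkers bind at known positions on a
# scaffold; reading blockade positions as the strand threads through
# the nanopore identifies what was present in the sample.
# All positions, names, and readings are invented for illustration.

SCAFFOLD = {
    120: "biomarker_A",  # e.g. a hypothetical cancer marker
    340: "biomarker_B",
    560: "biomarker_C",
    780: "biomarker_D",
}

def decode(blockade_positions, tolerance=10):
    """Map detected blockade positions back to biomarker identities."""
    found = []
    for pos in blockade_positions:
        for site, marker in SCAFFOLD.items():
            if abs(pos - site) <= tolerance:
                found.append(marker)
    return found

# A hypothetical read with blockades near sites 120 and 560:
print(decode([118, 563]))  # ['biomarker_A', 'biomarker_C']
```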

Truly unbelievable

Former on-campus researcher creates a media motion machine

THANE HEINS, SUPPOSED inventor of a perpetual motion machine, identifies with Thomas Edison, Nikola Tesla, Alexander Graham Bell, and the Wright brothers. Despite his lack of any university education, he compares himself to these heroes of science because, like each of them, he claims to have invented an unbelievable technology. But is there a difference?

Controversial Claim

Heins, whose company Potential Difference was recently asked to leave the University of Ottawa SITE laboratory it was occupying, claims that, using his discovery, “generators can now accelerate themselves… It’s a cancelling of the work-energy principle.”
The work-energy principle describes the conservation of energy for mechanical work: the work done is exactly equal to the change in energy. Any violation of this would call into question humanity’s entire understanding of the physical world—you can’t get something from nothing.
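In symbols, the principle is the standard textbook statement that net work equals the change in kinetic energy, from which it follows that no machine can put out more energy than it takes in:

```latex
% Work-energy principle: net work equals the change in kinetic energy.
W_{\mathrm{net}} = \Delta E_k = \tfrac{1}{2} m v_f^2 - \tfrac{1}{2} m v_i^2
% "More than 100 per cent efficient" would mean
% \eta = E_{\mathrm{out}} / E_{\mathrm{in}} > 1,
% i.e. energy appearing from nowhere, violating conservation of energy.
```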
Heins claims “Our generator can create power from no power. What that means is [that] it’s not a perpetual motion machine, but it is more than 100 per cent efficient. There’s a huge difference.”

Severely Skeptical

Not everyone sees the difference. Brian Dunning, the host and producer of Skeptoid, a popular weekly pro-science, anti-pseudoscience podcast, says in an email to the Fulcrum that “Heins has built another in a very long line of variations on electric motors, claimed by the inventors to be ‘over unity’ or ‘free energy’ machines, where more energy is produced than is put in. Think of pouring a litre of water into a measuring cup, and expecting to get two litres out. That’s not the way the universe works. It would be nice, but it just isn’t so. The basic laws of thermodynamics state that over unity machines are impossible, and all known experimentation supports that.”
Dunning, who has never seen Heins’ machine, sees problems even with its fundamental premise of more-than-full energy efficiency.
MIT-educated electrical engineer Seanna Watson also sees problems with the details of Heins’ experiments. Watson and a group of engineers from Ottawa Skeptics visited Heins’ lab in 2008.
“From what I could tell at the time, he was taking measurements and he was, for example, measuring volt-amps instead of watts, not taking into account phase differentials, and he was doing some rather odd math,” explains Watson about her doubts regarding Heins’ invention.
Watson made the results of the group’s investigation public through the Ottawa Skeptics website. She summarizes the skeptics’ disquietude, saying, “there seems to be people who do not have enough of a background to be able to look at what he is doing and see a problem with it … It’s a concern that he’s trying to dupe people. And when I say ‘dupe’ I have to be a little bit careful because I don’t believe that he is deliberately trying to deceive anybody. I think he really does believe in what he is doing, but I think that he is very badly mistaken.”

Still Invested

Despite the validity of the skeptics’ claims, not everybody has always been so apprehensive. According to the Dean of Engineering, Claude Laguë, the University of Ottawa’s Faculty of Engineering opened its doors to Heins at the request of the Ottawa Centre for Research and Innovation (OCRI), so that he might get Potential Difference on its feet. However, on March 1, after two years of providing Heins with lab space and access to the expertise of campus professors, the faculty asked him to vacate SITE, citing his claims of external funding and a lack of return from his lengthy residency.
“After two years, our assessment was that we had moved beyond what we consider the normal start-up period. The company had also indicated that they were expecting financing from external sources. Due to that change to the situation, we felt that it was no longer appropriate for the faculty to continue to provide resource to that company free of charge,” Laguë explains of the faculty’s decision.
Heins has claimed financial support from various individuals over the years. In a 2008 Ottawa Citizen article by Tim Shufelt, Heins claimed that a $15-million investment was offered by influential Oregon private investor Jacques Nichols. The Fulcrum contacted Nichols by email about his investment.
“I met Mr. Heins during the summer of 2008 and we discussed his company and its capital requirements. No offer to invest was made, and I heard nothing more,” says Nichols in response to Heins’ claim.
Currently, Heins is financed by a number of personal investors including Robert Clark, founder of VesCells, a company that treats heart disease by stem cell therapy, who optimistically expects to “be able to clearly see the returns,” and Kevin Thistle, president of Coppingwood Golf Club, who has already invested nearly $250,000 in capital.

Attaining Attention

Heins can attribute some of his investors’ attention to the notoriety given to him by the media. When energy and green technologies columnist Tyler Hamilton wrote about Heins, his article became the Toronto Star’s second most read online story of 2008.
“I think Heins used it to his advantage to try to get in the door because it gave him a bit of [a] profile… He benefited from that and he rode that exposure,” asserts Hamilton. Although Hamilton says his intention was never to create debate, the Star’s article gave a level of credence to Heins—and started its own chain reaction of perpetual media attention. Canadian Business wrote an article. Heins garnered a mention on Gizmodo, Slashdot, BoingBoing, Wired.com, and innumerable private blogs. The Internet was abuzz, and the Ottawa Citizen and the Toronto Star each devoted an article to all the attention he was getting.
Just this month, on the very heels of Heins’ exodus from campus, EV World published an article entitled “The Heins Effect,” in which tech editor Michael Brace’s admitted purpose was to lend Heins tenability. Brace writes that “[Heins] asked me to write this article because he’s hoping to change the public perception of his discovery.”
Dr. Riadh Habash, the U of O engineering professor who opened his lab to Heins, is not interested in discussing supposed controversy.
“We worked with him and we couldn’t prove his claims and, in science, to prove your claim you should be able to demonstrate that experimentally. In addition, you might write that in terms of a paper reviewed by others … When you do research in science you shouldn’t contact journalists.”
The role of journalism in scientific debate is an important one in modern society, and the degradation of that debate is a main concern of each of the skeptics approached by the Fulcrum.

Mixed Media

Robert Park knows all about public debate regarding scientific issues. Park, who spent 25 years in Washington representing the American Physical Society to politicians and the press, sees a critical problem with the media.
“Many people in the media who write science stories do not themselves have a real appreciation for the basic laws of science, so they are perfectly willing to violate the second law of thermodynamics. That doesn’t trouble them at all.”
Park says about five new perpetual motion machines are brought to his attention each year and he finds that astounding.
“Five perpetual motion machines a year? And you know, every one of those is a drag on the economy, but, worse than that, it encourages people to believe in this kind of mythology.” Dunning agrees with Park.
“The media is not engaged in the charitable act of educating people; they are engaged in the business of drawing attention … The problem is that the media is the main source of science information for most people, and viewers are offered little reason to suspect the information that’s reported might not be complete or correct. Such reporting erodes the already low level of public understanding of science, technology, and medicine.”
Béla Joós is not only the head of the Physics Department at the University of Ottawa, but also the editor of Physics in Canada, a monthly periodical published by the Canadian Association of Physicists. Physics in Canada reports on research findings, but also keeps physicists informed about important issues relevant to the scientific community.
“A newspaper’s true purpose is just announcing things, but their purpose is not in that sense critical analysis,” Joós says.
Joós does not necessarily see this as a fault, but does note the need for caution.
“Journalists do have a responsibility to not take as fact what is being proclaimed by one solitary voice.”
Joós points to the benefits of the peer review system in which fellow researchers in the same field are asked to evaluate scientific work before it can be published in reputable journals.
“Nobody can be a specialist in everything, so peer review is essential to make sure that the proposed new results have followed the scientific method of reproducibility, quality of data or error calculation, and spurious effects which may explain the data which are not being accounted for … Peer review manages also to identify questionable steps which have been taken or questionable assumptions that are not based on reality.”

Sayonara Science

Heins has had more success with the media than with scientific journals. According to Heins, “People were more critical than they should have been,” and so he has chosen to focus on the mass media rather than the scientific community.
“My initial approach was the scientific approach. Have it evaluated, have it legitimized, go through the scientific route, but we hit a wall—we hit a wall that you couldn’t get over.”
And so with no discernible support from the academics on campus, Heins continues his “letter writing campaign” to Maclean’s, National Geographic, the CBC, and whoever will listen—even the Fulcrum.