Multidisciplinary Research: Pros and Cons
By bringing together experts from different disciplines, we can find solutions to today’s global challenges. Having spent a year in a multidisciplinary research group, Mit Bhavsar shares his thoughts on the advantages and disadvantages of multidisciplinary research in science.
The increasing popularity of mixed scientific disciplines like mechatronics, bioinformatics, biomedical engineering and biophysical chemistry is evidence of the importance of multidisciplinary research. And, based on the number of multidisciplinary conferences taking place around the world, it seems that many policymakers agree that bringing together scientists from a variety of backgrounds is a crucial part of fixing the world’s problems.
Going multidisciplinary does not mean leaving your own skills behind; it means heading in new scientific directions while drawing on your own specialties. I completed a neurophysiology PhD in a monodisciplinary research group. Now, I’m working as a postdoc in a multidisciplinary research group in the field of regenerative medicine. Here are the advantages and challenges as I perceive them.
One problem I’ve found with a monodisciplinary research group is a lack of creativity in working out what kind of work can be done. A multidisciplinary group combines the expertise of your field with that of others to create a varied team, and such a combination can lead to creative, high-impact research. For example, my lab is working on tissue regeneration and repair through electrical stimuli; this kind of research typically requires expertise in both medicine and electrical engineering.
For me, the most attractive part of multidisciplinary research is being able to work on projects that involve more than one discipline of science. This has meant honing my existing skills and learning a whole lot more from scientists I’d never previously had a chance to interact with. What’s more, because I’m the only expert in my field in my group, I can work independently to address problems as they arise.
Multidisciplinary research also leads to unexpected scientific advances. A lot of great science has come from the robust interactions of researchers from different fields. A good example of this is the development of magnetic resonance imaging by Paul Lauterbur (a chemist) and Peter Mansfield (a physicist), for which they were awarded the 2003 Nobel Prize in Physiology or Medicine. Researchers designing and conducting their own separate experiments in isolation would never have had these opportunities.
One of the common challenges of working in a multidisciplinary research group is the lack of a “common language.” It’s hard to find a way to start working on a problem when everyone has been trained to approach it from a different direction. For me, this makes it difficult to discuss ideas with team members and get the right feedback. This problem feeds into a feeling of loneliness: I’m surrounded by lab mates, but I’m the only one working on this particular problem in this particular direction in my lab. Another issue is the lack of meaningful criticism and evaluation of your work: your ideas and suggestions are either accepted without question or rejected without constructive feedback.
If you can deal with these challenges, it can be very rewarding to do multidisciplinary science. To facilitate multidisciplinary research, universities and research institutes should encourage interaction between different disciplines where scientists can meet, share ideas and discuss problems.
Mit Bhavsar is a researcher living and working at the Frankfurt Initiative for Regenerative Medicine (FIRM) in Frankfurt, Germany. You can contact him at: email@example.com
Breaking the Curse on Science
Open data can help us avoid inherent biases in our work, says Better Science through Better Data writing competition winner Ayushi Sood.

Recently, an economist friend told me that “scientific inquiry is inherently cursed.” At first I was offended. But I had to agree after he elaborated further: science today suffers from something economists enigmatically call the “winner’s curse”.

The first scientific journals were print editions, something akin to a printed memo, circulated among researchers to update them on the findings of others in the field. To submit a paper for publication, only the observations required to prove the results needed to be included in a manuscript, and rightly so: if every paper included every shred of data, journals would run to thousands of pages. This means, though, that what was communicated to the scientific community was only a fraction of what could have been communicated: only the observations that were ‘winners’, the ones which best supported a result, would be presented, and the others would be effectively relegated to obscurity.

Although we’re not limited by paper and page counts today, the same pattern of data use continues. This leads us to the problem of the winner’s curse: through the process of selection, the ‘winning’ observation oversells itself. In economics, the winner’s curse refers to auctions in which the winner tends to overpay, because the actual value of the product is closer to the average of the bids than to the highest bid. In scientific research, the curse takes hold when scientists aim for publication in the most selective journals, where the most impressive results are favored. This ignores all the other results, the ones that weren’t so impressive, while giving disproportionate importance to the ‘winning’ result.

The problem with this phenomenon isn’t immediately evident: isn’t the result what actually matters?
The data is, after all, just a tool, necessary only to prove what’s important: the conclusion. In looking for conclusions in data, however, researchers can forget to ask: “does the conclusion effectively justify my repeated sampling of the real world?” In other words, is reality accurately reflected by the dataset presented?

All the observations we take, whether they are inconclusive, negative, or ‘winners’, represent an analysis of the natural world. When we report only the ones that work, the other sampling efforts cannot be used by anyone else. This process confers on a small, selected number of observations the authority to predict an unpredictable future! Back in the auction house, this would mean the value of the product being set only by the winning bid. When we report only the best set of data, we relegate the less impressive observations to obscurity, even though these also represent an analysis of the real world, with real potential to inform.

So what does this mean for us? How should scientists avoid falling into the trap of the winner’s curse? One way would be to save, store and share all data, not just positive results. We are only human; by making our data openly accessible, we can avoid internal inconsistencies, and the smallest of mistakes would be corrected by fresh eyes poring over the very same data.

More importantly, open data could prove to be a shot in the arm for scientific inquiry as a whole. What data I find important may be perfect for my study, yet a small cluster of ignored numbers in my dataset could lead to a breakthrough for someone else, possibly in a way that I could never have imagined! Gene expression data in cancer cells could provide insights into cell signaling pathways in neurodegenerative disorders. Algal bloom observations in polluted lakes could help in effective biomass production for algal biofuel.
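The statistical heart of the winner’s curse can be seen in a small simulation (the numbers below are illustrative, my own, not from this article): draw many noisy estimates of the same true value, report only the largest, and the ‘winning’ observation systematically overshoots what all the data together would say.

```python
import random

random.seed(42)

TRUE_VALUE = 100.0  # the real quantity every study is trying to measure
NOISE_SD = 10.0     # measurement noise in each individual study
STUDIES = 20        # independent noisy estimates per round
TRIALS = 1000       # repeat the whole exercise many times

winner_total = 0.0
average_total = 0.0
for _ in range(TRIALS):
    estimates = [random.gauss(TRUE_VALUE, NOISE_SD) for _ in range(STUDIES)]
    winner_total += max(estimates)             # only the 'winner' gets published
    average_total += sum(estimates) / STUDIES  # what the full dataset says

print(f"true value:       {TRUE_VALUE:.1f}")
print(f"mean of all data: {average_total / TRIALS:.1f}")  # stays close to the truth
print(f"mean 'winner':    {winner_total / TRIALS:.1f}")   # consistently inflated
```

Averaging every observation recovers the true value; averaging only the published ‘winners’ does not. That inflation is exactly the bias that sharing full datasets would let fresh eyes detect and correct.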
The analysis and application of open data could usher in a new age of scientific connectivity, with the available knowledge transcending traditional discipline boundaries in a way never seen before.

Well, if it’s so good, why hasn’t open data been the norm since science began? We come back to the thousand-page journal here: the question wasn’t one of why not, but of how. Transmitting every single byte of data through papers and talks was impossible before the advent of computers and the emergence of the internet in the 1990s. In 2017, however, we have the tools at our disposal to store, parse, organize and retrieve every single digit. The burgeoning field of data science and analysis is ours to exploit, just waiting to script the next scientific success story.

So, I have to hand it to the economists on this one: the winner’s curse is alive and kicking in science. But, like any good scientist, I’m thinking of solutions, and every clue suggests that open data, accessibility and collaboration could be just the spell that breaks this curse.

Ayushi Sood is an undergraduate microbiology student at Amity University, India. Her interest in what makes life tick made her fall in love with bacteria and astrobiology, and her passion for making scientific research more efficient and accessible led her to explore bioinformatics. She has been a part of research projects investigating nanoparticle-plant interactions, transgenic algae, and bacteria-algae associations. Ayushi enjoys dance, writing, and functional DIY craft. You can follow her work on Bitesize Bio and connect with her on LinkedIn or Facebook.
A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play
The game of chess is the longest-studied domain in the history of artificial intelligence. The strongest programs are based on a combination of sophisticated search techniques, domain-specific adaptations, and handcrafted evaluation functions that have been refined by human experts over several decades. By contrast, the AlphaGo Zero program recently achieved superhuman performance in the game of Go by reinforcement learning from self-play. In this paper, we generalize this approach into a single AlphaZero algorithm that can achieve superhuman performance in many challenging games. Starting from random play and given no domain knowledge except the game rules, AlphaZero convincingly defeated a world champion program in the games of chess and shogi (Japanese chess), as well as Go.
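The principle the abstract describes, learning a game from scratch purely through self-play given nothing but the rules, can be illustrated with a deliberately tiny sketch. This is not AlphaZero (no neural network, no Monte Carlo tree search): it is a minimal tabular self-play learner for a simple take-away game, with all names, the game, and the parameters chosen by me for illustration only.

```python
import random

random.seed(0)

# Toy game: a pile of stones; players alternately take 1-3 stones,
# and whoever takes the last stone wins.
PILE = 10
ACTIONS = (1, 2, 3)

# value[s] = estimated probability that the player to move at a pile of
# s stones will win. A pile of 0 means the player to move already lost.
value = {0: 0.0}

def pick(stones, epsilon):
    """Epsilon-greedy move: mostly pick the move that leaves the
    opponent in the lowest-value state, sometimes explore randomly."""
    legal = [a for a in ACTIONS if a <= stones]
    if random.random() < epsilon:
        return random.choice(legal)
    return min(legal, key=lambda a: value.get(stones - a, 0.5))

def self_play(epsilon=0.2, lr=0.1):
    """Play one game against itself and update the value table from
    the final outcome (a simple Monte Carlo value update)."""
    stones, trajectory = PILE, []
    while stones > 0:
        trajectory.append(stones)
        stones -= pick(stones, epsilon)
    outcome = 1.0  # the player who made the last move won
    for s in reversed(trajectory):
        v = value.get(s, 0.5)
        value[s] = v + lr * (outcome - v)
        outcome = 1.0 - outcome  # alternate: the previous mover is the opponent

for _ in range(20000):
    self_play()

# Under optimal play the mover wins unless the pile is a multiple of 4.
print(value[8])   # losing position for the player to move: learned value is low
print(value[10])  # winning position: learned value is high
```

Starting from random play, the table discovers on its own that multiples of four are losing positions; the same loop structure, with the table replaced by a deep network guided by tree search, is the shape of the approach the paper generalizes.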
How Indian biotech is driving innovation
Bolstered by government support, a wealth of investment and an eager graduate workforce, the country’s biotechnology industry is booming.

Anu Acharya was in her twenties when the human genome was first mapped in its entirety. In 2000, the young Indian entrepreneur was just breaking into the biotechnology arena with her first start-up, the genomics and bioinformatics company Ocimum Biosolutions in Hyderabad. She saw the Human Genome Project’s achievements as opening up a new world of possibilities in personalized medicine, informed by an individual’s genetic profile and predispositions. At the time, however, the field of genomic medicine was dominated by Western science. “I wanted to make sure that India had its own voice heard in that,” Acharya says. So, a decade later, she launched her second biotech start-up, the molecular-diagnostics company Mapmygenome, also in Hyderabad, to bring the personalized-medicine revolution to India’s diverse population. “Because, ultimately, when you’re making medicine precise, it has to be for specific individuals and populations rather than based on one population that has been studied.”

Acharya is among India’s rapidly growing ranks of biotechnology entrepreneurs and start-ups that are riding a wave of government enthusiasm, free-flowing venture capital and growing demand from an increasingly wealthy population that wants better treatment options. These factors are helping to drive India’s biotechnology industry beyond its historical focus on unbranded generic drugs and into the innovation limelight.

By the end of 2016, there were more than 1,000 biotechnology start-ups in India, and more than half of these had been established within the previous 5 years. Australia, by contrast, has 470 biotechnology companies and the United Kingdom 3,835. The biotechnology industry in India was valued at US$11 billion in 2016, and is forecast to grow to $100 billion by 2025.
More than half of the biotechnology start-ups are in the medical arena (diagnostics, drugs and medical devices), but 14% are in agricultural biotechnology, 3% in bioindustry, 1% in bioinformatics and 18% in biotechnology services.

India is already eyeing the prospect of its first biotechnology ‘unicorn’: a start-up valued at more than $1 billion. The potential unicorn in question, Biocon in Bangalore, started in 1978 as an enzyme manufacturer but is now making a name for itself in the research and development of biological drugs for treating diabetes, cancer and autoimmune diseases. By March 2018, its revenue had topped $650 million.

India has long been a global player in the manufacture of generics (unbranded versions of existing pharmaceutical products), accounting for 20% of global exports of generics and earning just over $17 billion from that market in 2017. So what has prompted the nation to move beyond such a lucrative comfort zone and into the riskier game of biotechnology innovation?

Government support

In 1986, with the encouragement of then-prime minister Rajiv Gandhi, India became one of the first countries in the world to have a government unit dedicated solely to biotechnology. The Department of Biotechnology started with a relatively modest budget of between 40 million and 60 million rupees ($557,000–835,000), growing to 24.1 billion rupees in 2018. In addition to establishing 17 Centres of Excellence in Biotechnology at institutes and universities around the country, the department has supported the creation of 8 biotechnology parks, or incubators, in Lucknow, Bangalore, Hyderabad, Chennai and Kerala. The aim of these parks is to provide facilities for scientists and small to medium-sized enterprises (SMEs), where they can develop and demonstrate their technologies and even build pilot plants. The hope is that this will speed up the commercialization process.
The park staff also provide mentorship and guidance on issues such as intellectual property, business plans, proposals for clinical development and exit strategies. This support is helping to address some of the logistical challenges that have hampered industry in the past, says Tej Singh, a biophysicist at the All India Institute of Medical Sciences in New Delhi and president of the Biotech Research Society, India. “They created some sort of industrial regions in many areas, but there were issues like electricity, water [supply]; all these small things used to take time,” Singh says. “But the government has addressed these things nowadays; this current government particularly is very proactive.”

The Department of Biotechnology has also supported biotechnology research infrastructure, including a high-resolution mass spectrometry facility in Mumbai, flow-cytometry, imaging and microarray facilities in Delhi, and animal-house facilities in five other regions.

The jewel in the departmental crown, and the scheme that attracts the most attention, is the Biotechnology Industry Research Assistance Council (BIRAC). This is a not-for-profit, public-sector enterprise that was set up by the Department of Biotechnology in 2012 to “stimulate, foster and enhance the strategic research and innovation capabilities of the Indian biotech industry, particularly start-ups and SMEs”.

“The idea of forming BIRAC was to support the innovation ecosystem in India, and to nurture innovators from academia and industry to work independently or together,” says Shirshendu Mukherjee, mission director of the Program Management Unit at BIRAC. Mukherjee says India has always excelled at basic research but has faced challenges in translating that into commercial outcomes. BIRAC’s mission is therefore to “take innovation from the bench to the bedside, from the lab to the field, from the desk to the market”, he says.
In just six years of existence, BIRAC has supported 316 start-ups, which have generated $125 million through 122 products and technologies, including a cattle-feed supplement, a new process to manufacture human albumin and immunoglobulin, microfluidics-based diagnostics and a rapid test for malaria. Its initiatives include ‘biotechnology ignition grants’ of up to 5 million rupees for start-ups and entrepreneurs to take a proof-of-concept through to the first major step on the path to commercialization. Another is a ‘glue grants’ scheme, which connects clinical-science departments with those for basic science in institutes and universities in the hope that this will encourage partnerships and collaborations.

BIRAC has also joined forces with the Bill & Melinda Gates Foundation in Seattle, Washington, on the Grand Challenges India initiative to tackle global health and development problems. “I always call my Grand Challenges programme ‘in India, for India and beyond’,” says Mukherjee. “So we will do it in India, we will validate it in India, we will use it in India, our citizens will use it, and then if it goes beyond India we are happy to do that.”

Consumer demand

A similar motivation is driving at least some of the scientists and entrepreneurs such as Acharya, who get into the biotech space because they feel that Western biotechnology isn’t necessarily addressing the needs of the Indian population. One example is Vivek Wadhwa, a technology entrepreneur at Harvard Law School in Cambridge, Massachusetts, and at Carnegie Mellon University’s College of Engineering at Silicon Valley, California, who has invested in Indian medical-diagnostics company HealthCube in New Delhi. “I did a big study on the pharmaceutical industry in India, and I concluded that Western companies were not addressing Indian disease because it wasn’t profitable enough for them,” Wadhwa says.
But as the cost of technologies such as genome sequencing and medical sensors comes down, Wadhwa says, it has now become viable for Indian biotechnologists to harness these advances for the Indian market.

And what a market India is for these innovations. The country’s population is 1.36 billion and rising, and health care is one of India’s fastest-growing sectors, driven by higher incomes and an increasing prevalence of lifestyle diseases, such as heart disease and stroke. By 2022, the health-care market in India is expected to be worth $372 billion. “People are finally realizing that the consumer, or the patient, actually has control over their own health,” says Acharya. The rising middle class wants better health and medical choices, and she says that’s one of the main drivers for investment in biotechnology research and development. For example, Biocon has developed the first recombinant insulin to be produced in India, and an antibody-based treatment for head and neck cancer. In 2017, Indian vaccine manufacturer Bharat Biotech in Hyderabad began the first clinical trials of its vaccine against the mosquito-borne virus chikungunya, which re-emerged in India in 2006 after 32 years and infected more than 1.4 million people.

Another major driver of the biotechnology boom in India is the accessibility of funding, from both government and private industry. In one 2016 report on biotechnology, India ranked only 49th out of 54 countries. But it scored particularly highly on the availability of venture capital compared to countries such as the United Kingdom, Australia and Canada. Acharya says that some of the investors who have made their fortunes in manufacturing generic pharmaceuticals are now investing in biotechnology. She says much of the capital investment in early-stage biotechnology is coming from India, whereas investment in medical devices is flowing from Japan, China and the United States. But late-stage investment is still an issue.
“A lot of early-stage start-ups are getting funded but I think the challenge is still the late stage,” she says. “It’s not just the first two to three years; it’s more how do you take it from start-up to scale-up? I think that’s the challenge in terms of getting to where we need to get in terms of biotechnology.”

Human resources

One thing India has plenty of is people. Recognizing that human capital can be a key resource for a nation not as well endowed financially as Western countries such as the United States or United Kingdom, the Department of Biotechnology implemented or supported various training initiatives. These include the Biotech Industrial Training Programme, set up in 1993 for recent graduates, and 12 Biotech Finishing Schools in Karnataka state to train Indian graduates and researchers in biotechnology. That programme “created a very large number of institutions or departments of biotechnology in institutions and also departments of bioinformatics”, says Singh. For example, in September, the state of Gujarat proposed India’s first university focused entirely on biotechnology.

“A decade or so ago, India didn’t have the engineers or scientists it does today — it’s been graduating them in droves,” says Wadhwa. “It has millions of technologists who now just need to be connected to the medical practice and they can be solving great problems.” Singh notes that these graduates aren’t waiting for a job to walk up and tap them on the shoulder; they’re taking matters into their own hands. “Graduate students who come out in large numbers from Indian institutes of technology and institutes of management are not looking for jobs so much; they create small start-ups and then they grow very fast,” Singh says.

Working in biotechnology in India does present its own unique set of challenges, says Acharya.
“Some operational things that you never have to think about in the United States you have to plan more in India, because a lot of times we are still importing the reagents and things like that.”

Red tape

Although the government of India is enthusiastic about supporting the biotechnology industry, Acharya says the regulatory process for getting products approved could be more streamlined. In agricultural biotechnology, the government’s Genetic Engineering Appraisal Committee has been working to make it easier for companies to get approval for genetically modified crop field trials from state governments. The drug approvals process in India has hit some rough patches in recent years, and the authors of a 2017 World Health Organization report suggested that innovation there could be outpacing regulation. Even the government’s own National Biotechnology Development Strategy for 2015–20 acknowledges that timelines and regulatory steps for biotechnology drug approvals are not user-friendly. It has proposed reforms, including the establishment of regulatory departments that are fluent in good practice in the clinical, manufacturing and laboratory arenas.

There are also concerns about the environmental impact of India’s pharmaceutical industry. An investigation in 2016 found “unprecedented” levels of pharmaceutical pollution in the water system of Hyderabad (C. Lübbert et al. Infection 45, 479–491; 2017), which is home to a significant proportion of biotech start-ups and generics manufacturers. However, as the US Food and Drug Administration reportedly steps up inspections of overseas pharmaceutical suppliers, environmental standards could be forced to improve.

Despite the challenges, there is palpable excitement about what lies ahead. “Right now, we are seeing the beginnings of a revolution in biotechnology in India,” Wadhwa says. Acharya is still fired with the same enthusiasm that propelled her into biotechnology nearly two decades ago.
“Any innovation in this space can actually impact lives,” she says. “That’s why I continue to be in it.” This article is part of the Nature Spotlight on Indian biotechnology, an editorially independent supplement. Advertisers have no influence over the content.