
This guest post is from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

Bioethics has long been recognised as an integral part of modern biology research. When the Human Genome Project began, James Watson dedicated 5% of its estimated $2.7bn budget to the Ethical, Legal, and Social Issues (ELSI) raised by the project. As funding trends change and demand for scrutiny of the field increases, the public increasingly needs to be aware of both the changes under way and the misinformation presented to them.

Above all else, research is shaped by funding. Traditionally, biotechnology was funded by government-affiliated agencies, venture capitalists, or deals with Big Pharma. Recently, as the Wall Street Journal noted, funding from the latter two has been dwindling since the 2008 financial crisis. This has led researchers to apply for grants from underused sources and, in the case of the UK, prompted the creation of incentives to bridge the funding gap.

Recognition of the commercial applications of biotechnology has led to the formation of the UK’s Synthetic Biology Leadership Council, whose roadmap aims for an “economically vibrant” biology sector “of clear public benefit.” Particular emphasis is placed on the point that “public acceptability…cannot be adequately dealt with through communication with the public.” Professor Joyce Tait serves on the council and examines the social impact of scientific developments at the Innogen Centre at the University of Edinburgh. “I think what the public needs to be educated about is judging the quality of the evidence that’s presented to them,” she says when contacted. “There’s a terrible tendency in this area for any group, no matter what their motivation, to bias the evidence that’s around out there to suit their case … especially when it becomes the potential for conflict.”

For Peter Pitts, President and Co-Founder of the Center for Medicine in the Public Interest, communicating any research benefit to the public can be a problem. “Generally speaking, the only things you read about genetically modified foods – from those who actually think they’re a good idea – are extremely technical comments that are of use to almost nobody.” For Pitts, scientific jargon and rhetoric are preventing any meaningful public discussion with researchers. “I can sit in a room with scientists, I can talk shorthand and know exactly what’s going on, but everybody else is completely confused, like we’re speaking in some sort of ancient Latin or something.”

“Every explanation has to be a 94-slide PowerPoint presentation as opposed to the quick, obvious, media-savvy answer.”

Research potential may be further overstated by the press, skewing the public image of the research even more. In an interview with the Guardian, Professor Hilary Rose pointed to the paper’s editorial on new stem cell research (which suggested it may someday “make the blind see, the crippled walk, and the deaf hear”) as an example of the misinformation supplied to the public.

Henry I Miller of the Hoover Institution at Stanford University also feels that modern science journalism needs reform. “Specialty journalism is waning, and reporters often create a ‘moral equivalence’ between opposing views of an issue, even after one viewpoint has been discredited. Newspapers are failing, and people increasingly are using websites to become ‘informed’ about issues.” This sense of declining specialty journalism is echoed by Peter Pitts. “Years ago a journalist covering the FDA, for example, would have been covering the FDA for years, reading the ins and outs of what was going on. Today you’re talking to people who today are covering the FDA and yesterday were covering a baseball game.”

Anti-GM campaign groups are also a persistent, unregulated source of biased information for the public. There are, however, indications of public attitudes cooling, as seen in a public attitudes survey last year in which extreme pro- and anti-GM positions shrank and the indifferent middle ground grew. “I thought that was an excellent outcome,” says Prof Tait. “It was no longer a politically contentious issue for a lot of people.” She notes, however, that a tendency for anti-biotech lobbies to misreport the findings still exists. “If you actually look at the way… it’s still being represented by anti-GM pressure groups, they’re focusing on the one end of that scale and they’re not pointing out that if you look at the other end of that scale there’s an equal move in the opposite direction!”

The main risk of poor public attitudes can be seen reflected in the funding decisions made in government. “I think among politicians there’s kind of a fear of the fear of the public; in Europe in particular [there’s] a really strong concern to avoid a kind of public backlash against any particular technology. I think that’s been true for nanotechnology, which was a subject of concern about 4 or 5 years ago,” explains Prof Tait. “[Public opinion] seeps through to what governments will decide to fund, and that then feeds through to the opportunities there are for the scientists.”

For Peter Pitts, one way to address this is to open up social media. “I think the FDA should facilitate the use of social media by regulated entities – by pharmaceutical companies, etc. – to use it more robustly, and to send a green light that they want them to do that.”

“I think we can get people more excited and get more people into the fields of science relative to young kids in high school – for example, pursuing a career in science. I think that will probably help politicians support larger budgets for research …and it will also allow people to accept the benefits of the science – GM foods for example – much more readily than they do now.”

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms. Fintan may be contacted at fintan.burke2@mail.dcu.ie.

This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

VCs are starting to talk again about the ‘perception’ of scarce funding for early-stage bio-pharma:

At the New Paradigms meeting (a satellite of JPM Conference), a panel I was involved with discussed the perceived funding gap and whether great companies were still getting financing.  The unanimous view was that innovative new startups were continuing to attract capital.

Whenever I hear VCs’ affirmations that all the good, innovative startups are funded, or that the problem is optics, my antennae go up and I want to see the data.

In other words:  Show me the money – and exactly where it is being invested.

Fortunately we live in a world where VC investment is measured and analyzed obsessively – and not just by bio-pharma startups.

Rather than rely on a straw poll of self-selected VCs, we can consult 2012 data from the National Venture Capital Association’s MoneyTree report, which shows an overall 10% fall in venture investment in 2012 compared with 2011, and double-digit year-on-year declines for biotech (down 15%) and medical devices (down 13%).

Sadly, the MoneyTree report confirms that it is not the best of all possible worlds for early-stage life sciences and medical device companies.

So why the evident disconnect between the (anecdotal) views of funders and actual macroeconomic trends?

As the saying goes, where you stand depends on where you sit. As an independent life sciences consultant and – full disclosure – CEO / Managing Director of Amrita Therapeutics Ltd., an early-stage bio-discovery company seeking funding, my experiences are likely very different from those of VCs in the JP Morgan bubble.

From their perspective, life sciences may be trending positively, particularly given the compression of early valuations, where companies are saddled with much lower valuations than fifteen, ten or even five years ago, using the same metrics. This gives VCs and Pharma Venture Funds the opportunity to gain far greater leverage over small companies at lower capital commitments, essentially transferring value from founders to funders.

The really good news for funders is that at these lowered valuations, early stage biotech is a great investment opportunity, particularly given mounting evidence that the life sciences are a much better bet for IPOs than IT and Social Media.

For 2013, let’s hope that VCs can take a break from their Facebook accounts and make their own perceptions of greater early-stage funding a reality for more bio-pharma companies!

About the author:
President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-ups to the Fortune 100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries, including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv and Manila and at the Department of State in Washington DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

When I started working for PhRMA nearly 15 years ago, the mantra was “Fail early, fail cheap.”

Given the exponentially increased cost of advancing compounds from pre-clinical into clinical research and through pivotal Phase II and larger Phase III trials, it makes sense for companies to investigate as many compounds as possible in early-stage pre-clinical research and then cherry-pick candidates for clinical trials based on a well-developed understanding of each compound’s structure, toxicity and other key characteristics.

An R&D program that fails at the pre-clinical stage is far less costly than one that makes it through the Investigational New Drug (IND) application and into clinical trials, only to tank due to lack of efficacy or safety. So why are bio-pharma companies taking the opposite tack, investing huge sums in late-stage compounds only to see faltering results in late-stage clinical trials?

Why are companies no longer ‘failing early’?

The same factors driving bio-pharma M&A strategy motivate companies to acquire late-stage research assets to fill depleted pipelines. And cash-rich bio-pharma companies are competing for a limited pool of late-stage programs, bidding up the cost of acquisition (perhaps at times also hindering full due diligence).

In theory, these assets are lower-risk than early-stage programs because they have reached the clinical trial stage. In practice, this has resulted in 30% failure rates at the Phase III clinical trial stage, with a further 50% attrition rate between the clinic and the marketplace, where “peak sales projection is more art than science, and the art often looks rather comical in retrospect.” In sum, only about one third of launched drugs make back their R&D costs.

Good in theory, bad in practice …

It may be time to recognize that, in terms of net present value, later-stage compounds are not lower-risk than pre-clinical programs once Phase III trial costs, the likelihood of failure at Phase III (or before launch), and more realistic revenue projections are factored into the valuation of late-stage assets.
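
To make this arithmetic concrete, here is a minimal sketch of a risk-adjusted valuation in Python, using only the failure rates quoted above (30% Phase III failure, 50% attrition between clinic and market). The cost and revenue figures are hypothetical placeholders, not industry data.

```python
# Risk-adjusting a late-stage asset's value: a minimal sketch.
# Failure rates are those quoted in this post; dollar figures
# are hypothetical placeholders.

PHASE3_SUCCESS = 1 - 0.30   # ~70% of Phase III programs succeed
LAUNCH_RATE    = 1 - 0.50   # ~half of clinic-stage survivors reach market

def risk_adjusted_value(revenue_npv, phase3_cost, acquisition_cost):
    """Expected value of acquiring a Phase III-ready asset ($M)."""
    p_launch = PHASE3_SUCCESS * LAUNCH_RATE   # ~0.35 end to end
    return p_launch * revenue_npv - phase3_cost - acquisition_cost

# A $1bn projected revenue NPV shrinks fast once the ~35%
# end-to-end launch probability is applied:
print(risk_adjusted_value(revenue_npv=1000, phase3_cost=250, acquisition_cost=300))
# 1000 * 0.35 - 250 - 300 = -200.0 -> negative expected value
```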

Given the foregoing, taking a case-by-case approach to acquiring R&D programs at earlier stages of development would reduce overall risk and provide better long-run returns.

Failure will always be with us. With the ever-increasing complexity and cost of human clinical trials, bio-pharma would be better off taking the long view and at least failing earlier in the process, at a fraction of the cost!

About the author:

President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-ups to the Fortune 100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries, including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv and Manila and at the Department of State in Washington DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

It looks like the ‘new normal’ is not just for biotechnology startups and SME pharma companies – the CROs (and other vendors) who previously benefited from better R&D funding streams also feel the chill of the ongoing funding freeze.

Discussions with CROs and research consultants in the US and abroad reveal concerns about lengthy delays between the initial discussion of research and final approval, reduced budgets for research programs, and generally falling demand for CRO services. Their concerns are not just anecdotal. A recent National Academy of Sciences (NAS) study sees the U.S. Clinical Trial Enterprise in particular as being in decline:

There is ample evidence that U.S. trials are becoming more expensive (DeVol et al., 2011). Worse, 90 percent fail to meet enrollment goals, and additional evidence points to disillusionment among American investigators (Getz, 2005). The rate of attrition among U.S. investigators is increasing, even among experienced researchers with strong track records of productivity, while 45 percent of first-time investigators abandon the field after their first trial. The system has become so inefficient that even the NIH is offshoring clinical trials at a substantial rate (Califf, 2011; Kim et al., 2011), using taxpayer funding to conduct trials in countries with less expensive and more efficient CTEs, despite concerns about generalizability as noted above.

The EU has seen a 25% decline in registration of new clinical trials and has begun a legislative process to improve the research environment in Europe. More broadly, an interesting Canadian clinical trial survey, available here, shows a decrease in trials and related sites globally between 2008 (26,241 sites and 990 trials) and 2010 (22,358 sites and 760 trials). While finding increases in clinical trial activity in developing countries in Asia, the authors note the overall global trend of reduced clinical trial starts.
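
For what it is worth, the survey’s own numbers imply declines of roughly 15% in sites and 23% in trials over those two years; the quick calculation below (a Python sketch using only the figures quoted above) makes the trend explicit.

```python
# Declines implied by the Canadian survey figures quoted above
# (global clinical trial sites and trials, 2008 vs 2010).
sites_2008, sites_2010 = 26_241, 22_358
trials_2008, trials_2010 = 990, 760

def decline(before, after):
    """Fractional decline from 'before' to 'after'."""
    return (before - after) / before

print(f"sites:  {decline(sites_2008, sites_2010):.1%} decline")   # ~14.8%
print(f"trials: {decline(trials_2008, trials_2010):.1%} decline") # ~23.2%
```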

So the fundamental realignment of early-stage biotech valuation that makes life more challenging for start-ups and SMEs has also had unintended consequences for the Clinical Research Organizations (CROs) providing pre-clinical and clinical research services. And research budgets are falling across the board, as larger companies and even public research institutions face cost-containment pressures.

Given the critical importance of the clinical research enterprise for the generation of social and economic good, it will be interesting to see how policy makers respond, and whether the market will rebound if economic growth picks up.

About the author:

President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-ups to the Fortune 100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries, including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv and Manila and at the Department of State in Washington DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

It has now been more than a year since the FDA published a welcome revision of the 1987 guideline “Process Validation: General Principles and Practices” in early 2011. The new guideline is a complete rewrite of its previous incarnation, eliminating outdated practices, putting more emphasis on scientific evidence and completely redefining “process validation.” The update follows a wave of actions by the FDA to reinvent the way it tackles an aspect of life science that was initially poorly understood.

The original guideline was developed in response to concerns raised in the early 1970s that end-product testing alone was insufficient to assure process quality. While the guideline led to a more direct and formal protocol for process validation, its broad language left many companies to follow their own interpretation. For example, the 1987 guideline states that process efficacy should be validated by successfully producing a certain number of batches in a row, though it never actually specifies an appropriate number of trials (eventually the idea of “Three Golden Batches” emerged among companies). As expresspharma also pointed out, the guideline put pressure on manufacturers to maintain the process that created these “Golden Batches” without needing to understand, or even control, the parameters that caused them.

In other words, as long as a process was proven to work enough times, discipline in following the procedure took precedence over understanding it.

What followed were years of criticism of the FDA’s attitude to Good Manufacturing Practice (GMP) guidelines. As discussed at globepharm, the FDA routinely failed to update its GMP standards, claiming such standards were only the minimum required and thereby shifting the responsibility onto companies to keep their standards current. Though revisions were often promised, they never seemed to come to fruition.

This all changed with the “Pharmaceutical cGMPs for the 21st Century” initiative launched in 2002, when the FDA began its review of the guideline to improve the approach to quality control across the pharmaceutical industry.

The most obvious change in the 2011 edition is the new definition of process validation, which shifted from requiring only the end product as proof of a good process to checking process design consistently throughout the product’s life cycle to see whether standards are met. This made the Three Golden Batches concept obsolete, and a new measure of process evaluation was needed. To this end, the FDA also explicitly emphasised in its guidelines the need for a three-stage process evaluation: a process is first designed based on previous process knowledge, then proven to generate reproducible results, and finally routinely verified to ensure the whole process remains controlled.

This more holistic, life-cycle-based approach is seen as a massive improvement on the 1987 guideline, particularly since the wording now states explicitly what is expected of the industry while still being applicable to individual companies. Several new questions have surfaced, however. Having abandoned the Three Golden Batches concept, companies must reassess how many batches must be run to prove that standards are met. This, as Dr Mike Long et al discuss, is one area where the new guideline remains obscure; it merely states that enough tests should be run to statistically justify going into commercial production. Another problem is that the guideline clashes at some points with EU regulations. Apart from differences in the wording of some principles, Annex 15 (article 25) of the EU Guide to GMP also seems to recommend the Three Golden Batches method, directly contradicting the FDA’s efforts to quell the practice.
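
To see why “enough tests to statistically justify” sets a higher bar than three batches, consider the simple confidence-bound sketch below. It assumes each batch is an independent pass/fail event, a textbook simplification rather than the FDA’s prescribed method, but it shows the scale of the numbers involved.

```python
# How much does n consecutive passing batches actually prove? If n
# batches all pass, the 95% upper confidence bound on the true batch
# failure rate is 1 - 0.05**(1/n) (about 3/n for large n, the
# "rule of three"). Illustrative only; not the FDA's method.

def failure_rate_upper_bound(n_passing, confidence=0.95):
    """Upper confidence bound on failure rate after n straight passes."""
    return 1 - (1 - confidence) ** (1 / n_passing)

for n in (3, 10, 30, 60):
    print(f"{n:3d} passing batches -> true failure rate could still be "
          f"as high as {failure_rate_upper_bound(n):.1%}")
# Three passing batches only rule out failure rates above ~63%,
# which is why "Three Golden Batches" never amounted to statistical
# justification for commercial production.
```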

While changing its regulations, the FDA is also being proactive in how it enforces them. For example, a recent event held in Colorado allowed industry representatives to discuss with FDA representatives from that district some of the challenges encountered during inspections. In recognition of the increasingly multinational nature of product manufacturing, the FDA is also collaborating with the European Regulatory Network to monitor foreign companies in their own territories. This follows the FDA’s successful application to join the Pharmaceutical Inspection Co-operation Scheme (PIC/S) in November 2010, a lengthy review process in which each division of the FDA applied individually to PIC/S to gain admittance, furthering the agency’s modernisation and easing the pressure of establishing GMP guidelines.

There can be no question that the FDA has recently taken a turn for the better in modernising its approach to bioprocess regulation, an area it had initially failed to develop. Despite these improvements, criticism remains. It appears, too, that the FDA’s recent efforts are just the beginning of an international trend of reviewing bioprocess standards, with a major update to the European Commission’s GMPs for bioprocesses coming into effect early next year.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms. Fintan may be contacted at fintan.burke2@mail.dcu.ie.

For Third Rock Ventures' Mark Levin, personalized medicine is nothing new. Instead, he sees the history of pharma as a gradual homing in on the roots of disease, and genomic mapping is just the next step. By Damien Garde, Fierce Biotech. From the herbal remedies of early civilizations to the dawn of modern pharmaceuticals, researchers have been slowly drilling down, Levin said, personalizing treatments one step at a time. Now, with the help of sequencing, we can move beyond treating phenotypic effects and get to the heart of disease: its genotypic origins. That is, if we can all get along. Levin spoke to a packed house Tuesday at the Personalized Medicine Coalition's State of Personalized Medicine Luncheon in Washington, DC, and his message was simple: personalized medicine has the potential to revolutionize biotech and pharma, but only if stakeholders work together. Levin's had a long career in the industry, going from engineer and project leader at Eli Lilly ($LLY) and Genentech to CEO of Millennium Pharmaceuticals. Since 2007, he's been at the helm of Third Rock Ventures, a VC firm that invests in and builds innovative biotech companies. Through 36 years in the field, Levin said he's seen the great promise of genomics inspire minds around the industry, only to be slowed by the usual suspects: companies unwilling to collaborate, regulators reluctant to cooperate and researchers getting a little ahead of themselves in the news media. Now, however, as the cost of …

Part 2 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

Part 1 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

As researchers continue to investigate the complex nature of cell tissues and their behaviour, it is becoming increasingly apparent that conventional tissue culture methods such as Petri dishes and well plates are capable of giving only a fraction of the picture.

Over the last few years, there has been increased interest in novel approaches that allow cell cultures to grow in a 3D medium. Indeed, 3D culture boasts many benefits over conventional 2D media. In a 2007 Nature article, Pampaloni et al argue that 3D culture has the potential to represent a true in vivo cellular environment without the need for ethically questionable animal testing. This type of culture can also give better insight into cell architecture, signalling and mechanics, something already recognised in cancer research: a 2008 study by Fischbach et al showed that tumor cells grown in 3D culture “recreated tumor microenvironmental cues” and showed increased tumor vascularisation compared with 2D cell cultures.

Demand for 3D culture is expected to grow as researchers search for new approaches to cellular research while lessening the need for animal testing.  From this demand, several approaches have been taken to develop 3D culture methods:

One method offsets the natural sedimentation of cells in an aqueous medium by gently rotating the culture in an apparatus called a rotating wall vessel bioreactor. Cells are typically attached to microcarrier bead “scaffolds” to allow for 3D chemical responses in the bioreactor. Originally developed by NASA to examine microbial growth in microgravity, the method has the advantage of replicating the natural low-shear environment found in the body, which has been shown to influence a pathogen’s infection potential.

Another system employs magnetism to develop 3D tissue cultures. This method, termed magnetic cell levitation, uses loaded bacteriophages to “infect” the cells to be cultured with trace amounts of iron oxide and gold. The cells are then left to grow in a Petri dish while a ring placed on top of the dish subjects them to magnetic forces, causing them to hover in suspension. In a 2010 issue of Nature Nanotechnology, Souza et al argue that this method has the potential to “be more feasible for long-term multicellular studies,” as well as being easier to control and more cost-effective in research.

Recently, attention has turned to developing 3D culture media that require no external influence. Microtissues Inc. has developed a form of tissue culture that eliminates the need for scaffolds. The result, claims CEO Jeffrey Morgan, is that uniform cells are prepared more efficiently and with more consistent results than when scaffolds are used. The company also claims its 3D Petri dish maximises cell-cell interactions and allows cell size to be controlled.

These examples represent only a fraction of the new methods constantly being developed for 3D cell culture. As recently as last month, TAP Biosystems unveiled its newest collagen-based 3D cell culturing method for 96-well plates. This recent boom in development is undoubtedly due to the realisation that research using the now-conventional 2D culture is approaching its limits. Though 3D culture has the potential to become the method of choice for research into cancer and drug therapy, some issues remain. Modern microscopic imaging may struggle with the denser tissue samples, and a common standard needs to emerge to establish a unified research protocol. Should these concerns be addressed, there can be little doubt that 3D cell culture will emerge as a cheap, informative and dominant research method for years to come.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

One of the most overlooked yet persistent problems facing many governments is waste management. Despite healthy recycling attitudes in both the US and UK, an EPA report showed that total US waste production in 2010 was still around 250 million tons, while there are concerns that the UK will run out of landfill space by 2018.

For many years, the only viable alternative to landfill was incineration. Despite its efficiency compared with landfill (incineration can reduce waste mass by around 90%), concerns over its low energy-generation efficiency (estimated at 20-25%), as well as public protest over its environmental impact, mean incineration can never be a permanent solution.

As the public and private sectors shift their attention to cleaner, more efficient means of waste disposal, one of the leading candidates is gasification.

Gasification has existed in various forms since the 1840s. The process extracts combustible gases by subjecting dehydrated carbonaceous materials to intense temperatures and reacting the resulting ‘char’ with oxygen and/or steam. Originally coal and wood were used in the process, so it differed little from incineration. Since the 1970s, however, focus has shifted from these conventional inputs to biomass.

From this change in focus, several companies have been set up to offer biomass gasification as an effective renewable resource. One such company, Thermoselect, claims that for every 100kg of waste processed, 890kg of “pure synthesis gas” is created for energy generation. Another company, ZeroPoint Clean Tech Inc., is keen to demonstrate gasification’s use in generating renewable gas, heat, water and electricity.

This development has been embraced by both the US and UK governments, which welcome the opportunity to reduce their carbon footprints as well as municipal waste. In April 2011, the US Air Force Special Operations Command invested in a new plasma-based transportable gasification system, with the aim of reducing its waste output by 4,200 tons a year across air bases in the country. Later that year, Britain approved the country’s first advanced gasification plant, with the potential to generate 49 megawatts of renewable energy (enough to power around 8,000-16,000 US households). Some have even speculated that this new technology could spark a boom in hydrogen fuel-cell-powered vehicles in the future.
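
As an aside, the “households powered” figure depends entirely on the power allowance assumed per home. The short sketch below shows the conversion; all per-home figures are illustrative assumptions, not data from the announcement.

```python
# Converting plant capacity to "households powered". The article's
# 8,000-16,000 range for 49 MW implies roughly 3-6 kW per household;
# an average US home draws closer to 1-1.5 kW. All per-home figures
# here are assumptions for illustration.
plant_kw = 49 * 1000  # 49 MW

for kw_per_home in (1.2, 3.0, 6.0):
    homes = plant_kw / kw_per_home
    print(f"{kw_per_home:4.1f} kW/home -> ~{homes:8,.0f} households")
```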

Not everyone has embraced the new technique, however. A proposal for a biomass gasification plant in DeKalb County, Georgia was met with protests from locals fearing carcinogenic emissions. Furthermore, a 2009 report by the Blue Ridge Environmental Defense League warned that gasification shares many similarities with incineration, including the formation of pollutants and greenhouse gases.

Despite these arguments, the gasification of biomass has several benefits. The high operating temperatures make it an ideal means of processing biohazardous waste from hospitals, and the plants themselves occupy very little physical space. As with any emerging technology, however, uptake has been cautious and slow. Many of the new plants are in trial stages, and it is uncertain whether gasification will have any long-term environmental effects. Should the existing plants prove successful, there is no reason to doubt that gasification will become a realistic solution for environmentally sound energy generation.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms. Fintan may be contacted at fintan.burke2@mail.dcu.ie.