Guest content

Part 2 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

Part 1 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

As researchers continue to investigate the complex nature of cell tissues and their behaviour, it is becoming increasingly apparent that conventional tissue culture methods such as Petri dishes and well plates are only capable of giving a fraction of the picture.

Over the last few years, there has been increased interest in novel approaches that allow cell cultures to grow in a 3D medium. Indeed, 3D culture boasts many benefits over conventional 2D media. In a 2007 Nature article, Pampaloni et al. argue that 3D culture has the potential to represent a true in vivo cellular environment without the need for ethically questionable animal testing. This type of culture can also give better insight into cell architecture, signalling and mechanics, which has already been recognised in cancer research; a 2008 study by Fischbach et al. showed that tumor cells grown in 3D culture “recreated tumor microenvironmental cues” as well as increased tumor vascularisation compared with 2D cell cultures.

Demand for 3D culture is expected to grow as researchers search for new approaches to cellular research while lessening the need for animal testing.  From this demand, several approaches have been taken to develop 3D culture methods:

One method involves offsetting the natural sedimentation of cells in an aqueous medium by gently rotating the culture in an apparatus called a rotating wall vessel bioreactor. Cells are typically attached to microcarrier bead “scaffolds” to allow for 3D chemical responses in the bioreactor. Originally developed by NASA to examine microbial growth in zero gravity, this culture method has the advantage of replicating the low-shear environment found in the body, which has been shown to influence a pathogen’s infection potential.

Another system employs magnetism to develop 3D tissue cultures. This method, termed magnetic cell levitation, uses loaded bacteriophages to “infect” the cells destined for culture with faint amounts of iron oxide and gold. These cells are then left in a Petri dish to grow while a ring placed on top of the dish subjects them to magnetic forces, causing them to hover in suspension. In a 2010 issue of Nature Nanotechnology, Souza et al. argue that this method has the potential to “be more feasible for long-term multicellular studies”, as well as being easier to control and more cost-effective in research.

Recently, attention has turned to developing 3D culture media that need no external influence. Microtissues Inc. has developed a form of tissue culture that removes the need for scaffolds. The result, claims CEO Jeffrey Morgan, is that uniform cells are prepared more efficiently and with more consistent results than when scaffolds are used. Another company also claims its 3D Petri dish maximises cell-cell interactions and allows control over cell size.

These examples represent only a fraction of the new methods being constantly developed for 3D culturing of cells. As recently as last month, TAP Biosystems unveiled its newest collagen-based 3D cell culturing method for 96-well plates. This recent boom in development is undoubtedly due to the realisation that research using the now-conventional 2D culture is approaching its limits. Though 3D culture has the potential to become the method of choice for research into cancer and drug therapy, some issues remain. Modern microscopic imaging may struggle with the denser tissue samples. A common standard also needs to emerge in order to establish a unified protocol in research. Should these concerns be addressed, there can be little doubt that 3D cell culture will emerge as a cheap, informative and dominant research method for years to come.

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

One of the most overlooked but consistent problems facing many governments is waste management. Despite healthy recycling attitudes in both the US and UK, an EPA report showed US total waste production in 2010 was still around 250 million tons, while there are concerns that the UK will run out of landfill sites by 2018.

For many years, the only viable alternative to landfills was incineration. Despite its advantages over landfill sites (incineration can reduce waste mass by around 90%), concerns over low energy-generation efficiency (estimated at 20-25%), as well as public protest over environmental impact, mean incineration can never be a permanent solution.

As public and private sectors are beginning to shift their attention to cleaner, more efficient alternatives to waste disposal, one of the leading candidates is gasification.

Gasification has been with us in various forms since the 1840s. The process involves extracting combustible gases by subjecting dehydrated carbonaceous materials to intense temperatures and reacting the resulting ‘char’ with oxygen and/or steam. Originally, coal and wood were used as inputs, and so the process differed little from incineration. Since the 1970s, however, the focus has shifted from these conventional inputs to biomass.

From this change in focus, several companies have been set up to offer biomass gasification as an effective renewable resource. One such company, Thermoselect, claims that for every 100kg of waste processed, 890kg of “pure synthesis gas” is created for energy generation. Another company, ZeroPoint Clean Tech Inc., is keen to demonstrate gasification’s use in generating renewable gas, heat, water and electricity.

This development has been embraced by both the US and UK governments, which have welcomed the opportunity to reduce their carbon footprints as well as municipal waste. In April 2011, the US Air Force Special Operations Command invested in a new plasma-based transportable gasification system, with the aim of reducing its waste output by 4,200 tons a year in air bases across the country. Later that year, Britain approved the first advanced gasification plant in the country, with the potential to generate 49 megawatts of renewable energy (enough to power around 8,000-16,000 US households). Some have even speculated that this new technology could be used to spark a boom in hydrogen fuel cell-powered vehicles in the future.
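
As a back-of-envelope check, the quoted household range implies a continuous per-household demand of roughly 3-6 kW (49 MW spread across 8,000-16,000 homes). For comparison, average US residential consumption is on the order of 1.2 kW continuous; that comparison figure is an outside assumption, not from the article, but it suggests the quoted range is, if anything, conservative:

```python
# What continuous per-household demand does
# "49 MW powers 8,000-16,000 US households" imply?
plant_kw = 49 * 1_000  # plant output, 49 MW expressed in kW

for households in (8_000, 16_000):
    kw_per_home = plant_kw / households
    print(f"{households:>6} households -> {kw_per_home:.1f} kW each")
```

At 8,000 homes the implied demand is about 6.1 kW per household; at 16,000 it is about 3.1 kW.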

Not everyone has embraced the new technique, however. The proposal for a biomass gasification plant in DeKalb County, Georgia was met with protests from locals who feared carcinogenic emissions. Furthermore, a 2009 report by the Blue Ridge Environmental Defense League warned that gasification shares many similarities with incineration, including the formation of pollutants and greenhouse gases.

Despite these arguments, biomass gasification has several benefits. The high operating temperatures make it an ideal means of processing biohazardous waste from hospitals, and the plants themselves occupy very little physical space. As with any emerging technology, however, uptake has been cautious. Many of the new plants are still in trial stages, and it is uncertain whether gasification will have any long-term environmental effects. Should the existing plants prove successful, there is little reason to doubt that gasification will become a realistic option for environmentally sound energy generation.


About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms.  Fintan may be contacted at .

This is a guest post from Erin M. Hall. Erin is the Technical Leader at Genetica DNA Laboratories, Inc. located in Cincinnati, OH. Do you have a response to Erin’s post? Respond in the comments section below.

It is estimated that 18-36% of all actively growing cell line cultures are misidentified and/or cross-contaminated with another cell line (1).  For researchers in any field of biomedical science, this could mean that a significant amount of the experimental data published in current and past journals is of questionable value.  Every year, millions of dollars of public money are spent on potentially wasted research and this is happening not just here in the United States but around the world as well.

Cell line misidentification and cross-contamination have been around for more than 50 years.  The problem was finally brought to light in 1966 by Stanley Gartler, who reported that 19 supposedly independent human cell lines were in fact HeLa cells (2), which are known to be extremely robust, fast growing and able to contaminate other cultures via aerosol droplets.  There was much resistance to his findings; scientists did not want to admit that research done using those contaminated cell lines might be questionable and potentially irreproducible.  Walter Nelson-Rees was one scientist who supported Gartler’s findings.  Nelson-Rees named the papers, and the scientists, that were publishing experimental data using misidentified cell lines, and for this, in 1981, he lost his contract with the National Institutes of Health (NIH) because his behavior was deemed “unscientific” (3).  From 1981 onward, misidentification went largely unchecked, and even cell line repositories continued to distribute lines under their false names (3).

To exacerbate the problem, certain cell culture practices may be aiding cell misidentification and cross-contamination, including the practice of assessing phenotypic characteristics, such as protein expression, as the only way to identify the cell population.  It has been proven that phenotypic expression can change with increased passage number, or even with changes in growth medium or other cell culture conditions (4).  The modern way of assessing the correct identity of a cell line (“cell line authentication”) is to perform short tandem repeat (STR) DNA testing.  The STR DNA profile of a human cell line is similar to a person’s fingerprint; it is unique to that individual.  STR testing is now the “gold standard” of human identification testing and is routinely used by the FBI in profiling convicted offenders (CODIS).  STR profiling is a straightforward and effective way to confirm that the cell line you think you have been using for the past five years is, in fact, the genuine cell line.
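
For illustration, the allele-sharing comparison at the heart of STR authentication can be sketched in a few lines. The formula below is the Tanabe percent-match calculation commonly cited in authentication guidelines (a match of roughly 80% or higher is generally taken to indicate the same cell line); the locus names and allele values are hypothetical, not a real reference profile:

```python
def str_percent_match(profile_a, profile_b):
    """Tanabe-style percent match between two STR profiles:
    200 * shared alleles / (alleles in A + alleles in B),
    counted over the loci the two profiles have in common."""
    shared = total_a = total_b = 0
    for locus in set(profile_a) & set(profile_b):
        alleles_a = set(profile_a[locus])  # sets collapse homozygous repeats
        alleles_b = set(profile_b[locus])
        shared += len(alleles_a & alleles_b)
        total_a += len(alleles_a)
        total_b += len(alleles_b)
    return 200.0 * shared / (total_a + total_b)

# Hypothetical reference and query profiles (locus -> alleles);
# the query differs from the reference by one allele at D5S818.
reference = {"TH01": (6, 9.3), "D5S818": (11, 12), "vWA": (16, 18)}
query     = {"TH01": (6, 9.3), "D5S818": (11, 13), "vWA": (16, 18)}
print(f"{str_percent_match(reference, query):.0f}% match")  # prints "83% match"
```

A full implementation would restrict the comparison to the standard core loci and handle missing or off-ladder alleles, but the percent-match logic is as simple as shown.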

The problem persists today because it has not been properly brought to the attention of researchers.  Many researchers learn about authentication services the hard way, i.e. at the last minute, when a journal requests confirmation of authentication before considering an article for publication.  In a survey of 483 researchers who actively use cell cultures, only 33% authenticated their cell lines, and 35% obtained their lines from other laboratories rather than from a cell line repository such as the American Type Culture Collection (ATCC) (3).  We, as researchers, expect to use only the best reagents and supplies, yet the one component of the experiment that may matter most, the cell line itself, is consistently and inexplicably overlooked.  ATCC recommends verifying the identity of all cell lines before starting experiments, every two months during active growth, and just prior to publication.

The NIH now officially recognizes that cell line misidentification is a serious problem in the scientific community.  In a formal notice on its website (NOT-OD-08-017), it states that grant applications that fail to employ acceptable experimental practices will not be looked upon favorably and could fare poorly in review.  The NIH encourages all peer reviewers and researchers to consider this problem carefully “in order to protect and promote the validity of the science [they] support”.  Many journals, such as those published by the American Association for Cancer Research (AACR), require a statement in the “Materials and Methods” section as to whether the cells used in the submitted manuscript were authenticated; failure to authenticate the lines may keep an article from surviving peer review.   To advance the elimination of cell line misidentification and cross-contamination, ATCC in early 2012 released a set of guidelines written by the international Standards Development Organization (SDO) workgroup; these guidelines provide researchers with information on the use of STR DNA profiling for cell line authentication.  In the near future, with the help of these influential supporters, cell line authentication may well become a routine quality control check in laboratories in the United States and around the world.

I would love to hear other thoughts and comments on this topic.  Tell us about your experiences with cell line authentication – good or bad!

(1)   Editorial – Nature 457, 935-936 (2009).
(2)   Gartler, SM. Second Decennial Review Conference on Cell Tissue and Organ Culture: 167-195 (1967).
(3)   ATCC SDO Workgroup.  Cell line misidentification: the beginning of the end: 441- 448 (2010).
(4)   Kerrigan, L.  Authentication of human cell-based products: the role of a new consensus standard: 255-260 (2011).

About the author:

Erin is the Technical Leader at Genetica DNA Laboratories, Inc. located in Cincinnati, OH. She is responsible for the technical operations of the laboratory, as well as, all aspects of the daily casework involving DNA identity sample processing and quality assurance. She received her Master’s degree in Forensic Science from PACE University in NYC and her Bachelor’s degree in Molecular Biology from the College of Mount Saint Joseph in Cincinnati, OH. For more information on Genetica, visit or their website dedicated to cell line authentication, .

Prior to joining Genetica, Erin worked in New York City as a laboratory manager and researcher in the Pharmacology department at Cornell University’s Medical School. She designed and executed complex experiments that examined the effects of environmental toxins on liver enzyme production utilizing HPLC, UV/vis spectroscopy, Western blotting and PCR analysis. Her work contributed to several published journal papers (under Erin Labitzke, if you want to read them!), most recently as first author on a paper on enzymes present in mitochondria.

Erin may be contacted at

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

According to a BDO industry report, a small US biotech company in 2010 enjoyed average revenues of around $42m, while larger firms reported average revenues of around $124m. The European biotech sector also saw sizeable success, with revenues totalling €13bn the same year. Global biotechnology revenues are estimated to grow to €103bn by 2013, bolstered by the pharmaceutical market, which is expected to become a trillion-dollar industry by 2014.

These high revenues can attract more than just investors; smaller companies are seeing the benefits of asserting infringement of their own patents in order to obtain lawsuit settlements or licensing fees. Though better known in the technology sector, these ‘patent trolls’ have started to attract attention in biotech circles.

A standout case was that of Classen Immunotherapies Inc., which brought four biotechnology companies and a medical group to court for infringing its patent on an immunisation schedule that could curb the risk of developing chronic diseases. Although the lawsuit was first thrown out by the district court as covering only an abstract mental process, on appeal the federal court ruled in Classen’s favour, finding that Classen’s claims recited a “statutory process” eligible for patent protection.

This has set a troubling precedent in biotech law; since the Classen patents were somewhat broad, there could soon be a flood of similar companies claiming patent infringement based on immunisation or dosage schedules.

Indeed, there is evidence that some small firms are already building portfolios of biotech patents. These ’non-practicing entities’ deliberately gather patents not to develop products but to extort settlements or licensing fees from other companies. There are already specialized law firms that help companies obtain and enforce biotech-specific patents. Such entities have been known to damage stock prices, delay production and eat into revenues, all of it completely legal.

Many locate the root of these frivolous suits not in the vagueness of the patents but in unspecific patent legislation. In a 2006 column in The Scientist, Ronald I. Eisenstein notes that “One size does not fit all in terms of approaching patents.” Any legislation passed to curtail ‘trolling’ in the technology sector may inadvertently harm smaller biotech companies and universities that rely on larger companies in the FDA approval process.

In his 2008 book Intellectual Property and Biotechnology: Biological Inventions, Dr. Matthew Rimmer offers some solutions to this growing problem. “Novelty and utility are the criteria used to judge whether something is inventive or not” he writes. “It is really those doctrinal concepts that need to be tightened.”

In a 2011 Forbes article, Colleen Chien also offered some advice for defending against the trolls. She notes that many trolls use contingent-fee lawyers to manage costs. Defendant firms that likewise tie their lawyers’ pay to the successful disposal of a suit, or to minimised settlement costs, can reduce legal fees while sharpening their lawyers’ incentive to defend them. Furthermore, larger firms may be better off outsourcing their defence to specialist lawyers rather than relying solely on their own legal teams.

Patent trolls remain a very real problem in the world of technology. In the most infamous case, Research In Motion (maker of the BlackBerry) paid a $600m settlement to NTP Inc. for infringing its wireless email patents. Fortunately, steps have been taken at the federal level. The passing of the Leahy-Smith America Invents Act in September 2011 allows any firm threatened with infringement to petition for a patent review within 4 months of being sued. Nonetheless, the biotechnology sector must begin to reassess its patent rights and monitor such changes in legislation if it is to continue growing as an industry.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms.  Fintan may be contacted at .

This is a guest post by Jurgita Ashley

You are a small company ready to take a leap into the public market, whether it is for growth, liquidity or to attract greater investor interest. But, oh man, those dollar figures for IPOs would make anyone’s head spin. But wait, don’t discount it yet as a viable alternative. If done right, going public does not have to cost a fortune. Small companies can take advantage of the SEC’s relaxed reporting regime, may strategically decide to list on the Bulletin Board (the OTCBB) rather than the NYSE or NASDAQ, and can significantly limit their corporate governance-related expenses.

Your first major—and unavoidable—expense will be the preparation of a registration statement. Inevitably, it will require management’s time, preparation of audited financials and legal fees. You do not have to be charged, however, $1,000 per hour or other exorbitant fees, and your IPO team does not have to include 50 professionals. If you are a “smaller reporting company,” which is a company with a public non-affiliate common equity float of less than $75 million (or annual revenue of less than $50 million if the float cannot be calculated), your reporting requirements will be limited. Your registration statement – whether it is on a Form S-1 (involving an immediate capital raise) or a Form 10 (initial registration with the SEC to position the company for a subsequent capital raise) – will include less financial information and disclosures than is required for larger companies. In addition, this first registration statement is the “meat and bones,” so your subsequent filings will build upon this information and will involve much less drafting from scratch. This first registration statement will most likely be reviewed by the SEC, which will issue comments requiring one or more amendments. This review, however, will most likely be limited. With the Dodd-Frank and other legislative initiatives and demands on the SEC’s resources, the days of 100 plus comments are largely over. As long as your accounting is in order and your legal advice is good, you should be able to maneuver through the SEC’s comment process without excessive delays or expense.
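
The “smaller reporting company” test described above reduces to a simple check. This sketch uses the thresholds quoted in this post ($75 million public float, $50 million revenue); the SEC’s current definitions should be confirmed before relying on them:

```python
def is_smaller_reporting_company(public_float, annual_revenue=None):
    """Smaller-reporting-company test as described above:
    public non-affiliate common equity float under $75M, or,
    when the float cannot be calculated, annual revenue under $50M.
    Pass public_float=None when the float cannot be calculated."""
    if public_float is not None:
        return public_float < 75_000_000
    return annual_revenue is not None and annual_revenue < 50_000_000

print(is_smaller_reporting_company(60_000_000))        # True: $60M float
print(is_smaller_reporting_company(None, 40_000_000))  # True: no float, $40M revenue
```

A company that qualifies under this test files the scaled disclosures discussed in the paragraph above.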

Now, let’s say your primary goal is to obtain greater investor interest in the company and to create an avenue to sell stock. To achieve this, it is not necessary to pay the NYSE’s or NASDAQ’s listing fees or to become subject to their reporting and governance requirements. Although the OTCBB is usually not the market of first choice, it can be an effective vehicle to provide some liquidity and disseminate information about the company. To list on the OTCBB, a company only needs a market maker and to file reports with the SEC. By listing on the OTCBB, the company becomes subject to the oversight of FINRA, but there are no listing fees, no additional reporting requirements, and no special governance requirements. In addition, if down the road you are ready to transition to the NYSE or the NASDAQ, your platform will already be in place.

As a company that is listed on the OTCBB only, you are subject to limited corporate governance requirements (imposed by the SEC). Yes, some of your directors should be independent, committees should operate under board-approved charters and the company should have a code of ethics and reasonable internal controls, but all of these policies and procedures need not become all consuming. There is no need for a small company with limited financial resources to adopt all the latest “best practices” in governance or add a whole department to address the company’s new reporting obligations. Pursuant to recent SEC relief, “smaller reporting companies” also do not need to obtain—and pay for—an auditor’s report on internal control over financial reporting. Reasonable disclosure controls and procedures are important and, in some instances, improving the company’s policies and procedures is desirable and appropriate. In many cases, however, most new obligations of a small public company can be satisfied without exorbitant expense.

Nearly 50 percent of all public companies in the United States are “smaller reporting companies.”(1) Of course, not every small private company will find it desirable to go public, and for some, a full-blown (and expensive) IPO is the appropriate option. The perceived costs, however, should not discourage other companies from evaluating the option of a lower cost IPO. Access to the public markets is no longer out of reach.

Jurgita Ashley is an attorney in the Cleveland, Ohio office of Thompson Hine LLP and is a member of its Corporate Transactions and Securities practice group. Her practice is focused on public company matters, primarily securities law, corporate governance and takeover matters. She can be reached at or through The views expressed in this article are attributable to the author and do not necessarily reflect the views of Thompson Hine LLP or its clients. The author would like to thank Derek D. Bork, a partner at Thompson Hine LLP, for his review and invaluable input on this article.

(1) Forty-eight percent of all U.S. companies filing annual reports on Form 10-K with the SEC were “smaller reporting companies” for the period from October 1, 2009 through September 30, 2010, which is the SEC’s latest fiscal year for which data is available. Proxy Disclosure Blog by Mark Borges at (December 3, 2010), available at

This is a guest post from BiotechBlog reader Jack Lundee

Technology Continues to Fight Aids

In 2008, Sub-Saharan Africa was home to more than 22 million people living with HIV, and currently over 5 million Southern Africans are infected with the virus. Worldwide, upwards of 40 million people are infected with HIV, a frightening number. But with the coming of the 22nd annual World AIDS Day, it is important to take note of the progress that has been made in the fight against HIV/AIDS, and to recognise a couple of the major backers of HIV research and technology.

Granted, there have already been major advances in affordable microbicides and vaccines as preventative measures against the virus. Similarly, the introduction of low-cost antiretroviral drugs has allowed people already infected to lead longer, healthier and happier lives.

This can most certainly be attributed to organizations like the CGI (Clinton Global Initiative), which has put a tremendous amount of money into AIDS research. Known for his work raising money for hurricane and tsunami victims, former President Clinton, along with his close personal aide Doug Band, has also taken a great interest in tackling one of the deadliest sexually transmitted diseases in the world, HIV/AIDS. Back in 2006, Clinton helped open people’s eyes to the severity of the disease abroad by traveling deep into Burma with the crew of 60 Minutes.

Before this, however, he introduced CHAI (the Clinton Health Access Initiative), described as “a global health organization committed to strengthening integrated health systems in the developing world and expanding access to care and treatment for HIV/AIDS, malaria and tuberculosis.” Its main objective was to travel to third-world countries like Burma and distribute treatments that were not otherwise available to sufferers. Since its beginning, the organization has helped more than 2 million people gain access to the medicines needed for treatment. But the efforts of the former President and his close aide did not end there: the CGI continues to receive funding for HIV-related projects in places such as Southern Africa.

In their latest endeavor, they have joined forces with HP (Hewlett-Packard) to deliver technologies that capture, manage and return early diagnoses for infants. This translates to identifying the virus in an infant within one to two days, a huge improvement over previous paper-based systems. Why is this important? Newborns are especially susceptible to the disease, as their carriers can very easily transmit it, and it is crucial that they begin treatment as soon as possible to ensure survival; without it, they are typically unable to survive past age two. In a statement to the press, Clinton said, “I’m pleased HP’s technology and expertise will enable the partnership with CHAI to save the lives of more than 100,000 infants in Kenya each year, and in the process, demonstrate how the private sector can and should operate in the developing world.”

Within the first year, HP is expected to return HIV test results for nearly 70,000 infants in Kenya. The technologies introduced will also provide real-time medical data, viewable by health professionals across Kenya.

Known for its incredibly high number of HIV-positive citizens, Africa remains one of the greatest challenges for organizations like CHAI and the CGI today. Recent advancements in technology, combined with the efforts of Doug Band and former President Clinton, have helped lessen casualty rates and permitted people to live more productive lives. And although a cure remains unfound, HP and the CGI have taken great technological steps in the right direction towards eliminating the virus for good.

Jack Lundee is a writer for and . A graduate of the Newhouse School of Communications, he is an avid supporter of all things left and progressive.

This is a guest post from Keith Bradbury, Executive Director of Drug Information at Medco Health Solutions, Inc.

Biosimilar drugs to gain greater priority as decade progresses

The Patient Protection and Affordable Care Act will heighten the degree of competition in the field of biotech drugs, a fast-growing area of drug therapy that accounts for a growing portion of drug spending. The law creates a pathway for biosimilars, which are comparable versions of biologics and are also known as “follow-on biologics,” to enter the marketplace. These medicines could create a wave of lower-cost competition in the biotech industry starting in 2013, leading to savings of as much as 30 percent on some of the costliest drugs.

Biologic and recombinant drugs have been instrumental in treating a variety of conditions such as cancer, diabetes, immune deficiency, metabolic disorders, and autoimmune conditions, as well as rare medical conditions such as Pompe disease, Fabry disease and Gaucher disease. The difficulty of making these drugs, the absence of competition, and the small patient populations in which some of these drugs are used have made biologics among the most expensive drugs currently prescribed, with costs ranging from $6,000 to more than $400,000 annually.

The Congressional Budget Office had projected $25 billion in total savings from biosimilars between 2009 and 2018. Others have estimated substantially larger savings. For employers, health plans and patients, this could represent substantial relief from the double digit growth rates of specialty drug spending. According to Medco Health Solution’s 2010 Drug Trend Report, spending on specialty drugs, a group of drugs that is mostly recombinant proteins, represented 5.6 percent of overall prescription costs in 2003, but by 2009, the figure had soared to 14.2 percent.
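
The pace of that shift can be made concrete: going from 5.6 percent of prescription costs in 2003 to 14.2 percent in 2009 implies the specialty share grew at roughly 17 percent per year, consistent with the double-digit growth rates cited above. A quick check:

```python
# Implied compound annual growth of the specialty-drug share of
# prescription costs quoted above: 5.6% (2003) -> 14.2% (2009).
share_2003, share_2009 = 5.6, 14.2
years = 2009 - 2003
implied_growth = (share_2009 / share_2003) ** (1 / years) - 1
print(f"Implied annual growth of the specialty share: {implied_growth:.1%}")  # -> 16.8%
```

Note this is growth in the *share* of spending; since total prescription spending also grew over the period, specialty spending in dollar terms grew faster still.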

Biosimilars will have to undergo analytical studies to demonstrate that they are “highly similar to the reference product notwithstanding minor differences in clinically inactive components.” A biosimilar must utilize the “same mechanisms of action” and follow the same prescribing instructions and indications as the original product. In other words, there can be no clinically meaningful differences between the biosimilar and the reference product with regard to safety, purity, and potency. The FDA will determine the level of clinical studies needed for biosimilar drugs to gain approval, but some will likely be required. Cross-over studies will also be needed to support a determination of interchangeability.

There are significant protections for the makers of the original product. The law provides reference product biologic manufacturers 12 years of exclusivity for data used in the submission, starting from the date of FDA approval. That data is a necessary part of any filing for a biosimilar to gain approval under the pathway.

The marketplace for biosimilar drugs is likely to be competitive with some leading pharmaceutical makers – namely Eli Lilly, AstraZeneca, and Merck & Co. – entering the area. But biosimilars are not likely to be a significant force in the marketplace until 2014 or 2015.

The drug categories where we’re likely to see significant competition include the following:

  • Human growth hormones are likely to be among the first to face increased biosimilar competition, since they were among the first recombinant proteins to appear in the marketplace.
  • Recombinant insulins and modified recombinant insulins, such as Humulin and Novolin, are apt to be early follow-on biologics, since the reference drugs' patents have long expired. These insulins should be relatively easy to replicate, and biosimilar versions could be introduced between 2013 and 2015. However, because much insulin use has shifted to modified recombinant insulins, this may not be a large opportunity.
  • Follow-on versions of epoetin alfa, which has been sold under the brand names Epogen® and Procrit® to treat patients with kidney disease or chemotherapy-induced anemia, can have a significant effect upon specialty drug spending. However, it is not clear if cardiovascular safety data will be needed for follow-on versions of these drugs.
  • Leukine® (sargramostim), a treatment to prevent opportunistic infections, could face follow-on competition in 2014.
  • Drugs to combat neutropenia are already facing patent challenges, with Teva's Tevagrastim seeking to compete against Neupogen® (filgrastim), whose patent expires in 2013.
  • Interferon-alfa based treatments were among the first biologic drugs to reach the marketplace and have found uses treating leukemia, cancer, genital warts, hepatitis and multiple sclerosis. Many patents governing these drugs have long expired. Some of the pegylated versions of interferon drugs, such as Peg-Intron (peginterferon alfa-2b) or Pegasys (peginterferon alfa-2a), gained significant extra time on their patents because the reformulation creates a different molecule that improves the efficacy of treatment.
  • Rheumatoid arthritis treatments and other biologics for autoimmune disorders are among the fastest growing drug categories, but this group is not likely to face a biosimilar competition until 2014 or 2015.

Later this decade, many treatments that gained FDA marketing approval from 2000 onward may face greater competition, both from biosimilars and from new treatments under development. Herceptin® (trastuzumab), Avastin® (bevacizumab), and Erbitux® (cetuximab) will be vulnerable to competition in the second half of the decade, as will some of the priciest drugs for rare enzyme disorders.

This new regulatory pathway for biosimilars could be a catalyst to greater competition in the biotechnology industry, much like the introduction of generic drugs under the Hatch-Waxman Act spurred competition among traditional small molecule drugs. Many of today’s blockbuster drugs emerged as manufacturers had to replace old revenue sources with new products. Although biosimilars are not exactly like the original products, the prospect of competition could drive the biotech industry to deliver new medicines that further improve the quality of patient care.

About the author: Keith Bradbury is Executive Director of Drug Information at Medco Health Solutions, where he has been employed for the past 14 years. Bradbury has more than 30 years' experience in hospital pharmacy, managing pharmacy benefits for health plans, providing drug information services, developing drug formularies, and managing pharmaceutical benefits for a large PBM.
Bradbury also oversees Medco's new drug pipeline management process and is the lead author of the drug forecast section of the Medco Annual Drug Trend Report.

These papers are from the 2010 final projects in the NIH Foundation for Advanced Education in the Sciences TECH 366 — Biotechnology Management. The students were asked to tell a story based on the course lectures, and to expand on it with general lessons for biotechnology company management:

If You Build It, Will They Come?
Derek Francis

Profitability and Orphans: The Role of Price and Incentives in Four Different Markets
Nate Hafer

Patent Analysis: A Tool for Making Strategic Business Decisions
Eric Norman

2009 final projects are posted here.