
This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

VCs are starting to talk again about the ‘perception’ of scarce funding for early stage bio-pharma:

At the New Paradigms meeting (a satellite of the JPM Conference), a panel I was involved with discussed the perceived funding gap and whether great companies were still getting financing. The unanimous view was that innovative new startups were continuing to attract capital.

Whenever I hear VCs’ affirmations that all the good, innovative startups are funded, or that the problem is optics, my antennae go up and I want to see the data.

In other words:  Show me the money – and exactly where it is being invested.

Fortunately we live in a world where VC investment is measured and analyzed obsessively – and not just by bio-pharma startups.

Rather than rely on a straw poll of self-selected VCs, we can consult the 2012 data from the National Venture Capital Association’s MoneyTree report, which shows an overall 10% fall in VC investment in 2012 as compared to 2011, and double-digit year-on-year declines for biotech (down 15%) and medical devices (down 13%).

Sadly, the MoneyTree report confirms that it is not the best of all possible worlds for early stage life sciences and medical device companies.

So why the evident disconnect between the (anecdotal) views of funders and actual macroeconomic trends?

As the saying goes, where you stand depends on where you sit. As an independent life sciences consultant and – full disclosure – CEO / Managing Director of Amrita Therapeutics Ltd., an early-stage bio-discovery company seeking funding, my experiences are likely very different from those of VCs in the JP Morgan bubble.

From their perspective, life sciences may be trending positively, particularly given the compression of early-stage valuations, where companies are saddled with much lower valuations than fifteen, ten or even five years ago on the same metrics. This gives VCs and Pharma Venture Funds the opportunity to gain far greater leverage over small companies at lower capital commitments, essentially transferring value from founders to funders.
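To see why lower entry valuations translate into greater leverage for funders, here is a minimal back-of-envelope sketch in Python of how investor ownership shifts with the pre-money valuation. The dollar figures are purely illustrative assumptions, not numbers from the MoneyTree report or any particular deal.

```python
# Illustrative only: how a lower pre-money valuation shifts ownership
# from founders to investors for the same capital commitment.
# All figures are hypothetical assumptions for the sake of the example.

def ownership_after_round(pre_money: float, investment: float) -> float:
    """Investor ownership fraction after a single priced round."""
    post_money = pre_money + investment
    return investment / post_money

investment = 5_000_000  # same $5M check in both scenarios (assumed)

for label, pre_money in [("valuation ca. 5-15 years ago", 20_000_000),
                         ("compressed valuation today", 8_000_000)]:
    stake = ownership_after_round(pre_money, investment)
    print(f"{label}: pre-money ${pre_money/1e6:.0f}M -> investor owns {stake:.0%}")

# The same $5M buys ~20% at a $20M pre-money but ~38% at an $8M pre-money,
# i.e. value shifting from founders to funders.
```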

The really good news for funders is that at these lowered valuations, early stage biotech is a great investment opportunity, particularly given mounting evidence that the life sciences are a much better bet for IPOs than IT and Social Media.

For 2013, let’s hope that VCs can take a break from their Facebook accounts and make their own perceptions of greater early stage funding a reality for more bio-pharma companies!

About the author:
President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-up to Fortune-100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv, and Manila and at the Department of State in Washington, DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

When I started working for PhRMA nearly 15 years ago, the mantra was “Fail early, Fail cheap.”

Given the exponentially increased cost of advancing compounds from pre-clinical into clinical research and through pivotal Phase II and larger Phase III trials, it makes sense for companies to investigate as many compounds as possible through early-stage pre-clinical research and then cherry-pick compounds for clinical trials based on a well-developed understanding of each compound’s structure, toxicity and other key characteristics.

An R&D program that fails at the pre-clinical stage is far less costly than one that makes it through the Investigational New Drug (IND) application and into clinical trials, only to tank due to lack of efficacy or safety. So why are bio-pharma companies taking the opposite tack – investing huge sums in late-stage R&D programs, only to see faltering results in late-stage clinical trials?

Why are companies no longer ‘failing early’?

The same factors driving bio-pharma M&A strategy motivate companies to acquire late-stage research assets to fill depleted pipelines. And cash-rich bio-pharma companies are competing for a limited pool of late-stage programs, bidding up the cost of acquisition (and perhaps at times also hindering full due diligence).

In theory, these assets are lower-risk than early stage programs because they have reached the clinical trial stage. In practice, this has resulted in 30% failure rates at the Phase III clinical trial stage, with a further 50% attrition rate between the clinic and the marketplace, where  “peak sales projection is more art than science, and the art often looks rather comical in retrospect.” In sum, only about one third of launched drugs make back their R&D costs.

Good in theory, bad in practice …

It may be time to recognize that, in net-present-value terms, later-stage compounds are not lower-risk than pre-clinical programs once Phase III trial costs, the likelihood of failure at Phase III (or before launch), and more realistic revenue projections are factored into the valuation of late-stage assets.
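A rough expected-value sketch makes the point concrete. The attrition rates below are the figures cited earlier in this post (30% Phase III failure, a further 50% attrition between the clinic and the marketplace, roughly one in three launched drugs recouping R&D costs); the cost and revenue numbers are purely hypothetical assumptions, not data from any actual deal.

```python
# A minimal sketch of risk-adjusting a late-stage asset's value.
# Attrition rates follow the figures cited in this post;
# the cost and revenue numbers are hypothetical assumptions.

p_phase3_success = 0.70        # 30% of Phase III programs fail
p_launch_given_p3 = 0.50       # a further 50% attrition between clinic and market
p_recoup_given_launch = 1 / 3  # ~1 in 3 launched drugs recoups its R&D costs

p_reaches_market = p_phase3_success * p_launch_given_p3
p_profitable = p_reaches_market * p_recoup_given_launch

print(f"Chance a Phase III asset reaches the market:  {p_reaches_market:.0%}")
print(f"Chance it reaches the market AND recoups R&D: {p_profitable:.0%}")

# Hypothetical deal economics, in $M (illustrative assumptions only):
acquisition_and_trial_costs = 400   # upfront price plus Phase III spend (assumed)
pv_revenue_if_successful = 900      # present value of revenues if launched (assumed)

expected_value = p_reaches_market * pv_revenue_if_successful - acquisition_and_trial_costs
print(f"Risk-adjusted expected value (in $M): {expected_value:.0f}")
```

Under these deliberately crude assumptions the ‘de-risked’ Phase III asset carries a negative expected value, which illustrates the post’s point: the scientific risk has been partly retired, but the price has not been adjusted for what remains.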

Given the foregoing, taking a case-by-case approach to acquisition of R&D programs at earlier stages of development would reduce overall risk, providing better long-run returns.

Failure is always going to be with us. With the ever-increasing complexity and cost of human clinical trials, bio-pharma would be better off taking the long view and at least failing earlier in the process at a fraction of the cost!

About the author:

President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-up to Fortune-100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv, and Manila and at the Department of State in Washington, DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

It looks like the ‘new normal’ is not just for biotechnology startups and SME pharma companies – the CROs (and other vendors) who previously benefited from better R&D funding streams also feel the chill of the ongoing funding freeze.

Discussions with CROs and research consultants in the US and abroad reveal concerns relating to lengthy delays between initial discussion of research and final approval, reduced budgets for research programs, and generally falling demand for CRO services. Their concerns are not just anecdotal. A recent National Academy of Sciences (NAS) study sees the U.S. Clinical Trial Enterprise in particular as being in decline:

There is ample evidence that U.S. trials are becoming more expensive (DeVol et al., 2011). Worse, 90 percent fail to meet enrollment goals, and additional evidence points to disillusionment among American investigators (Getz, 2005). The rate of attrition among U.S. investigators is increasing, even among experienced researchers with strong track records of productivity, while 45 percent of first-time investigators abandon the field after their first trial. The system has become so inefficient that even the NIH is offshoring clinical trials at a substantial rate (Califf, 2011; Kim et al., 2011), using taxpayer funding to conduct trials in countries with less expensive and more efficient CTEs, despite concerns about generalizability as noted above.

The EU has seen a 25% decline in registration of new clinical trials and has begun a legislative process to improve the research environment in Europe. More broadly, an interesting Canadian clinical trial survey (available here) shows a decrease in trials and related sites globally between 2008 (26,241 sites and 990 trials) and 2010 (22,358 sites and 760 trials). While the authors find increases in clinical trial activity in developing countries in Asia, they note the overall global trend of reduced clinical trial starts.
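For readers who want the relative magnitudes, a quick calculation on the survey figures quoted above works out to roughly a 15% drop in sites and a 23% drop in trials over those two years:

```python
# Percentage declines implied by the figures quoted from the Canadian survey.
sites_2008, sites_2010 = 26_241, 22_358
trials_2008, trials_2010 = 990, 760

site_decline = (sites_2008 - sites_2010) / sites_2008
trial_decline = (trials_2008 - trials_2010) / trials_2008

print(f"Decline in sites, 2008-2010:  {site_decline:.1%}")   # ~14.8%
print(f"Decline in trials, 2008-2010: {trial_decline:.1%}")  # ~23.2%
```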

So the fundamental realignment of early-stage biotech valuation that makes funding more challenging for start-ups and SMEs has also had unintended consequences for the Clinical Research Organizations (CROs) providing pre-clinical and clinical research services. And research budgets are falling across the board as larger companies and even public research institutions face cost-containment pressures.

Given the critical importance of the clinical research enterprise for the generation of social and economic good, it will be interesting to see how policy makers respond and/or whether the market will rebound if economic growth picks up.

About the author:

President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-up to Fortune-100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service, with overseas tours in London, Tel Aviv, and Manila and at the Department of State in Washington, DC. For more information on her latest presentations and publications please visit finstonconsulting.com.

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

It has now been more than a year since the FDA published, in early 2011, a welcome revision of the 1987 guideline "Process Validation: General Principles and Practices." The new guideline is a complete rewrite of its previous incarnation, eliminating outdated practices, putting more emphasis on scientific evidence and completely redefining "process validation." The update is part of a wave of actions by the FDA to reinvent the way it tackles an aspect of the life sciences that was initially poorly understood.

The guideline was initially developed in response to concerns raised in the early 1970s that end-product testing alone was insufficient to verify process standards. While it led to a more direct and formal protocol for process validation, the broad language used throughout left many companies to follow their own interpretations. For example, the 1987 guideline stated that process efficacy should be validated by successfully producing a certain number of batches in a row, but it never actually specified an appropriate number of runs (eventually the idea of “Three Golden Batches” emerged among companies). As expresspharma also pointed out, the guideline put pressure on manufacturers to maintain the process that created these “Golden Batches” without needing to understand, or even control, the parameters that produced them.

In other words, as long as a process was proven to work enough times, being disciplined in following the procedure took precedence over understanding it.

What followed were years of criticism of the FDA’s attitude to Good Manufacturing Practice (GMP) guidelines. As discussed at globepharm, the FDA routinely failed to update its GMP standards, claiming that such standards represented only a minimum and thereby shifting the responsibility to companies to keep their own standards current. Though revisions were often promised, they never seemed to come to fruition.

This all changed as part of the “Pharmaceutical cGMPs for the 21st Century” initiative launched in 2002, when the FDA began its review of the guideline to better the approach of quality control for the pharmaceutical industry.

The most obvious change in the 2011 edition was the new definition of process validation, which shifted from requiring only the end product as proof of a good process to consistently checking the process design and performance throughout the product’s life cycle to see whether standards are met. This meant that the Three Golden Batches concept became obsolete and a new measure of process evaluation was needed. To this end the FDA also explicitly emphasised the need for a three-stage process evaluation: a process is first designed based on prior process knowledge, then proven to generate reproducible results, and finally routinely verified to ensure the whole process remains in control.

This more holistic, life-cycle-based approach is seen as a massive improvement on the 1987 guideline, particularly since the wording now clearly states what is expected of the industry while still being applicable to individual companies. Several new questions have surfaced, however. By abandoning the Three Golden Batches concept, companies must reassess how many batches must be run to prove that standards are met. This, as Dr Mike Long et al discuss, is one area where the new guideline remains obscure; it merely states that enough runs should be completed to statistically justify going into commercial production. Another problem is that the guideline clashes at some points with EU regulations. Apart from differences in the wording of some principles, Annex 15 (article 25) of the EU Guide to GMP also seems to recommend the Three Golden Batches method, directly contradicting the FDA’s efforts to quell the practice.
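The guideline does not say how many qualification batches are “enough.” One generic statistical rule of thumb sometimes used in validation planning is the success-run theorem, which gives the number of consecutive successful runs needed to support a reliability claim at a given confidence level. It is offered here purely as an illustration of what a statistical justification can look like; it is not prescribed by the FDA guideline or by Dr Long’s analysis.

```python
# Success-run theorem: with n consecutive successes and zero failures,
# reliability >= R can be claimed at confidence level C when
#   n >= ln(1 - C) / ln(R)
# A generic statistical rule of thumb, not an FDA requirement.
import math

def runs_required(reliability: float, confidence: float) -> int:
    """Smallest number of consecutive successes supporting the reliability claim."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

for reliability in (0.90, 0.95, 0.99):
    n = runs_required(reliability, confidence=0.95)
    print(f"To claim {reliability:.0%} reliability at 95% confidence: {n} successful runs")

# Output: 29, 59 and 299 runs respectively -- far more than three golden
# batches, which is one way to read the guideline's demand for statistical
# justification rather than a fixed magic number.
```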

While changing the regulations, the FDA is also being proactive in its approach to enforcing them. For example, a recent event held in Colorado allowed industry representatives to discuss with FDA representatives from that district some of the challenges encountered during inspections. In recognition of the increasingly multinational nature of product manufacturing, the FDA is also collaborating with the European Regulatory Network to monitor foreign companies in their own territories. This follows the FDA’s successful application to become a member of the Pharmaceutical Inspection Co-operation Scheme (PIC/S) in November 2010, a lengthy review process in which each division of the FDA applied individually to PIC/S to gain admittance, furthering the agency’s modernisation and easing the pressure of establishing GMP guidelines.

There can be no question that the FDA has recently taken a turn for the better in trying to modernise its approach to bioprocess regulation, an area it had initially failed to develop. Despite these improvements, criticism remains. It appears, too, that the FDA’s recent efforts are just the beginning of an international trend in reviewing bioprocess standards, with a major update to the European Commission’s GMPs for bioprocesses coming into effect early next year.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms. Fintan may be contacted at fintan.burke2@mail.dcu.ie.

For Third Rock Ventures' Mark Levin, personalized medicine is nothing new. Instead, he sees the history of pharma as a gradual homing in on the roots of disease, and genomic mapping is just the next step. By Damien Garde, Fierce Biotech. From the herbal remedies of early civilizations to the dawn of modern pharmaceuticals, researchers have been slowly drilling down, Levin said, personalizing treatments one step at a time. Now, with the help of sequencing, we can move beyond treating phenotypic effects and get to the heart of disease: its genotypic origins. That is, if we can all get along. Levin spoke to a packed house Tuesday at the Personalized Medicine Coalition's State of Personalized Medicine Luncheon in Washington, DC, and his message was simple: Personalized medicine has the potential to revolutionize biotech and pharma, but only if stakeholders work together. Levin's had a long career in the industry, going from an engineer and project leader at Eli Lilly ($LLY) and Genentech to CEO of Millennium Pharmaceuticals. Since 2007, he's been at the helm of Third Rock Ventures, a VC firm that invests in and builds innovative biotech companies. Through 36 years in the field, Levin said he's seen the great promise of genomics inspire minds around the industry, only to be slowed by the usual suspects: companies unwilling to collaborate, regulators reluctant to cooperate and researchers getting a little ahead of themselves in the news media. Now, however, as the cost of ...

Part 2 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

Part 1 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

As researchers continue to investigate the complex nature of cell tissues and their behaviour, it is becoming increasingly apparent that conventional tissue culture methods such as Petri dishes and well plates are only capable of giving a fraction of the picture.

Over the last few years, there has been increased interest in novel approaches that allow cell cultures to grow in a 3D medium. Indeed, 3D culture boasts many benefits over conventional 2D media. In a 2007 Nature article, Pampaloni et al argue that 3D culture has the potential to represent a true in vivo cellular environment without the need for ethically questionable animal testing. This type of culture can also give better insight into cell architecture, signalling and mechanics, which has already been recognised in cancer research; a 2008 study by Fischbach et al showed that tumor cells grown in 3D culture “recreated tumor microenvironmental cues” and showed increased tumor vascularisation compared to 2D cell cultures.

Demand for 3D culture is expected to grow as researchers search for new approaches to cellular research while lessening the need for animal testing.  From this demand, several approaches have been taken to develop 3D culture methods:

One method involves offsetting the natural sedimentation of cells in an aqueous medium by gently rotating the culture vessel in an apparatus called a rotating wall vessel bioreactor. Cells are typically attached to microcarrier bead “scaffolds” to allow for 3D chemical responses in the bioreactor. Originally developed by NASA to examine microbial growth in zero gravity, this culture method boasts the advantage of replicating the natural low-shear environment found in the body, which has been found to influence a pathogen’s infection potential.
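To make the “offsetting sedimentation” idea concrete, here is a rough back-of-envelope sketch, assuming Stokes’ law applies, of how slowly a microcarrier bead settles in culture medium and how small its orbit becomes when the vessel rotates with the fluid. The bead size, densities and rotation speed are illustrative assumptions, not values from any particular bioreactor.

```python
# Back-of-envelope sketch of why slow rotation keeps microcarriers suspended.
# Bead size, densities and rotation speed are illustrative assumptions.
import math

# Stokes settling velocity: v = 2/9 * (rho_p - rho_f) * g * r^2 / mu
rho_bead = 1040.0      # kg/m^3, cell-laden microcarrier (assumed)
rho_medium = 1000.0    # kg/m^3, culture medium (assumed)
mu = 1.0e-3            # Pa*s, roughly water-like viscosity (assumed)
radius = 75e-6         # m, ~150 um diameter bead (assumed)
g = 9.81               # m/s^2

v_settle = (2 / 9) * (rho_bead - rho_medium) * g * radius**2 / mu
print(f"Settling velocity: {v_settle * 1000:.2f} mm/s")

# In a vessel rotating as a solid body, the bead traces a small circle of
# radius ~ v_settle / omega instead of drifting to the wall.
rpm = 20.0                         # gentle rotation speed (assumed)
omega = 2 * math.pi * rpm / 60     # rad/s
orbit_radius = v_settle / omega
print(f"Orbit radius at {rpm:.0f} rpm: {orbit_radius * 1000:.2f} mm")
```

With these assumed numbers the bead settles at roughly half a millimetre per second, and at about 20 rpm its orbit shrinks to a fraction of a millimetre, which is the sense in which the cells remain in continuous, low-shear free fall.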

Another system employs magnetism to develop 3D tissue cultures. This method, termed magnetic cell levitation, uses loaded bacteriophages to “infect” the cells destined for culture with small amounts of iron oxide and gold. These cells are then left in a Petri dish to grow while a ring placed on top of the dish subjects them to magnetic forces, causing them to hover in suspension. In a 2010 issue of Nature Nanotechnology, Souza et al argue that this method has the potential to “be more feasible for long-term multicellular studies”, and they also note its ease of control and cost-effectiveness in research.

Recently, attention has been paid to developing 3D culture methods that work without an external influence. Microtissues Inc. has developed a form of tissue culture that eliminates the need for scaffolds in the culture. The result, claims CEO Jeffrey Morgan, is that uniform cells are prepared more efficiently and with more consistent results than when scaffolds are used. Another company, Microtissues.com, also claims its 3D Petri dish maximises cell-cell interactions and allows controllable cell size.

These examples represent only a fraction of the new methods being constantly developed for 3D culturing of cells. As recently as last month, TAP Biosystems unveiled its newest collagen-based 3D cell culturing method for 96-well plates. This recent boom in development is undoubtedly due to the realisation that research using now-conventional 2D culture is nearing its limits. Though 3D culture has the potential to become the fundamental choice for research into cancer and drug therapy, some issues remain. Modern microscopic imaging may struggle with the denser tissue samples. A common standard also needs to emerge in order to establish a unified protocol in research. Should these concerns be addressed, there can be little doubt that 3D cell culture will emerge as a cheap, informative and dominant research method for years to come.

This is a guest post from the BiotechBlog Intern,  Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

One of the most overlooked but consistent problems facing many governments is waste management. Despite healthy recycling attitudes in both the US and UK, an EPA report showed US total waste production in 2010 was still around 250 million tons, while there are concerns that the UK will run out of landfill sites by 2018.

For many years, the only viable alternative to landfills was incineration. Despite its efficiency relative to landfill (incineration can reduce waste mass by around 90%), concerns over its low energy-generation efficiency (estimated at 20-25%), as well as public protest over its environmental impact, mean incineration can never be a permanent solution.

As public and private sectors are beginning to shift their attention to cleaner, more efficient alternatives to waste disposal, one of the leading candidates is gasification.

Gasification has been with us in various forms since the 1840s. The process involves extracting combustible gases by subjecting dehydrated carbonaceous materials to intense temperatures and reacting the resulting ‘char’ with oxygen and/or steam. Originally coal and wood were used as inputs, so the process differed little from incineration. Since the 1970s, however, focus has shifted from these conventional inputs to biomass.

From this change in focus, several companies have been set up to offer biomass gasification as an effective renewable energy technology. One such company, Thermoselect, claims that for every 100kg of waste processed, 890kg of “pure synthesis gas” is created for energy generation. Another company, ZeroPoint Clean Tech Inc., is keen to demonstrate gasification’s use in generating renewable gas, heat, water and electricity.

This development has been embraced by both the US and UK governments, which welcome the opportunity to reduce their carbon footprints as well as their municipal waste. In April 2011, the US Air Force Special Operations Command invested in a new plasma-based transportable gasification system, with the aim of reducing its waste output by 4,200 tons a year across air bases around the country. Later that year, Britain approved the first advanced gasification plant in the country, with the potential to generate 49 megawatts of renewable energy (enough to power around 8,000-16,000 US households). Some have even speculated that this new technology could be used to spark a boom in hydrogen-cell-powered vehicles in the future.

Not everyone has embraced the new technique, however. The proposal for a biomass gasification plant in DeKalb County, Georgia was met with protests from locals fearing carcinogenic emissions. Furthermore, a 2009 report by the Blue Ridge Environmental Defense League warned that gasification shares many similarities with incineration, including the formation of pollutants and greenhouse gases.

Despite these arguments, the gasification of biomass has several benefits. The high temperatures involved make it an ideal means of processing bio-hazardous waste from hospitals, and the plants themselves occupy very little physical space. As with any emerging technology, however, uptake has been cautious and slow. Many of the new plants are still in trial stages and it is uncertain whether gasification will have any long-term environmental effects. Should the existing plants prove successful, there is no reason to doubt that gasification will become a realistic option for environmentally sound energy generation.

 

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms. Fintan may be contacted at fintan.burke2@mail.dcu.ie.

This is a guest post from Erin M. Hall. Erin is the Technical Leader at Genetica DNA Laboratories, Inc. located in Cincinnati, OH. Do you have a response to Erin’s post? Respond in the comments section below.

It is estimated that 18-36% of all actively growing cell line cultures are misidentified and/or cross-contaminated with another cell line (1).  For researchers in any field of biomedical science, this could mean that a significant amount of the experimental data published in current and past journals is of questionable value.  Every year, millions of dollars of public money are spent on potentially wasted research and this is happening not just here in the United States but around the world as well.

Cell line misidentification and cross-contamination have been around for more than 50 years. The problem was finally brought to light in 1966 by Stanley Gartler, who reported that 19 supposedly independent human cell lines were in fact HeLa cells (2), which are known to be extremely robust, fast growing and able to contaminate other cultures by aerosol droplets. There was much resistance to his findings, and scientists didn’t want to admit that research done using those contaminated cell lines might be questionable and potentially irreproducible. Walter Nelson-Rees was one scientist who supported Gartler’s findings. Nelson-Rees highlighted the papers and the scientists who were publishing experimental data using misidentified cell lines, and for this, in 1981, he lost his contract with the National Institutes of Health (NIH) because his behavior was deemed “unscientific” (3). From 1981 on, misidentification went unchecked and even cell line repositories continued to distribute lines under their false names (3).

To exacerbate the problem, certain cell culture practices may be aiding cell misidentification and cross-contamination, including the practice of relying on phenotypic characteristics, such as protein expression, as the only way to identify the cell population. It has been proven that phenotypic expression can change with increased passage number or even with changes in growth medium or other cell culture conditions (4). The modern way of assessing the correct identity of a cell line (“cell line authentication”) is to perform short tandem repeat (STR) DNA testing. The STR DNA profile of a human cell line is similar to a person’s fingerprint; it is unique to that individual. STR testing is now the “gold standard” of human identification testing and is routinely used by the FBI in profiling convicted offenders (CODIS). STR profiling is a straightforward and effective way to confirm that the cell line you think you have been using for the past five years is, in fact, the genuine cell line.
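As a concrete illustration of what an STR comparison involves, here is a minimal sketch of one commonly described scoring approach, which counts alleles shared across a set of STR loci. The example profiles, the handful of loci shown, and the roughly 80% match threshold are illustrative assumptions for the sake of the sketch, not a statement of any laboratory’s validated procedure.

```python
# Minimal sketch of STR profile comparison for cell line authentication.
# Scoring follows a commonly described shared-allele formula:
#   percent match = 2 * shared alleles / (alleles in query + alleles in reference)
# Profiles, loci and the ~80% threshold below are illustrative assumptions.

query = {            # hypothetical profile of the culture in your incubator
    "TH01":    {"6", "9.3"},
    "D5S818":  {"11", "12"},
    "D13S317": {"11"},
    "vWA":     {"16", "18"},
}
reference = {        # hypothetical repository profile for the named line
    "TH01":    {"6", "9.3"},
    "D5S818":  {"11", "12"},
    "D13S317": {"11", "13"},
    "vWA":     {"16", "18"},
}

shared = sum(len(query[locus] & reference[locus]) for locus in query)
total = sum(len(query[locus]) + len(reference[locus]) for locus in query)
percent_match = 2 * shared / total * 100

print(f"Shared alleles: {shared}, percent match: {percent_match:.0f}%")
if percent_match >= 80:   # threshold often cited for a consistent/related match
    print("Profiles are consistent with the same cell line.")
else:
    print("Profiles do not match -- investigate possible misidentification.")
```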

The reason the problem continues today is that it has not been properly brought to the attention of researchers. Many researchers learn about the service the hard way, i.e. at the last minute, when a journal requests confirmation of authentication before considering their article for publication. In a survey that profiled 483 researchers who actively use cell cultures, only 33% authenticated their cell lines and 35% obtained their lines from other laboratories rather than from a cell line repository such as the American Type Culture Collection (ATCC) (3). We, as researchers, expect to use only the best reagents and supplies, but the one aspect of the experiment that may be the most important, i.e. the cell line itself, is consistently overlooked. ATCC recommends verifying the identity of all cell lines before you start your experiments, every two months during active growth, and just prior to publication.

The NIH now officially recognizes that cell line misidentification is a serious problem in the scientific community. It states in a formal notice issued on its website (NOT-OD-08-017) that grant applications that fail to employ acceptable experimental practices would not be looked upon favorably and would potentially not fare well in the review process. The NIH encourages all peer reviewers and researchers to consider this problem carefully “in order to protect and promote the validity of the science [they] support”. Many journals, such as those published by the American Association for Cancer Research (AACR), require a statement in the “Materials and Methods” section as to whether the cells used in the submitted manuscript were authenticated. Not properly authenticating the lines may prevent an article from being published after peer review. To continue the advance toward eliminating cell line misidentification and cross-contamination, ATCC, in early 2012, released a set of guidelines written by the international Standard Development Organization (SDO) workgroup; these guidelines provide researchers with information on the use of STR DNA profiling for the purpose of cell line authentication. In the near future, with the help of all of these influential supporters, cell line authentication will become a routine quality control check in every laboratory in the United States and around the world.

I would love to hear other thoughts and comments on this topic.  Tell us about your experiences with cell line authentication – good or bad!

(1) Editorial. Nature 457, 935–936 (2009).
(2) Gartler, S.M. Second Decennial Review Conference on Cell Tissue and Organ Culture, 167–195 (1967).
(3) ATCC SDO Workgroup. Cell line misidentification: the beginning of the end. 441–448 (2010).
(4) Kerrigan, L. Authentication of human cell-based products: the role of a new consensus standard. 255–260 (2011).

About the author:

Erin is the Technical Leader at Genetica DNA Laboratories, Inc., located in Cincinnati, OH. She is responsible for the technical operations of the laboratory, as well as all aspects of the daily casework involving DNA identity sample processing and quality assurance. She received her Master’s degree in Forensic Science from Pace University in NYC and her Bachelor’s degree in Molecular Biology from the College of Mount Saint Joseph in Cincinnati, OH. For more information on Genetica, visit www.genetica.com or their website dedicated to cell line authentication, www.celllineauthentication.com.

Prior to joining Genetica, Erin worked in New York City as a laboratory manager and researcher in the Pharmacology department at Cornell University’s Medical School. She designed and executed complex experiments that examined the effects of environmental toxins on liver enzyme production, utilizing HPLC, UV/Vis spectroscopy, Western blotting and PCR analysis. Her work contributed to several published journal papers (under Erin Labitzke, if you want to read them!), most recently as first author on a paper related to enzymes present in mitochondria.

Erin may be contacted at erinhall@genetica.com.