
This is a guest post from Susan K Finston, President of Finston Consulting. Do you have a response to Susan’s post? Respond in the comments section below.

It looks like the ‘new normal’ is not just for biotechnology startups and SME pharma companies – the CROs (and other vendors) who previously benefited from better R&D funding streams also feel the chill of the ongoing funding freeze.

Discussions with CROs and research consultants in the US and abroad reveal concerns about lengthy delays between initial discussion of research and final approval, reduced budgets for research programs, and generally falling demand for CRO services. Their concerns are not just anecdotal. A recent National Academy of Sciences (NAS) study sees the U.S. Clinical Trial Enterprise (CTE) in particular as being in decline:

There is ample evidence that U.S. trials are becoming more expensive (DeVol et al., 2011). Worse, 90 percent fail to meet enrollment goals, and additional evidence points to disillusionment among American investigators (Getz, 2005). The rate of attrition among U.S. investigators is increasing, even among experienced researchers with strong track records of productivity, while 45 percent of first-time investigators abandon the field after their first trial. The system has become so inefficient that even the NIH is offshoring clinical trials at a substantial rate (Califf, 2011; Kim et al., 2011), using taxpayer funding to conduct trials in countries with less expensive and more efficient CTEs, despite concerns about generalizability as noted above.

The EU has seen a 25% decline in registration of new clinical trials and has begun a legislative process to improve the research environment in Europe. More broadly, an interesting Canadian clinical trial survey, available here, shows a global decrease in trials and related sites, from 26,241 sites and 990 trials in 2008 to 22,358 sites and 760 trials in 2010. While the survey finds increases in clinical trial activity in developing countries in Asia, it notes the overall global trend of reduced clinical trial starts.

So the fundamental realignment of early-stage biotech valuation that makes life more challenging for start-ups and SMEs has also had unintended consequences for the Clinical Research Organizations (CROs) providing pre-clinical and clinical research services. Research budgets are falling across the board as larger companies and even public research institutions face cost-containment pressures.

Given the critical importance of the clinical research enterprise for generating social and economic good, it will be interesting to see how policy makers respond, and whether the market will rebound if economic growth picks up.

About the author:

President of Finston Consulting LLC since 2005, Susan works with innovative biotechnology and other clients ranging from start-up to Fortune-100, providing support for legal, transactional, policy and “doing business” issues. Susan has extensive background and special expertise relating to intellectual property and knowledge-economy issues in advanced developing countries including India and South Asia, Latin America and the Middle East North Africa (MENA) region. She also works with governments and NGOs on capacity building and related educational programs through BayhDole25. Together with biotechnology pioneer Ananda Chakrabarty, she is also co-founder of Amrita Therapeutics Ltd., an emerging biopharmaceutical company based in India with cancer peptide drugs entering in vivo research. Previous experience includes 11 years in the U.S. Foreign Service with overseas tours in London, Tel Aviv, and Manila and at the Department of State in Washington DC. For more information on latest presentations and publications please visit finstonconsulting.com.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

It has now been more than a year since the FDA published, in early 2011, a welcome revision of the 1987 guideline “Process Validation: General Principles and Practices.” The guideline is a complete rewrite of its previous incarnation, eliminating outdated practices, putting more emphasis on scientific evidence and completely redefining “process validation” itself. The update is part of a wave of FDA actions to reinvent the way the agency tackles an aspect of the life sciences that was initially poorly understood.

The original guideline was developed in response to concerns raised in the early 1970s that end-product testing alone was insufficient to verify process standards. While it established a more direct and formal protocol for process validation, the broad language used throughout left many companies to follow their own interpretation. For example, the 1987 guideline states that process efficacy should be validated by successfully producing a certain number of batches in a row, though it never actually specifies an appropriate number of trials (eventually the idea of “Three Golden Batches” emerged among companies). As Express Pharma also pointed out, the guideline put pressure on manufacturers to maintain the process that created these “Golden Batches” without needing to understand, or even control, the parameters that caused them.

In other words, as long as a process was proven to work enough times, discipline in following the procedure took precedence over understanding it.

What followed were years of criticism of the FDA’s attitude to Good Manufacturing Practice (GMP) guidelines. As discussed at globepharm, the FDA routinely failed to update its GMP standards, claiming such standards were only a minimum and thereby shifting responsibility to companies to keep their own standards current. Though revisions were often promised, they never seemed to come to fruition.

This all changed with the “Pharmaceutical cGMPs for the 21st Century” initiative launched in 2002, when the FDA began reviewing the guideline to improve its approach to quality control for the pharmaceutical industry.

The most obvious change in the 2011 edition is the new definition of process validation, which shifted from requiring only the end product as proof of good process to consistently checking the process design throughout the product’s life cycle to see whether standards are met. This made the Three Golden Batches concept obsolete, and a new measure of process evaluation was needed. To this end the FDA also explicitly emphasised a three-stage process evaluation: a process is first designed based on previous process knowledge, then proven to generate reproducible results, and finally routinely verified to ensure the whole process remains in control.

This more holistic, life-cycle-based approach is seen as a massive improvement over the 1987 guideline, particularly since the wording now clearly states what is expected of the industry while still being adaptable to individual companies. Several new questions have surfaced, however. With the Three Golden Batches concept abandoned, companies must reassess how many batches must be run to prove standards are met. This, as Dr Mike Long et al discuss, is one area where the new guideline remains obscure; it merely states that enough tests should be run to statistically justify going into commercial production. Another problem is that the guideline clashes at some points with EU regulations. Apart from differences in the wording of some principles, Annex 15 (article 25) of the EU Guide to GMP also seems to recommend the Three Golden Batches method, directly contrasting with the FDA’s efforts to quell its practice.
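To see why “enough tests to statistically justify” is a harder bar than three batches, consider the standard zero-failure (“success-run”) calculation. This is only an illustration of one common statistical approach, not a method the FDA prescribes: if n consecutive batches all succeed, a batch success rate of at least p can be claimed with confidence C provided p^n ≤ 1 − C.

```python
import math

def min_consecutive_batches(reliability: float, confidence: float) -> int:
    """Smallest n such that n consecutive successful batches demonstrate
    a batch success rate of at least `reliability` at the given confidence
    level (zero-failure "success-run" binomial argument: p**n <= 1 - C)."""
    alpha = 1.0 - confidence  # allowed risk of the claim being wrong
    return math.ceil(math.log(alpha) / math.log(reliability))

# Demonstrating a 90% batch success rate with 95% confidence takes far
# more than three consecutive batches:
print(min_consecutive_batches(0.90, 0.95))  # 29
print(min_consecutive_batches(0.50, 0.95))  # 5
```

On this reasoning, three "Golden Batches" demonstrate, at 95% confidence, only that the process succeeds more than about 37% of the time (0.05^(1/3) ≈ 0.37), which helps explain the new guideline's insistence on a statistical rather than a fixed-count justification.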

Beyond changing the regulation itself, the FDA is also being proactive in how it enforces it. For example, a recent event held in Colorado allowed industry representatives to discuss with FDA representatives from that district some of the challenges encountered during inspections. In recognition of the increasingly multinational nature of product manufacture, the FDA is also collaborating with the European Regulatory Network to monitor foreign companies in their own territories. This follows the FDA’s successful application to join the Pharmaceutical Inspection Co-operation Scheme (PIC/S) in November 2010, after a lengthy review in which each division of the FDA applied individually to PIC/S for admittance. Membership furthers the agency’s modernisation and eases the pressure of establishing GMP guidelines.

There can be no question that the FDA has recently taken a turn for the better in trying to modernise its bioprocess regulation, an area it had initially failed to develop. Despite these improvements, criticism remains. It appears, too, that the FDA’s recent efforts are just the beginning of an international trend in reviewing bioprocess standards, with a major update to the European Commission’s GMPs for bioprocesses coming into effect early next year.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms.  Fintan may be contacted at fintan.burke2@mail.dcu.ie .

For Third Rock Ventures’ Mark Levin, personalized medicine is nothing new. Instead, he sees the history of pharma as a gradual homing in on the roots of disease, and genomic mapping is just the next step. (By Damien Garde, Fierce Biotech.)

From the herbal remedies of early civilizations to the dawn of modern pharmaceuticals, researchers have been slowly drilling down, Levin said, personalizing treatments one step at a time. Now, with the help of sequencing, we can move beyond treating phenotypic effects and get to the heart of disease: its genotypic origins. That is, if we can all get along.

Levin spoke to a packed house Tuesday at the Personalized Medicine Coalition’s State of Personalized Medicine Luncheon in Washington, DC, and his message was simple: personalized medicine has the potential to revolutionize biotech and pharma, but only if stakeholders work together. Levin has had a long career in the industry, going from engineer and project leader at Eli Lilly ($LLY) and Genentech to CEO of Millennium Pharmaceuticals. Since 2007, he has been at the helm of Third Rock Ventures, a VC firm that invests in and builds innovative biotech companies. Through 36 years in the field, Levin said he has seen the great promise of genomics inspire minds around the industry, only to be slowed by the usual suspects: companies unwilling to collaborate, regulators reluctant to cooperate and researchers getting a little ahead of themselves in the news media. Now, however, as the cost of …
(Video: 07:32)

Part 2 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.
(Video: 09:27)

Part 1 of 2. Social media is becoming an important gateway to patients and consumers for biotechnology companies wishing to commercialize without the major investment required for a sales force. Greg Stanley, Chief Commercialization Officer of Oncimmune, talks to Journal of Commercial Biotechnology reporter Rolf Taylor about engaging with smokers and ex-smokers on Facebook.
(Video: 12:32)

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

As researchers continue to investigate the complex nature of cell tissues and their behaviour, it is becoming increasingly apparent that conventional tissue culture methods such as Petri dishes and well plates are only capable of giving a fraction of the picture.

Over the last few years, there has been increased interest in novel approaches that allow cell cultures to grow in a 3D medium. Indeed, 3D culture boasts many benefits over conventional 2D media. In a 2007 Nature article, Pampaloni et al argue that 3D culture has the potential to represent a true in vivo cellular environment without the need for ethically questionable animal testing. This type of culture can also give better insight into cell architecture, signalling and mechanics, something already recognised in cancer research; a 2008 study by Fischbach et al showed that tumor cells grown in 3D culture “recreated tumor microenvironmental cues” and increased tumor vascularisation compared with 2D cell cultures.

Demand for 3D culture is expected to grow as researchers search for new approaches to cellular research while lessening the need for animal testing.  From this demand, several approaches have been taken to develop 3D culture methods:

One method offsets the natural sedimentation of cells in an aqueous medium by gently rotating the culture in an apparatus called a rotating wall vessel bioreactor. Cells are typically attached to microcarrier bead “scaffolds” to allow for 3D chemical responses in the bioreactor. Originally developed by NASA to examine microbial growth in zero gravity, this culture method boasts the advantage of replicating the natural low-shear environment found in the body, which has been found to influence a pathogen’s infection potential.

Another system employs magnetism to develop 3D tissue cultures. This method, termed magnetic cell levitation, uses loaded bacteriophages to “infect” the cells destined for culture with faint amounts of iron oxide and gold. These cells are then left in a Petri dish to grow while a ring placed on top of the dish subjects them to magnetic forces, causing them to hover in suspension. In a 2010 issue of Nature Nanotechnology, Souza et al argue that this method has the potential to “be more feasible for long-term multicellular studies”, and cite its ease of control and cost-effectiveness in research.

Recently, attention has turned to developing 3D culture media without an external influence. Microtissues Inc. has developed a form of tissue culture that removes the need for scaffolds. The result, claims CEO Jeffrey Morgan, is that uniform cells are prepared more efficiently and with more consistent results than when scaffolds are used. The company also claims its 3D Petri dish maximises cell-cell interactions and allows cell size to be controlled.

These examples represent only a fraction of the new methods constantly being developed for 3D cell culture. As recently as last month, TAP Biosystems unveiled its newest collagen-based 3D cell culturing method for 96-well plates. This recent boom in development is undoubtedly due to the realisation that research using what is now conventional 2D culture is nearing its limits. Though 3D culture has the potential to become the method of choice for research into cancer and drug therapy, some issues remain. Modern microscopic imaging may struggle with the denser tissue samples. A common standard also needs to emerge in order to establish a unified research protocol. Should these concerns be addressed, there can be little doubt that 3D cell culture will emerge as a cheap, informative and dominant research method for years to come.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

One of the most overlooked but consistent problems facing many governments is waste management. Despite healthy recycling attitudes in both the US and UK, an EPA report showed US total waste production in 2010 was still around 250 million tons, while there are concerns that the UK will run out of landfill sites by 2018.

For many years, the only viable alternative to landfill was incineration. Despite its efficiency over landfill (incineration can reduce waste mass by around 90%), concerns over low energy-generation efficiency (estimated at 20-25%) as well as public protest over environmental impact mean incineration can never be a permanent solution.

As public and private sectors are beginning to shift their attention to cleaner, more efficient alternatives to waste disposal, one of the leading candidates is gasification.

Gasification has been with us in various forms since the 1840s. The process extracts combustible gases by subjecting dehydrated carbonaceous materials to intense temperatures and reacting the resulting ‘char’ with oxygen and/or steam. Originally coal and wood were used as inputs, so the process bore little difference to incineration. Since the 1970s, however, focus has shifted from these conventional inputs to biomass.

From this change in focus, several companies have been set up to offer biomass gasification as an effective renewable resource. One such company, Thermoselect, claims that for every 1,000kg of waste processed, 890kg of “pure synthesis gas” is created for energy generation. Another company, ZeroPoint Clean Tech Inc., is keen to demonstrate gasification’s use in generating renewable gas, heat, water and electricity.

This development has been embraced by both the US and UK governments, welcoming the opportunity to reduce their carbon footprint as well as municipal waste. In April 2011, the US Air Force Special Operations Command invested in a new plasma-based transportable gasification system, with the aim of reducing its waste output by 4,200 tons a year in air bases across the country. Later that year, Britain approved the first advanced gasification plant in the country, with the potential to generate 49 megawatts of renewable energy (enough to power around 8,000-16,000 US households). Some have even speculated that this new technology could be used to spark a boom in hydrogen cell powered vehicles in the future.

Not everyone has embraced the new technique, however. A proposal for a biomass gasification plant in DeKalb County, Georgia was met with protests from locals fearing carcinogenic emissions. Furthermore, a 2009 report by the Blue Ridge Environmental Defense League warned that gasification shares many similarities with incineration, including the formation of pollutants and greenhouse gases.

Despite these arguments, the gasification of biomass has several benefits. The high temperatures involved make it an ideal means of processing bio-hazardous waste from hospitals, and the plants themselves occupy very little physical space. As with any emerging technology, however, uptake has been cautious and slow. Many of the new plants are in trial stages, and it is uncertain whether gasification will have any long-term environmental effects. Should the existing plants prove successful, there is no reason to doubt that gasification will become a realistic solution for environmentally sound energy generation.


About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms.  Fintan may be contacted at fintan.burke2@mail.dcu.ie .

This is a guest post from Erin M. Hall. Erin is the Technical Leader at Genetica DNA Laboratories, Inc. located in Cincinnati, OH. Do you have a response to Erin’s post? Respond in the comments section below.

It is estimated that 18-36% of all actively growing cell line cultures are misidentified and/or cross-contaminated with another cell line (1). For researchers in any field of biomedical science, this could mean that a significant amount of the experimental data published in current and past journals is of questionable value. Every year, millions of dollars of public money are spent on potentially wasted research, not just here in the United States but around the world.

Cell line misidentification and cross-contamination have been around for more than 50 years. The problem was finally brought to light in 1966 by Stanley Gartler, who reported that 19 supposedly independent human cell lines were in fact HeLa cells (2), which are known to be extremely robust, fast-growing and able to contaminate other cultures by aerosol droplets. There was much resistance to his findings, and scientists didn’t want to admit that research done using those contaminated cell lines might be questionable and potentially irreproducible. Walter Nelson-Rees was one scientist who supported Gartler’s findings. Nelson-Rees called out the papers and the scientists publishing experimental data using misidentified cell lines, and for this, in 1981, he lost his contract with the National Institutes of Health (NIH) because his behavior was deemed “unscientific” (3). From 1981 on, misidentification went unchecked and even cell line repositories continued to distribute lines under false names (3).

To exacerbate the problem, certain cell culture practices may be aiding misidentification and cross-contamination, including the practice of assessing phenotypic characteristics, such as protein expression, as the only way to identify a cell population. It has been proven that phenotypic expression can change with increased passage number or even with changes in growth medium or other culture conditions (4). The modern way of assessing the correct identity of a cell line (“cell line authentication”) is to perform short tandem repeat (STR) DNA testing. The STR DNA profile of a human cell line is like a person’s fingerprint: unique to that individual. STR testing is now the “gold standard” of human identification and is routinely used by the FBI in profiling convicted offenders (CODIS). STR profiling is a straightforward and effective way to confirm that the cell line you think you have been using for the past five years is, in fact, the genuine cell line.
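The comparison behind STR-based authentication can be sketched with the widely used Tanabe (percent-match) algorithm, which scores two profiles by the alleles they share at each locus. The loci and allele values below are purely illustrative (real panels use eight or more loci), and the 80% match threshold commonly cited for declaring two lines identical follows published authentication guidance; treat both as assumptions of this sketch.

```python
def tanabe_score(query: dict, reference: dict) -> float:
    """Tanabe percent-match between two STR profiles.
    Each profile maps a locus name to the set of alleles observed there.
    score = 100 * 2 * shared alleles / (alleles in query + alleles in reference)
    """
    shared = sum(len(query[locus] & reference[locus])
                 for locus in query.keys() & reference.keys())
    total = (sum(len(alleles) for alleles in query.values()) +
             sum(len(alleles) for alleles in reference.values()))
    return 100.0 * 2 * shared / total

# Illustrative three-locus profiles; the sample diverges at one TPOX allele.
reference = {"TH01": {7}, "D5S818": {11, 12}, "TPOX": {8, 12}}
sample    = {"TH01": {7}, "D5S818": {11, 12}, "TPOX": {8, 9}}
print(tanabe_score(sample, reference))  # 80.0
```

A score at or above roughly 80% is typically taken to indicate the same origin, with lower scores flagging possible misidentification or cross-contamination.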

The reason the problem continues today is that it has not been properly brought to researchers’ attention. Many researchers learn about the service the hard way, i.e. at the last minute, when a journal requests confirmation of authentication before considering their article for publication. In a survey of 483 researchers who actively use cell cultures, only 33% authenticate their cell lines, and 35% obtained their lines from other laboratories rather than from a cell line repository such as the American Type Culture Collection (ATCC) (3). We, as researchers, expect to use only the best reagents and supplies, yet the one aspect of the experiment that may be the most important, i.e. the cell line, is consistently overlooked. ATCC recommends verifying the identity of all cell lines before you start your experiments, every two months during active growth, and just prior to publication.

The NIH now officially recognizes that cell line misidentification is a serious problem in the scientific community. It states in a formal notice issued on its website (NOT-OD-08-017) that grant applications that fail to employ acceptable experimental practices will not be looked upon favorably and may not fare well in review. The NIH encourages all peer reviewers and researchers to consider this problem carefully “in order to protect and promote the validity of the science [they] support”. Many journals, such as those published by the American Association for Cancer Research (AACR), require a statement in the “Materials and Methods” section as to whether the cells used in the submitted manuscript were authenticated. Failing to properly authenticate the lines may prevent the article from being published after peer review. To continue the advance towards eliminating cell line misidentification and cross-contamination, ATCC in early 2012 released a set of guidelines written by the international Standard Development Organization (SDO) workgroup; these guidelines provide researchers with information on the use of STR DNA profiling for cell line authentication. In the near future, with the help of all of these influential supporters, cell line authentication will become a routine quality control check in every laboratory in the United States and around the world.

I would love to hear other thoughts and comments on this topic.  Tell us about your experiences with cell line authentication – good or bad!

(1) Editorial. Nature 457, 935-936 (2009).
(2) Gartler, S.M. Second Decennial Review Conference on Cell Tissue and Organ Culture, 167-195 (1967).
(3) ATCC SDO Workgroup. Cell line misidentification: the beginning of the end, 441-448 (2010).
(4) Kerrigan, L. Authentication of human cell-based products: the role of a new consensus standard, 255-260 (2011).

About the author:

Erin is the Technical Leader at Genetica DNA Laboratories, Inc. located in Cincinnati, OH. She is responsible for the technical operations of the laboratory, as well as all aspects of the daily casework involving DNA identity sample processing and quality assurance. She received her Master’s degree in Forensic Science from Pace University in NYC and her Bachelor’s degree in Molecular Biology from the College of Mount Saint Joseph in Cincinnati, OH. For more information on Genetica, visit www.genetica.com or their website dedicated to cell line authentication, www.celllineauthentication.com .

Prior to joining Genetica, Erin worked in New York City as a laboratory manager and researcher in the Pharmacology department at Cornell University’s Medical School. She designed and executed complex experiments that examined the effects of environmental toxins on liver enzyme production utilizing HPLC, UV/vis spectroscopy, Western blotting and PCR analysis. Her work contributed to several published journal papers (under Erin Labitzke, if you want to read them!), most recently as first author on a paper related to enzymes present in mitochondria.

Erin may be contacted at erinhall@genetica.com.

This is a guest post from the BiotechBlog Intern, Fintan Burke. Fintan is a student at the School of Biotechnology at Dublin City University. Do you have a response to Fintan’s post? Respond in the comments section below.

According to a BDO industry report, a small US biotech company in 2010 enjoyed average revenues of around $42m, while larger firms reported average revenue of around $124m. The European biotech sector also enjoyed sizeable success, with revenues totalling €13bn the same year. Global biotechnology revenues are estimated to grow to €103bn by 2013, bolstered by the pharmaceutical market, which is expected to become a trillion-dollar industry by 2014.

These high revenues can attract more than just investors; some companies are seeing the benefits of asserting infringement of their own patents in order to obtain lawsuit settlements or licensing fees. Though better known in the technology sector, these ‘patent trolls’ have started to attract attention in biotech circles.

A standout case was that of Classen Immunotherapies Inc., which brought four biotechnology companies and a medical group to court for infringing its patent on an immunisation schedule that could curb the risk of developing chronic diseases. Although the lawsuit was first thrown out by the district court as covering only a mental abstraction, on appeal the federal court ruled in Classen’s favour, finding that Classen claimed a “statutory process” eligible for patent protection.

This has set a troubling precedent in biotech law; since the Classen patents were somewhat broad, there could soon be a flood of similar companies claiming patent infringement based on immunisation or dosage schedules.

Indeed, there is evidence of some small firms already trying to build portfolios of biotech patents. These ’non-practicing entities’ deliberately gather patents not in order to develop products, but rather to extort settlements or licensing fees from other companies. There are already specialized law firms which help companies obtain and enforce biotech-specific patents. Such entities have been known to damage stock prices, delay production and eat into revenues – all of it completely legal.

Many locate the root of these frivolous suits not in the vagueness of the patents themselves, but in unspecific patent legislation. In his 2006 column in The Scientist, Ronald I. Eisenstein notes that “One size does not fit all in terms of approaching patents.” Any legislation passed to curtail ‘trolling’ in the technology sector may inadvertently harm smaller biotech companies and universities that rely on larger companies in the FDA approval process.

In his 2008 book Intellectual Property and Biotechnology: Biological Inventions, Dr. Matthew Rimmer offers some solutions to this growing problem. “Novelty and utility are the criteria used to judge whether something is inventive or not” he writes. “It is really those doctrinal concepts that need to be tightened.”

In a 2011 Forbes article, Colleen Chien also offered some advice for defending against the trolls. She notes that many trolls use contingency-fee lawyers to manage costs. Firms that pay counsel based on the successful disposal of a suit, or on minimised settlement costs, can likewise reduce legal fees and increase their lawyers’ incentive to defend them. Furthermore, larger firms could be better off outsourcing their defence to specialist lawyers rather than relying solely on their own legal team.

Patent trolls remain a very real problem in the world of technology. In the most infamous case, Research In Motion (maker of the BlackBerry) paid a $600m settlement to NTP Inc for infringing its wireless email patents. Fortunately, steps have been taken at the federal level. The passing of the Leahy-Smith America Invents Act in September 2011 allows any firm threatened with infringement to petition for a patent review within 4 months of being sued. Nonetheless, the biotechnology sector must begin to reassess its patent rights and monitor such changes in legislation if it is to grow further as an industry.

About the author:

Fintan Burke is a student at the School of Biotechnology at Dublin City University. His main fields of interest include biomedical therapies and recombinant organisms.  Fintan may be contacted at fintan.burke2@mail.dcu.ie .

This is a guest post by Jurgita Ashley.

You are a small company ready to take a leap into the public market, whether it is for growth, liquidity or to attract greater investor interest. But, oh man, those dollar figures for IPOs would make anyone’s head spin. But wait, don’t discount it yet as a viable alternative. If done right, going public does not have to cost a fortune. Small companies can take advantage of the SEC’s relaxed reporting regime, may strategically decide to list on the Bulletin Board (the OTCBB) rather than the NYSE or NASDAQ, and can significantly limit their corporate governance-related expenses.

Your first major—and unavoidable—expense will be the preparation of a registration statement. Inevitably, it will require management’s time, preparation of audited financials and legal fees. You do not have to be charged, however, $1,000 per hour or other exorbitant fees, and your IPO team does not have to include 50 professionals. If you are a “smaller reporting company,” which is a company with a public non-affiliate common equity float of less than $75 million (or annual revenue of less than $50 million if the float cannot be calculated), your reporting requirements will be limited. Your registration statement – whether it is on a Form S-1 (involving an immediate capital raise) or a Form 10 (initial registration with the SEC to position the company for a subsequent capital raise) – will include less financial information and disclosures than is required for larger companies. In addition, this first registration statement is the “meat and bones,” so your subsequent filings will build upon this information and will involve much less drafting from scratch. This first registration statement will most likely be reviewed by the SEC, which will issue comments requiring one or more amendments. This review, however, will most likely be limited. With the Dodd-Frank and other legislative initiatives and demands on the SEC’s resources, the days of 100 plus comments are largely over. As long as your accounting is in order and your legal advice is good, you should be able to maneuver through the SEC’s comment process without excessive delays or expense.

Now, let’s say your primary goal is to obtain greater investor interest in the company and to create an avenue to sell stock. To achieve this, it is not necessary to pay the NYSE’s or NASDAQ’s listing fees or to become subject to their reporting and governance requirements. Although the OTCBB is usually not the market of first choice, it can be an effective vehicle to provide some liquidity and disseminate information about the company. To list on the OTCBB, a company needs only a market maker to quote its stock and current reports on file with the SEC. By listing on the OTCBB, the company becomes subject to the oversight of FINRA, but there are no listing fees, no additional reporting requirements, and no special governance requirements. In addition, if down the road you are ready to transition to the NYSE or the NASDAQ, your platform will already be in place.

As a company that is listed on the OTCBB only, you are subject to limited corporate governance requirements (imposed by the SEC). Yes, some of your directors should be independent, committees should operate under board-approved charters and the company should have a code of ethics and reasonable internal controls, but all of these policies and procedures need not become all-consuming. There is no need for a small company with limited financial resources to adopt all the latest “best practices” in governance or add a whole department to address the company’s new reporting obligations. Pursuant to recent SEC relief, “smaller reporting companies” also do not need to obtain—and pay for—an auditor’s report on internal control over financial reporting. Reasonable disclosure controls and procedures are important and, in some instances, improving the company’s policies and procedures is desirable and appropriate. In most cases, however, the new obligations of a small public company can be satisfied without exorbitant expense.

Nearly 50 percent of all public companies in the United States are “smaller reporting companies.”(1) Of course, not every small private company will find it desirable to go public, and for some, a full-blown—and expensive—IPO is an appropriate option. The perceived costs, however, should not discourage other companies from evaluating the option of a lower cost IPO. The barriers to the public markets are no longer insurmountable.

Jurgita Ashley is an attorney in the Cleveland, Ohio office of Thompson Hine LLP and is a member of its Corporate Transactions and Securities practice group. Her practice is focused on public company matters, primarily securities law, corporate governance and takeover matters. She can be reached at Jurgita.Ashley@ThompsonHine.com or through www.ThompsonHine.com. The views expressed in this article are attributable to the author and do not necessarily reflect the views of Thompson Hine LLP or its clients. The author would like to thank Derek D. Bork, a partner at Thompson Hine LLP, for his review and invaluable input on this article.

(1) Forty-eight percent of all U.S. companies filing annual reports on Form 10-K with the SEC were “smaller reporting companies” for the period from October 1, 2009 through September 30, 2010, which is the SEC’s latest fiscal year for which data is available. Proxy Disclosure Blog by Mark Borges at CompensationStandards.com (December 3, 2010), available at www.compensationstandards.com.