
Gene Expression: Better Tools, More Complex Results

Thu, 05/11/2006 - 8:10am
Gene expression has become a mainstream technique in biology, medicine, and drug development. Broadly speaking, gene expression comes in two flavors: analytical and functional. The former refers to detecting and quantifying genes, especially by comparing two samples (treated/untreated, diseased/normal). With functional gene expression, the aim is to obtain some quantity of protein.
     Judy Macemon, Business Manager for Transfection at Invitrogen (Carlsbad, CA), says her customers divide roughly two to one, with those using gene expression or knockdown to investigate gene function about double the number of those using gene expression to make protein.

Express yourself
Classical gene expression profiling compares native cells or tissues with those that are diseased or have been treated with a drug. To validate the effect of a compound on genes, researchers turn to knock-down experiments using siRNA or antisense compounds. These techniques, especially siRNA, are all the rage, as they allow the specific silencing of genes or groups of genes. "Gene knockdowns are the flavor of the year," notes Paul Todd, Ph.D., genomics product manager at Open BioSystems (Huntsville, AL), but not the only way to confirm gene activity.
Model of cDNA over-expression. Courtesy Open BioSystems.

Dr. Todd, whose company sells siRNA reagents, makes a good case for cDNA over-expression, which he calls the "logical reverse" of gene knockdown. Unlike siRNA, which silences a gene to determine its downstream effects, cDNA introduces many copies of the target gene, causing the super-expression of that gene and all its downstream products. "In many ways you get better information from cDNA than from siRNA, and there's no design involved," Todd said.
With siRNA, researchers must decide which short oligonucleotide sequence, out of several possibilities, will best knock down the target gene. Not all oligomers work equally well, and not all genes respond predictably. Typical knockdowns are on the order of 20-80%. With cDNA, over-expression is a certainty, provided enough copies of the desired gene are inserted. "cDNA is guaranteed to generate more messenger RNA, which will produce more protein or downstream effects," Todd adds.
The principal benefit of cDNA over siRNA is a much wider potential dynamic range for its effect. The best analogy here is of selling a stock short or buying it long. With short sales it is only possible to earn the original stock price, whereas the moon is the limit with long purchases since the stock can double or triple. The signal from an siRNA experiment is limited to the activity of the gene in its original state, since it declines from that level to, at most, zero. With cDNA one can up-regulate a gene's activity by several thousand percent. Furthermore, the resulting protein and its distribution may be visualized in living cells by fusing the gene of interest to the gene for green fluorescent protein.
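The asymmetry is easiest to see on the log2 scale commonly used in expression analysis, where knockdown is bounded but over-expression is not. A minimal Python sketch, with invented numbers rather than figures from any study described here:

```python
import math

def log2_fold_change(treated, control):
    """Log2 ratio of expression levels; negative = knockdown, positive = over-expression."""
    return math.log2(treated / control)

control = 100.0  # arbitrary baseline expression, illustrative units

# A strong siRNA knockdown (80%) still leaves 20% of the original signal:
kd = log2_fold_change(control * 0.20, control)

# cDNA over-expression can drive the same gene, say, 30-fold above baseline:
oe = log2_fold_change(control * 30, control)

print(f"80% knockdown:        {kd:+.2f} log2 units")
print(f"30x over-expression:  {oe:+.2f} log2 units")
```

Even a near-complete knockdown compresses into a few log2 units, while over-expression can extend the measurable range several-fold further in the other direction.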
Gene expression has many applications to drug discovery. At a recent conference at the New York Academy of Sciences (New York, NY), Paul Young, Ph.D., VP of research at Avalon Pharmaceuticals (Germantown, MD) showed how simple genetic profiles or signatures could be used to identify new drug candidates. Young and his team first observed and validated the differential gene expression caused by siRNA knockdowns of known "bad agents" in various diseases in panels of test cells. Using standard gene microarrays, he analyzed which genes were activated and deactivated, then picked between five and 20 genes whose changes were most stable or predictable. From the activity of this gene panel, Young constructed a panel or "bar code" that serves as a molecular signature specific to the disrupted pathway or target. New compounds tested against this panel of genes that show similar activity as the knockdown are considered "hits" in drug discovery parlance. Traditional medicinal chemists are no doubt looking over their shoulders at techniques like this, since they require no knowledge of the drug target.
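The "bar code" comparison Young describes can be sketched as a simple profile-correlation test. The five-gene panel, fold-change values, and hit threshold below are hypothetical stand-ins for illustration, not Avalon's actual signatures or scoring method:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length expression profiles."""
    mx, my = mean(xs), mean(ys)
    dx = [x - mx for x in xs]
    dy = [y - my for y in ys]
    num = sum(a * b for a, b in zip(dx, dy))
    den = (sum(a * a for a in dx) * sum(b * b for b in dy)) ** 0.5
    return num / den

# Hypothetical log2 fold changes for a 5-gene signature panel
# (Avalon's actual panels used between 5 and 20 genes).
knockdown_signature = [-2.1, 1.4, -0.8, 2.0, -1.5]
compound_a = [-1.8, 1.1, -0.6, 1.7, -1.2]   # closely mimics the knockdown
compound_b = [0.3, -0.2, 1.9, -1.4, 0.5]    # unrelated profile

HIT_THRESHOLD = 0.8  # arbitrary cutoff for calling a "hit"
for name, profile in [("A", compound_a), ("B", compound_b)]:
    r = pearson(profile, knockdown_signature)
    print(f"compound {name}: r = {r:+.2f} -> {'hit' if r >= HIT_THRESHOLD else 'miss'}")
```

A compound whose expression profile tracks the knockdown signature scores as a hit; one that perturbs different genes does not, and no knowledge of the underlying target is needed for either call.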

It's not the gene, it's the context
When it comes to interpreting differences in gene expression, context is everything, says Donna Mendrick, Ph.D., scientific fellow and VP at Gene Logic (Gaithersburg, MD). "The value of any large database is in understanding the normal variability of a parameter. You may see a large number of genetic changes or differences in one experiment involving thousands of samples. Results may appear abnormal, but it may not be at all because a particular gene may fluctuate quite a bit."
     Gene Logic maintains one of the world's largest toxicogenomic gene expression databases for animal and human tissues. Profiles are constructed from human gene expression (from biopsy samples) paired with clinical chart data, or from standard rat hepatocyte toxicology and gene expression profiles. From these databases, the company constructs statistically validated predictive models used to prioritize drug candidates, investigate biomarkers, or search for mechanisms of toxicity. Customers may also tap into both databases to investigate bridging biomarkers that are relevant in both preclinical and clinical studies.
     In the dark ages of the genomic era (about eight years ago), methods were much slower and less reliable than today. Gene chips provided a high level of parallelism and varying degrees of robustness (see next section). Since high-throughput methods were not yet very high-throughput, and tools like gene chips were rather costly, biologists constantly grappled with how many tissue samples or animals were required per data point to ensure statistical reliability. Many experimenters believed they could clear the hurdle of statistical significance by pooling material, say from ten rats, and analyzing it on one gene chip or gel. "This was done mostly because of cost concerns," says Dr. Mendrick. Investigators at the ILSI Health and Environmental Sciences Institute, a non-profit consortium of companies, academics, and regulators, determined that statistical analysis is invalid unless true replicates are taken and measured individually.
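The pooling fallacy can be shown in a few lines of Python. The readings below are invented; the point is simply that one pooled measurement carries no estimate of animal-to-animal variability, which any significance test requires:

```python
from statistics import mean, stdev

# Hypothetical expression readings (arbitrary units) for one gene
# in ten treated rats -- numbers invented for illustration.
rats = [98, 112, 105, 91, 120, 103, 99, 110, 95, 107]

# Pooling: mixing all ten RNA samples onto ONE chip yields a single
# value near the group mean, but NO estimate of biological variability.
pooled_reading = mean(rats)

# True replicates: one chip per animal yields both a mean and a
# standard error, the ingredients of any significance test.
n = len(rats)
sem = stdev(rats) / n ** 0.5

print(f"pooled:     {pooled_reading:.1f} (no error estimate possible)")
print(f"replicates: {mean(rats):.1f} +/- {sem:.1f} (n={n})")
```

With replicates, a treated-versus-control difference can be weighed against the standard error; with a pooled sample, there is nothing to weigh it against.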
Gene Logic constructs biological pathways from clinical data, tissue pathology, and gene expression experiments.


Non-buyers beware
Since the early days of microarray technology, academic (and some industrial) research groups have chosen to make their own gene and protein microarrays. Once a group owns the spotting equipment, chips can be turned out for the cost of reagents, substrates, and graduate student time. Besides saving research groups money, home-brewed chips can be tailored to specific organisms, tissues, or families of related genes and proteins.
     According to a study headed by Peter Spencer, Ph.D., professor of neurology at Oregon Health & Science University, home-made gene expression chips are unreliable. In a paper published last year in Nature Methods, Spencer and colleagues found that commercially produced gene chips give comparable results no matter where the work takes place, but the same cannot be said for chips cobbled together in university labs. Discrepancies in spotting technique, instrumentation, and technician skill result in high variability from chip to chip and batch to batch.
     "As scientists, our results are only believable when the technology we use is reliable and reproducible," Spencer told Bioscience Technology. "Our data suggest that research published in the early days of microarrays, in which scientists used mostly home-made chips, may be peculiar to that institution or fabrication platform."
     Commercial manufacturers have always maintained much better control over manufacturing, especially with respect to chip-to-chip reproducibility, Spencer said. And they continue to improve, to the point where microarrays have become high-tech devices. Spencer specifically mentioned arrays from Illumina (San Diego, CA), which are based on oligonucleotide-coated micron-sized beads that self-assemble into microplate wells. In one setup, each array is fitted with 50,000 chemically etched fiber optic strands, which are bundled together to read the signal from beads as they light up.

Transient expression systems
Transient gene expression has been around for at least 20 years. The idea is simple: introduce a gene rapidly into a cell and hope it finds its way to the nucleus, where cellular machinery will convert its sequence into RNA and, ultimately, into protein. Transient expression systems use the full gamut of transfection methods, from viruses that infect cells to mechanical techniques that propel genes into the nucleus. Transiently transfected cell lines produce protein as long as they remain alive, usually in a short, powerful burst soon after transfection. Since the foreign gene is not incorporated into the cell's genome, it is not passed on to daughter cells.
     Transient techniques can produce hundreds of milligrams of protein within three to five days of transfection. Actual yields vary according to a half-dozen parameters, not the least of which is the size of the cell culture.
     For the last decade Wayne Curtis, Ph.D., professor of chemical engineering at The Pennsylvania State University (University Park, PA), has been working on rapid transient gene expression technology (patented in 2004) that delivers genes to plant tissues grown in bioreactors. The system uses Agrobacterium auxotrophs, mutant strains unable to synthesize one or more nutrients they need to grow. Agrobacteria are plant pathogens that deliver genes to plants through a specialized mechanism of T-DNA transfer.
     "What we're trying to do is create the plant equivalent of the baculovirus system," Curtis said, referring to the transient expression vehicle for insect tissue culture, "which gets protein into your hands quickly."
     "Transient gene expression was not supposed to compete with permanent transfection as a production platform," says Dr. Curtis. Nevertheless Florian Wurm, Ph.D., considered the world's leading expert on cell culture, believes this is possible. Wurm has championed two simple transfection systems based on calcium phosphate for transiently introducing genes into mammalian cells, and claims scale-up to about 100 liters. Such volumes will not instill fear in large biomanufacturers any time soon, but in the right hands they can produce many grams of protein.

More than a buzzword
What impresses most is that gene expression studies are nowhere close to going out of style. If anything, interest in differential gene expression has mushroomed during the so-called post-genomic era.
     Even the U.S. Food and Drug Administration, which is not known for promoting swashbuckling science, is getting into the act. In March 2005 the agency issued an industry guidance, Pharmacogenomic Data Submissions, which is perhaps the single most important pharmaceutical regulatory directive of the past decade. In it, FDA encouraged drug companies that regularly generate genomics information to share this data with the agency. For the time being at least, the agency will not hold up an investigational new drug application based on pharmacogenomic data that might suggest toxicity or less-than-optimal efficacy among certain genotypes. The FDA seems genuinely interested in getting drug companies to use pharmacogenomic data wisely, to make better, safer drugs. "With this guidance, the agency has opened up the technology of gene expression to 'live' compounds," says Donna Mendrick of Gene Logic.
