Monthly Archives: September 2012

Lens-free holographic on-chip microscopy


Lens-free imaging:

Lens-free on-chip imaging refers to using a digital optoelectronic sensor array, such as a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) chip, to directly sample the light transmitted through a specimen without any imaging lenses between the object and sensor planes. … The advancements in this type of microscopy are being spearheaded by the development of sensor chips that are continually being improved and introduced into consumer electronics products, particularly cell phones and high-end digital cameras. For a lens-free on-chip microscope, there are various design choices to select from. … In general, bright-field lens-free microscopes can be categorized into two main streams: (i) contact-mode shadow-imaging-based microscopes and (ii) diffraction-based lens-free microscopes. …

Key components of lens-free holographic on-chip microscopy

In a partially coherent holographic on-chip microscope, the source can simply be an LED or an array of LEDs. If wavelength tunability is desired, a monochromator coupled to a multimode fiber can also be used. The spectral bandwidth of the source can vary from a few nanometers to 20–30 nm, depending on the sample-to-detector distance and the resolution requirement of the system. Because the sample plane is close to the detector plane (typically ≤0.1–2 mm), the scattered light rays and the background light can still interfere at the sensor chip even though the temporal coherence lengths of such broadband sources are significantly shorter than those of lasers.
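
In such a microscope the sensor records an in-line hologram rather than a focused image, and the object is recovered by numerically back-propagating the recorded field to the sample plane. Below is a minimal sketch of angular-spectrum back-propagation; it is illustrative only (the wavelength, pixel pitch and propagation distance are assumed values, and this is not the authors' reconstruction code).

    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, z):
        """Propagate a complex field by distance z using the angular-spectrum method."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=dx)                 # spatial frequencies (1/m)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = 2j * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))  # evanescent terms dropped
        return np.fft.ifft2(np.fft.fft2(field) * np.exp(kz * z))

    # Assumed illustrative values: 530 nm LED, 1.12 um pixel pitch, 0.8 mm sample-to-sensor gap
    hologram = np.random.rand(512, 512)      # stands in for a recorded intensity frame
    field = np.sqrt(hologram)                # amplitude estimate with zero initial phase
    recon = angular_spectrum_propagate(field, wavelength=530e-9, dx=1.12e-6, z=-0.8e-3)
    amplitude_image = np.abs(recon)          # back-propagated object amplitude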

 

Read more: http://www.nature.com/nmeth/journal/v9/n9/full/nmeth.2114.html

Greenbaum et al., Nature Methods, Vol. 9, pp. 889–895 (2012)

 


Laser transmission spectroscopy (LTS)

Laser transmission spectroscopy (LTS) is a powerful solution offering real-time, DNA-based species detection in the field. The study found that LTS can measure the size, shape and number of nanoparticles in a solution, and it was used to detect size shifts resulting from hybridization of the polymerase chain reaction product to nanoparticles functionalized with species-specific oligonucleotide probes, or to the species-specific oligonucleotide probes alone. A series of DNA detection experiments was carried out using the invasive freshwater quagga mussel (Dreissena bugensis) to evaluate the capability of the LTS platform for invasive species detection. Specifically, LTS sensitivity was tested against (i) DNA concentrations of a single target species, (ii) the presence of a target species within a mixed sample of other closely related species, (iii) species-specific functionalized nanoparticles versus species-specific oligonucleotide probes alone, and (iv) amplified DNA fragments versus unamplified genomic DNA. The authors demonstrate that LTS is a highly sensitive technique for rapid target species detection, with detection limits in the picomolar range, capable of successful identification in multispecies samples containing target and non-target species DNA.
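
At its simplest, a transmission measurement of this kind relates the attenuation of the laser beam to the number of particles in the beam path via the Beer–Lambert law, T = exp(−N·σ_ext·L). The sketch below illustrates that relation with assumed values; it is not taken from the cited study.

    import numpy as np

    # Beer-Lambert for a dilute particle suspension: T = exp(-N * sigma_ext * L)
    sigma_ext = 2.0e-14   # extinction cross-section of one nanoparticle (m^2), assumed
    path_len = 0.01       # cuvette path length (m), assumed

    measured_T = 0.92     # fraction of laser power transmitted, assumed
    number_density = -np.log(measured_T) / (sigma_ext * path_len)   # particles per m^3
    print(f"estimated number density: {number_density:.2e} m^-3")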

These results indicate that the LTS DNA detection platform will be useful for field detection of target species. Additionally, it was found that LTS detection is effective with species-specific oligonucleotide tags alone or when they are attached to polystyrene nanobeads, and with both amplified and unamplified DNA, indicating that the technique may also have versatility for broader applications.

Lodge et al., DNA-based species detection capabilities using laser transmission spectroscopy, J. R. Soc. Interface, 26 September 2012.


Method: Synchrotron X-ray microfocus spectroscopy

Synchrotron X-ray microfocus spectroscopy was used to detect trace distributions of titanium (Ti) in thin tissue sections at high resolution. This investigation demonstrated, for the first time, a scattered and heterogeneous distribution of Ti in inflamed tissues taken from around failing skin-penetrating Ti implants. The tissue was taken adjacent to a commercially pure titanium (CP-Ti) device that was not exposed to obvious macroscopic wear or loading in service. Furthermore, the location of the distributed Ti, which was deep with respect to the skin surface, suggests that wear processes are unlikely to be a major contributor. Debris from implant insertion is highly unlikely to lead to the observed widespread distribution of fine fragments of both oxide and metal. In the absence of obvious macroscopic wear or loading processes, the paper proposes that the Ti in the tissue results from micro-motion and localized corrosion in surface crevices.

Addison O. et al. (University of Birmingham), Do ‘passive’ medical titanium surfaces deteriorate in service in the absence of wear? J. R. Soc. Interface, 2012, Vol. 9, pp. 3161–3164.


Free Access to Chemical Information

In the latest effort to provide free access to chemical information, the London-based company SureChem (owned by Digital Science, a sister company of Nature Publishing Group) said that it has released data on 10 million molecules patented by the pharmaceutical industry since 1976.

Harvested automatically from some 20 million patents, the data could lower barriers to drug discovery by academic researchers.

The announcement, made on 26 March 2012 at the spring meeting of the American Chemical Society (ACS) in San Diego, California, follows a similar move by computing giant IBM last December. IBM deposited computer-harvested data on about 2.4 million small molecules into PubChem, the world’s largest free chemistry repository, which is run by the US National Library of Medicine in Bethesda, Maryland. Both data releases serve in part to promote the companies’ subscription services for patent and structure analysis. But Michael Walters, a chemist working in academic drug discovery at the University of Minnesota in Minneapolis, thinks that the initiatives could mark a sea change in the way in which patent data are accessed and analyzed.

Academic drug discovery will get another boost in September, when a consortium of eight pharmaceutical firms, three biotechnology companies and a number of leading informaticians releases its own free online drug-discovery platform, the Open Pharmacological Concepts Triple Store (openPHACTS), with data on small molecules and their biological effects, to provide a library of compounds that anyone can download and explore.

Unlike biologists, who are swamped by free databases on genes and proteins, chemists have always expected to pay for their data. Until a few years ago the market in chemical information was monopolized by the ACS’s Chemical Abstracts Service, a manually curated registry that now holds more than 65 million structures, charges individual users thousands of dollars a year for access, and does not allow large downloads or repurposing of its information. Its SciFinder service offers tools to make sense of the data. Similar analytical services are sold by firms such as IBM, Thomson Reuters and Elsevier in Amsterdam, which offers the Reaxys tool (see ‘Chemistry breaks free’).

But in 2004 the US National Institutes of Health (NIH) created PubChem, into which anyone can deposit data on structures and their biological activity. In 2005 the ACS sought to restrict PubChem’s reach to molecules characterized by NIH-funded researchers, but was unsuccessful. … (Source: Nature, March 2012)
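
PubChem records can also be queried programmatically. The sketch below uses PubChem's public PUG REST web interface; the endpoint pattern and property names follow PubChem's documentation but should be treated as illustrative rather than definitive.

    import json
    import urllib.request

    # Look up basic properties of aspirin in PubChem via the PUG REST interface.
    url = ("https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/aspirin/"
           "property/MolecularFormula,MolecularWeight/JSON")
    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)

    props = record["PropertyTable"]["Properties"][0]
    print(props["MolecularFormula"], props["MolecularWeight"])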


Research studies on carbon nanotubes

SEM image of carbon nanotubes deposited on a trench coated in hafnium oxide (HfO2) showing extremely high density and excellent selectivity (scale bar: 2 µm). (Image: IBM)

IBM researchers demonstrate the first sub-10 nm CNT transistor, which is shown to outperform the best competing silicon devices with more than four times the diameter-normalized current density (2.41 mA/μm) at a low operating voltage of 0.5 V. The nanotube transistor exhibits an impressively small inverse subthreshold slope of 94 mV/decade, nearly half of the value expected from a previous theoretical study. Numerical simulations show the critical role of the metal−CNT contacts in determining the performance of sub-10 nm channel length transistors, signifying the need for more accurate theoretical modeling of transport between the metal and the nanotube. The superior low-voltage performance of the sub-10 nm CNT transistor proves the viability of nanotubes for consideration in future aggressively scaled transistor technologies.
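
For reference, the inverse subthreshold slope quoted above is simply the change in gate voltage required for a tenfold change in drain current in the subthreshold region. A minimal sketch of extracting it from transfer-curve data follows; the sample data points are assumed, not IBM's measurements.

    import numpy as np

    # Assumed sample points from the subthreshold region of an Id-Vg transfer curve
    vg = np.array([0.10, 0.15, 0.20, 0.25, 0.30])                 # gate voltage (V)
    id_ = np.array([1.0e-10, 3.4e-10, 1.16e-9, 3.94e-9, 1.34e-8])  # drain current (A)

    # Inverse subthreshold slope SS = dVg / dlog10(Id), reported in mV/decade
    slope, _ = np.polyfit(np.log10(id_), vg, 1)
    print(f"SS = {slope * 1e3:.0f} mV/decade")   # ~94 mV/decade for these assumed points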

Source:

IBM, Sub-10 nm carbon nanotube transistor, Nano Letters, 2012: http://pubs.acs.org/doi/pdf/10.1021/nl203701g

IBM has demonstrated a placement density of one billion carbon nanotubes per square centimeter using this approach – leading the way for dramatically smaller, faster and more efficient computer chips. (Image: IBM)


Manufacturing Techniques:

Development of spray process for manufacturing new CNT-modified films

The aim is to design and develop a new spray process for manufacturing novel CNT-modified films suitable for various applications, for example capacitor films. Part of the study will be to investigate the dispersion behaviour of CNTs in various matrices. The project is in collaboration with Prof Patrick Grant, Prof Steven Sheard (Department of Engineering, University of Oxford) and Dr Peter Rocket (Department of Engineering). www.materials.ox.ac.uk

Transport calculations have already revealed that a carbon nanotube (CNT) can act as a sensitive spin-measurement device or as a spin-characterization tool. Using master-equation techniques, methods are sought by which magnetic resonance might be detected in such systems by measuring the electrical current through them. The research work is carried out in collaboration with the experimental semiconductor group of Prof Charles Smith at the University of Cambridge: http://www.sp.phy.cam.ac.uk/SPWeb/home/cgs4.html

Nanostructures for energy applications: supercapacitors usually use activated meso-porous graphite for their electrodes, but alternatives with higher power capability are being studied intensively, including entangled, meso-porous carbon nanotube (CNT) films, an application that makes use of the natural tendency of CNTs to entangle and percolate current at low volume fractions. We are fabricating comparatively large amounts of both multi-walled CNTs by chemical vapour deposition and single-walled CNTs (SWCNTs) by arc discharge in-house, purifying them, functionalizing their surfaces to improve their ion-storage capability, and then processing them into large-area films, or buckypaper, on a variety of flexible or stiff substrates.
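
As a rough reminder of why electrode capacitance matters for these devices, the energy stored in a supercapacitor scales as E = ½CV². A minimal sketch with assumed, illustrative numbers follows.

    # Energy stored in a supercapacitor: E = 1/2 * C * V^2 (assumed illustrative values)
    capacitance = 100.0        # farads
    voltage = 2.7              # volts
    electrode_mass = 0.050     # kg of CNT film electrodes

    energy_j = 0.5 * capacitance * voltage ** 2
    specific_energy_wh_per_kg = energy_j / 3600.0 / electrode_mass
    print(f"{energy_j:.0f} J stored, {specific_energy_wh_per_kg:.1f} Wh/kg")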

To address the poor dispersibility and electrical inhomogeneity of nanostructures, MoSI nanowires (with a range of stoichiometries Mo6S9-xIx) are used; their advantage is that they contain S atoms in their structure and can therefore form covalent bonds to very diverse molecular entities. Connection to gold nanoparticles and to thiol-containing proteins has been demonstrated with high yields. Functionalization of these nanowires will be sought.

Zheng et al. [187] studied the influence of nanotube chirality on the interfacial bonding characteristics in a PMMA polymer using molecular dynamics (MD). They considered five different SWCNTs with similar lengths, diameters and atomic compositions but with varying chiral indices. They conducted pull-out simulations to investigate the interaction energy, interfacial bonding energy and shear stress of the composite. All of these attain their highest values for the armchair system, while the zigzag nanotube composite system produced the lowest values. Therefore, for SWCNTs with similar molecular weights, diameters and lengths, the armchair nanotube will act as the best reinforcing agent. www.materials.ox.ac.uk
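
In such pull-out simulations the interfacial shear strength is often estimated from the pull-out energy by assuming a uniform shear stress along the embedded length, τ = 2W/(πdL²). The sketch below applies that common estimate with assumed values; the numbers are not Zheng et al.'s results.

    import math

    # Interfacial shear stress from nanotube pull-out energy, assuming uniform
    # shear along the embedded length: tau = 2W / (pi * d * L^2)
    pullout_energy = 1.2e-16   # J, assumed illustrative value
    diameter = 1.0e-9          # m, SWCNT diameter
    embedded_length = 50e-9    # m

    tau = 2 * pullout_energy / (math.pi * diameter * embedded_length ** 2)
    print(f"interfacial shear stress = {tau / 1e6:.0f} MPa")   # ~31 MPa for these values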

Source: Research projects 2011–2012, Department of Materials, University of Oxford: www.materials.ox.ac.uk

University of Toronto: Carbon nanotubes (CNTs) have shown great promise as sensing elements in nanoelectromechanical sensors. The electrical, mechanical and electromechanical properties of CNTs used in sensing applications are reviewed. This investigation indicates which nanotube properties should be carefully considered when designing nanotube-based sensors. The primary techniques used for the integration of nanotubes into devices are discussed, with a description of sensors that have been developed using CNTs as active sensing elements. Development of Carbon Nanotube-Based Sensors—A Review: http://amnl.mie.utoronto.ca/data/J23.pdf

Carbon nanotubes in medicine: http://en.wikipedia.org/wiki/Carbon_nanotubes_in_medicine

Reports of toxicity of CNTs usually arise when the mode of delivery is free-floating in media, not when CNTs are embedded in a substrate; CNTs embedded into a substrate, or made as a substrate, promote cellular growth. Depending on the mode of exposure and/or material presentation, CNTs may lead to toxicity or to promoted tissue growth (although significantly more research is required). Furthermore, it is important to emphasize that not all CNTs are the same. Some may contain leftover unreacted catalyst particles known to be toxic to cells, while others have fully reacted catalysts, presenting relatively pure CNTs to cells. Such differences can clearly have profound influences on CNT toxicity.

David A. Stout and Thomas J. Webster, Materials Today, July–August 2012, Vol. 15, No. 7–8, p. 315.

Great efforts have been undertaken in both academia and industry to develop new reagents that can improve PCR (polymerase chain reaction) amplification efficiency and enhance specificity, with particular attention to nanomaterials owing to their unique physical and chemical properties. Gold nanoparticles (AuNPs) increased the sensitivity of PCR by 5- to 10-fold compared with conventional qPCR, and by at least 10,000-fold in a qPCR system with a shortened reaction time (rapid protocol).
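
One common way to express such fold-improvements in qPCR sensitivity is through the shift in threshold cycle, fold ≈ 2^ΔCt, assuming ideal doubling per cycle. A minimal sketch of that conversion follows; the ΔCt value is assumed, not taken from the cited work.

    # Convert a shift in qPCR threshold cycle (Ct) into an approximate fold-change
    # in detection sensitivity, assuming ideal doubling per cycle: fold = 2 ** delta_ct
    delta_ct = 13.3                       # assumed earlier detection by 13.3 cycles
    fold_change = 2 ** delta_ct
    print(f"~{fold_change:,.0f}-fold")    # ~10,086-fold, i.e. on the order of 10,000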

However, how to correctly use those enhancers for long PCR remains unclear. Recently, it has been suggested that single-walled carbon nanotubes (SWCNTs) can act similarly to Mg2+ ions in maintaining the high activity of DNA polymerase in PCR, but the potential effects of different types of carbon nanotubes (CNTs) on long PCR remain to be elucidated.

Long PCR has become increasingly important in the post-genome era. CNTs with dimensions ranging from 1–2 nm up to 10–20 nm were found to enhance long PCR. Serious smear bands were significantly diminished, and nonspecific trailing bands gradually disappeared, as the concentration of CNTs was increased. The optimal final concentrations of both SWCNTs and MWCNTs are close to 1 mg/mL.

Zhang et al., Aqueous suspension of carbon nanotubes enhances the specificity of long PCR, BioTechniques, 2008, 44, 537–545.

Zhang, M. C. Wang and H. J. An, Nanotechnology, 2007, 18, 355706.


Monitor vital signs via webcam

Oxford spin-out provides software to monitor vital signs via webcam

A new Oxford company, OxeHealth, spun out from Oxford’s Institute of Biomedical Engineering, will allow patients’ health to be monitored using a webcam and a software application.

The software will detect a patient’s heart rate, respiratory rate and oxygen saturation even in artificial light without the need for any physical contact or additional hardware.
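
Camera-based vital-sign monitoring of this kind generally relies on tracking tiny intensity changes in skin pixels over time (remote photoplethysmography). The sketch below estimates heart rate from the mean green-channel value of a face region; it is a generic illustration, not OxeHealth's algorithm, and the frame rate and signal are simulated.

    import numpy as np

    def estimate_heart_rate(green_means, fps):
        """Estimate heart rate (beats/min) from the mean green-channel value of a
        skin region in each video frame, via the dominant spectral peak."""
        signal = green_means - np.mean(green_means)          # remove DC component
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)    # Hz
        band = (freqs > 0.7) & (freqs < 4.0)                 # plausible 42-240 bpm range
        peak_freq = freqs[band][np.argmax(spectrum[band])]
        return peak_freq * 60.0

    # Assumed demo: a 1.2 Hz (72 bpm) pulse sampled at 30 frames per second
    fps, t = 30.0, np.arange(0, 20, 1 / 30.0)
    demo_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.1, t.size)
    print(f"estimated heart rate: {estimate_heart_rate(demo_signal, fps):.0f} bpm")  # ~72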

Isis Innovation, which commercialises research from the University of Oxford, announced that the new company will receive up to £500,000 in funding from IP Group, subject to certain milestones being met.

Professor Tarassenko, Director of the Institute of Biomedical Engineering, said: “Our research has transformed the ubiquitous webcam into a non-contact sensor for monitoring the most important vital signs. Our close collaboration with biomedical scientists in the University and clinicians in the NHS Trust has enabled rapid translation from the lab to the ward. We believe that our webcam software offers a step change in the way that the health of individuals can be assessed in the home or the hospital.”

Read more: http://www.isis-innovation.com/news/news/OxeHealth.html


Multispectral imaging application to reveal ancient manuscripts


The Library of Congress used multispectral imaging to reveal how Thomas Jefferson changed ‘fellow subjects’ to ‘fellow citizens’ in a draft of the Declaration of Independence:

http://www.loc.gov/today/pr/2010/10-161.html
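
Multispectral imaging of manuscripts typically captures the same page under many narrow wavelength bands and then combines the bands to enhance faint or erased ink. One common combination step is principal component analysis across the band stack; the sketch below uses random stand-in data in place of a real image cube.

    import numpy as np

    # Assumed stack of 12 registered spectral band images, each 256 x 256 pixels
    bands, h, w = 12, 256, 256
    stack = np.random.rand(bands, h, w)

    # Principal component analysis across bands: directions of greatest variance
    # often separate faint ink from the parchment background.
    pixels = stack.reshape(bands, -1).T              # (n_pixels, n_bands)
    pixels = pixels - pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc1 = (pixels @ eigvecs[:, -1]).reshape(h, w)    # first principal-component image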


Images of individual atoms in graphene


Dislocation-Driven Deformations in Graphene

A team of researchers from Oxford’s Department of Materials and industry partner JEOL has obtained images of individual atoms in graphene with unprecedented resolution using Oxford’s state-of-the-art transmission electron microscope. Reported in the journal Science, 337, 209 (2012), the team studied edge dislocations in graphene and revealed for the first time the strain induced in graphene’s lattice by edge dislocations.

Source: http://www.materials.ox.ac.uk/blog/108/544/Dislocation-Driven-Deformations-in-Graphene.html

 

A report entitled “World Market for Graphene to 2017” by Future Markets, Inc. (2011) estimates that the production volume of graphene in 2010 was 28 tonnes, projected to grow to 573 tonnes by 2017.

The quality of graphene plays a crucial role, as the presence of defects, impurities, grain boundaries, multiple domains, structural disorders and wrinkles in the graphene sheet can have an adverse effect on its electronic and optical properties. In electronic applications, the major bottleneck is the requirement for large-size samples, which is possible only with the CVD process, yet it is difficult to produce high-quality, single-crystalline graphene thin films possessing very high electrical and thermal conductivities along with excellent optical transparency. Another concern is that the synthesis of graphene by conventional methods involves the use of toxic chemicals, and these methods usually result in the generation of hazardous waste and poisonous gases.

http://www.nanowerk.com/spotlight/spotid=25744.php

 


Innovating with technology

We are all much better attuned to processing images than text and data: half our cerebral cortex is devoted to visualization. Technologies developed in the computer games and film industries — think Toy Story and World of Warcraft — are being used to help innovators in areas ranging from pharmaceuticals to emergency response units in cities. The capacity these new technologies bring to produce dynamic images of previously opaque technical information underlies the greater engagement in innovation by a wider range of people.

Source:

Innovating with technology

Mark Dodgson is Director of the Technology and Innovation Management Centre at the University of Queensland Business School, and David Gann is Head of Innovation and Entrepreneurship at Imperial College London. They co-authored ‘Innovation: A Very Short Introduction’.


Innovation is halted by aggressive marketing of unneeded drugs

An analysis of Canada’s pharmaceutical expenditures found that 80% of the increase in its drug budget is spent on new medicines that offer few new benefits. Major contributors included newer hypertension, gastrointestinal, and cholesterol drugs, including atorvastatin, the fifth statin on the Canadian market.

… We need to revive the Norwegian “medical need” clause that limited approval of new drugs to those that offered a therapeutic advantage over existing products [39]. This approach led to Norway having seven non-steroidal anti-inflammatory drugs on the market compared with 22 in the Netherlands [40]. Norway’s medical need clause was eliminated in 1996 when it harmonized its drug approval process with that of the EU. EU countries are paying billions more than necessary for drugs that provide little health gain because prices are not set to reward new drugs in proportion to their added clinical value.

… Data from companies, the United States National Science Foundation, and government reports indicate that companies have been spending only 1.3% of revenues on basic research to discover new molecules, net of taxpayer subsidies [23].

… More than four fifths of all funds for basic research to discover new drugs and vaccines come from public sources [24]. Moreover, despite the industry’s frequent claims that the cost of new drug discovery is now $1.3bn (£834m; €1bn) [25], this figure, which comes from the industry-supported Tufts Center [26], has been heavily criticised. Half that total comes from estimating how much profit would have been made if the money had been invested in an index fund of pharmaceutical companies that increased in value 11% a year, compounded over 15 years [26]. While used by finance committees to estimate whether a new venture is worth investing in, these presumed profits (far greater than the rise in the value of pharmaceutical stocks) should not be counted as research and development costs on which profits are to be made. Half of the remaining $0.65bn is paid by taxpayers through company deductions and credits, bringing the estimate down to one quarter of $1.3bn, or $0.33bn [27].

Source:

Light DW. Basic research funds to discover important new drugs: who contributes how much. In: Burke MA, ed. Monitoring the financial flows for health research 2005: behind the global numbers. Global Forum for Health Research, 2006:27-43.

Pharmaceutical Research and Manufacturers of America. 2011 profile: pharmaceutical industry. PhRMA, 2011.

Light DW, Lexchin J. Pharmaceutical research and development: what do we get for all that money? BMJ 2012;344:e4348. doi: 10.1136/bmj.e4348 (published 7 August 2012).

Prof John Sulston gave a lecture on ‘What is science for?’ at Oxford’s Sheldonian Theatre, and Prof John Harris made the following remarks:

Who owns science? That is an important question, and an emotive one. Since 2000 there has been a trend towards increased investment from the private sector and decreased public funding, but the consequence of driving science into profitable areas is all sorts of dysfunctionality in the West. There are unneeded drugs – you may criticize that, but I say it, and it is absolutely true – which are sold by aggressive and unethical marketing practices. With regard to human health there are consequences.

The trend to make science profitable has consequences – what should drive science is curiosity.

When Genetic Engineering Came Of Age

Today marks the 30th anniversary of an event that kicked off an important new era in drug therapies – the approval by the FDA of human insulin synthesized in genetically engineered bacteria. The saga is remarkable in several ways, not least of which is that although both the drugmakers and regulators were exploring unknown territory, the development of the drug and its regulatory review progressed smoothly and rapidly. … Regrettably, the early salubrious regulatory climate has changed. Even with a toolbox of improved technologies and greater knowledge of pharmacogenetics, bringing a new drug to market on average now takes 10-15 years and costs over $1.4 billion. Regulators have adopted a highly risk-averse and even adversarial mindset, few new drugs are approved without convening extramural advisory committees, and decisions are sometimes hijacked by political forces outside the FDA. Approval of a drug or other FDA-regulated product made with a brand new technology now would probably be further delayed by navel-gazing at a series of government-sponsored, “consensus-building” conferences.

The result is that fewer drugs enter the development pipeline and become available for patients who would benefit from them. Over the years, government regulation hasn’t aged as gracefully as recombinant DNA technology itself.

Henry Miller, a physician, is the Robert Wesson Fellow in Scientific Philosophy and Public Policy at Stanford University’s Hoover Institution. He was the founding director of the FDA’s Office of Biotechnology. His most recent book is “The Frankenfood Myth.”

Source:

http://www.forbes.com/sites/henrymiller/2012/10/29/when-genetic-engineering-came-of-age/


The gene causing insulin sensitivity was discovered by Oxford scientists

The first single-gene cause of increased sensitivity to the hormone insulin has been discovered by a team of Oxford University researchers.

The opposite condition – insulin resistance – is a common feature of type 2 diabetes, so finding this cause of insulin sensitivity could offer new opportunities for pursuing novel treatments for diabetes.

Although mutations in the PTEN gene cause a rare condition with increased risk of cancer, the biological pathways the gene is involved in could offer promising targets for new drugs.

The Oxford University researchers, along with colleagues at the Babraham Institute in Cambridge and the Churchill Hospital in Oxford, report their findings in the New England Journal of Medicine. The study was funded by the Wellcome Trust, the Medical Research Council, the National Institute for Health Research Oxford Biomedical Research Centre, and the Biotechnology and Biological Sciences Research Council.

Read more: http://www.ox.ac.uk/media/news_stories/2012/120913.html

 


Choosing algorithms for analyzing biological images

 

Annotated high-throughput microscopy image sets for validation

Choosing among algorithms for analyzing biological images can be a daunting task, especially for non-experts. Software toolboxes such as CellProfiler [1,2] and ImageJ [3] make it easy to try out algorithms on a researcher’s own data, but it can still be difficult to assess whether an algorithm will be robust across an entire experiment based on the small subset of images that is practical to examine or annotate. Even if controls are available, a pilot high-throughput experiment may be insufficient to show that an algorithm will robustly identify rare phenotypes and handle the experimental artifacts that will invariably be present in a high-throughput experiment. It is therefore useful to know that a particular algorithm has proven superior on several similar image sets. The performance comparisons presented in papers that introduce new algorithms are often not very helpful for assessing this because each study typically relies on a different test image set (often to the advantage of the proposed algorithm), the algorithms compared may not be the ones the researcher is most interested in, and the authors may not have implemented other algorithms as optimally as their own. Although biologists should always also validate algorithms on their own images, it would be useful if developers would quantitatively test new algorithms against a publicly available, established collection of image sets. In this way, objective comparison can be made to other algorithms, as tested by the developers of those algorithms. We see a need for such a collection of image sets, together with ground truth and well-defined performance metrics.

Here we present the Broad Bioimage Benchmark Collection (BBBC), a publicly available collection of microscopy images intended as a resource for testing and validating automated image-analysis algorithms. The BBBC is particularly useful for high-throughput experiments and for providing biological ground truth for evaluating image-analysis algorithms. If an algorithm is sufficiently robust across samples to handle high-throughput experiments, low-throughput applications also benefit, because tolerance to variability in sample preparation and imaging makes the algorithm more likely to generalize to new image sets. Each image set in the BBBC is accompanied by a brief description of its motivating biological application and a set of ground-truth data against which algorithms can be evaluated. The ground-truth sets can consist of cell or nucleus counts, foreground and background pixels, outlines of individual objects, or biological labels based on treatment conditions or orthogonal assays (such as a dose-response curve or positive- and negative-control images). We describe canonical ways to measure an algorithm’s performance so that algorithms can be compared against each other fairly, and we provide an optional framework to do so conveniently within CellProfiler. For each image set, we list any published results of which we are aware.

The BBBC is freely available from http://www.broadinstitute.org/bbbc/. The collection currently contains 18 image sets, including images of cells (Homo sapiens and Drosophila melanogaster) as well as of whole organisms (Caenorhabditis elegans) assayed in high throughput. We are continuing to extend the collection during the course of our research, and we encourage the submission of additional image sets, ground truth and published results of algorithms.
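
When comparing an algorithm's output against such ground truth, a typical per-image measure is the overlap between predicted and annotated foreground pixels. The sketch below computes the Jaccard index (IoU) and F1/Dice score from two binary masks; it uses random stand-in data and is not the BBBC's official evaluation code.

    import numpy as np

    def mask_scores(pred, truth):
        """Jaccard index (IoU) and F1/Dice score between two boolean foreground masks."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        tp = np.logical_and(pred, truth).sum()     # true positive pixels
        fp = np.logical_and(pred, ~truth).sum()    # false positives
        fn = np.logical_and(~pred, truth).sum()    # false negatives
        iou = tp / (tp + fp + fn)
        f1 = 2 * tp / (2 * tp + fp + fn)
        return iou, f1

    # Random stand-in masks in place of an algorithm's output and annotated ground truth
    rng = np.random.default_rng(0)
    pred = rng.random((256, 256)) > 0.5
    truth = rng.random((256, 256)) > 0.5
    print(mask_scores(pred, truth))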

Carpenter et al. (Broad Institute of Massachusetts Institute of Technology and Harvard, Cambridge, Massachusetts, USA), Nature Methods, Vol. 9, No. 7, July 2012, p. 637.

 
