Team:Cambridge-JIC/TestHome
Abstract
Fluorescence microscopy has become a ubiquitous part of biological research and synthetic biology, but the hardware is often bulky and prohibitively expensive. This is particularly true for labs with small budgets, including those in the DIY Bio community and in developing countries. Queuing systems imposed for the use of a few expensive shared microscopes can make research more laborious and time-consuming than it needs to be. Sharing instruments in this way also makes it almost impossible to perform time-lapse imaging, or imaging in environments such as an incubator or a fume hood.
We aim to provide a well-documented, physically compact, easily modifiable and high-quality fluorescence microscope to address all of these problems. We are designing it in a modular fashion so that it can be used standalone or incorporated into larger frameworks, with various pluggable stages.
The mechanics of the microscope will be 3D printable, and all other parts will be cheap and readily accessible. The printed mechanics incorporate a novel method (developed by Dr Richard Bowman, Cambridge) for precise positioning and control that exploits the flexibility of the printed parts. The microscope will also use the Cambridge-developed Raspberry Pi board and camera module for image capture. Ultimately we are aiming for 4-micron resolution in both brightfield and fluorescence modes.
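Because image capture runs on the Raspberry Pi itself, acquisition can be scripted directly on the microscope. The snippet below is an illustrative sketch rather than our actual control software; it assumes the picamera Python library and shows the kind of unattended time-lapse loop that a dedicated on-scope computer makes possible.

# Illustrative time-lapse sketch, not our actual acquisition code.
# Assumes the picamera Python library for the Raspberry Pi camera module.
import time
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (2592, 1944)   # full still resolution of the v1 camera module

camera.start_preview()
time.sleep(2)                      # give the sensor time to settle exposure and gain

# Capture one frame per minute for an hour, e.g. inside an incubator.
for i in range(60):
    camera.capture('frame_{:03d}.jpg'.format(i))
    time.sleep(60)

camera.stop_preview()
camera.close()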
Furthermore, the software used to control commercial microscopes is largely focused on translating the physical experience of using a microscope onto a computer. We aim to leverage the full computational potential of a digital microscope. Careful, functional UX design will allow control, both locally and over a network, through a Google Maps-like interface, and the software will perform background image processing, annotation and stitching, as well as supporting fully autonomous operation. As a proof of principle, we are also developing automated screening systems on our microscope architecture.
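Stitching neighbouring fields of view into one large, navigable image is a standard computer-vision task. As an illustrative sketch (not our production pipeline), the following assumes OpenCV's built-in Stitcher class and hypothetical tile filenames produced by the motorised stage.

# Illustrative sketch: stitch overlapping microscope tiles into one mosaic.
# Assumes OpenCV (cv2); the tile filenames below are hypothetical.
import cv2

tiles = [cv2.imread(name) for name in ('tile_00.jpg', 'tile_01.jpg', 'tile_02.jpg')]

# SCANS mode suits flat, translated images such as microscope tiles,
# as opposed to the default PANORAMA mode intended for rotating cameras.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(tiles)

if status == cv2.Stitcher_OK:
    cv2.imwrite('mosaic.jpg', mosaic)
else:
    print('Stitching failed with status code', status)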
ABOUT US
We are a team of Cambridge undergraduates competing in the Hardware track at iGEM 2015.
LOCATION
Department of Plant Sciences,
University of Cambridge
Downing Street
CB2 3EA
CONTACT US
Email: igemcambridge2015@gmail.com
Tel: +447721944314