Team:Cambridge-JIC

Revision as of 15:39, 6 September 2015

Microscopy awaits you

innovative 3D printed stage for precise movement

motorized or manual control (a stage-control sketch follows this list)

powered by open-source electronics: Raspberry Pi and Arduino

Raspberry Pi camera as an objective giving 4 micron resolution

supports brightfield and fluorescence modes

compact: will easily fit into your backpack
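
This page leaves the control electronics open, so purely as an illustration (not the team's actual firmware or wiring), a Raspberry Pi could drive the motorized stage by sending short text commands to an Arduino over USB serial. The port name, baud rate and the "MX"/"MZ" command strings below are placeholders for whatever protocol the real stage firmware speaks.

# Minimal sketch: drive the motorized stage from a Raspberry Pi over USB serial.
# ASSUMPTION: the Arduino runs firmware that accepts plain-text commands such as
# "MX 100" (move X by 100 steps); the command set, port and baud rate are guesses.
import serial  # pyserial


def open_stage(port="/dev/ttyACM0", baud=115200):
    """Open the serial link to the stage controller (port name is a guess)."""
    return serial.Serial(port, baud, timeout=1)


def move(ser, axis, steps):
    """Send a hypothetical relative-move command and wait for an acknowledgement."""
    command = "M{} {}\n".format(axis.upper(), int(steps))
    ser.write(command.encode("ascii"))
    return ser.readline().decode("ascii").strip()  # firmware reply, e.g. "OK"


if __name__ == "__main__":
    stage = open_stage()
    print(move(stage, "x", 200))   # nudge the sample 200 steps in X
    print(move(stage, "z", -50))   # pull focus back by 50 steps in Z
    stage.close()

Splitting the work this way would keep step timing on the Arduino while the Pi stays free for imaging, but it is only one possible arrangement.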

Fluorescence microscopy has become a ubiquitous part of biological research and synthetic biology, but the hardware is often large and prohibitively expensive. This is particularly true for labs with small budgets, including those in the DIY Bio community and in developing countries. Queuing systems imposed in labs to share a few expensive microscopes make research more laborious and time-consuming than it needs to be, and sharing also makes it almost impossible to perform time-lapse imaging or to image inside environments such as an incubator or a fume hood.

We aim to provide a well-documented, physically compact, easily modifiable and high-quality fluorescence microscope that addresses all of these problems. We are designing it in a modular fashion so that it can be used standalone or incorporated into larger frameworks, with various pluggable stages.

streams images online (a streaming sketch follows this list)

software package available for image processing, annotation, and stitching

can be remotely controlled

supports autofocus and image recognition
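
How the online streaming and remote control are implemented is not described here; the sketch below shows one minimal way to expose the camera view over HTTP, using the standard picamera library and Flask as an illustrative web stack rather than the team's actual software.

# Minimal sketch: serve the current camera view over HTTP so the microscope can be
# watched (and later controlled) from any browser on the network. Flask and the
# /snapshot route are illustrative choices, not the project's real interface.
import io

from flask import Flask, send_file
from picamera import PiCamera

app = Flask(__name__)
camera = PiCamera()
camera.resolution = (1024, 768)   # preview-sized frames keep the response quick


@app.route("/snapshot")
def snapshot():
    """Capture a single JPEG frame and return it to the browser."""
    buf = io.BytesIO()
    camera.capture(buf, format="jpeg", use_video_port=True)
    buf.seek(0)
    return send_file(buf, mimetype="image/jpeg")


if __name__ == "__main__":
    # Listen on all interfaces so the image is reachable from other machines.
    app.run(host="0.0.0.0", port=8000)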

The mechanics of the microscope will be 3D printable, and all other parts will be cheap and accessible. The printed parts introduce a novel method (developed by Dr Richard Bowman, Cambridge) for precise positioning and control that exploits their flexibility. The microscope will also use the Cambridge-developed Raspberry Pi board and camera module for image capture. Ultimately we are aiming for 4 micron resolution in both brightfield and fluorescence modes.
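
As a concrete illustration of capturing images with the Pi camera module, the sketch below grabs one brightfield and one fluorescence still using the standard picamera library. The exposure, ISO and white-balance values are illustrative guesses, not the team's measured settings.

# Minimal capture sketch using the standard picamera library. Fluorescence imaging
# typically needs a long, fixed exposure, while brightfield can use camera defaults;
# the numbers below are placeholders.
import time

from picamera import PiCamera

camera = PiCamera()
camera.resolution = (2592, 1944)   # full still resolution of the v1 camera module
time.sleep(2)                      # let auto-exposure settle first

# Brightfield: automatic exposure is usually good enough.
camera.capture("brightfield.jpg")

# Fluorescence: dim signal, so fix a long shutter time and freeze the gains.
camera.framerate = 1                      # allows shutter speeds up to ~1 s
camera.iso = 800
camera.shutter_speed = 500000             # 0.5 s exposure, in microseconds
camera.exposure_mode = "off"              # lock the analog/digital gains
camera.awb_mode = "off"
camera.awb_gains = (1.5, 1.5)
camera.capture("fluorescence.jpg")

camera.close()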

Furthermore, the software used to control commercial microscopes is largely focused on translating the physical experience of using a microscope onto a computer. We aim to exploit the full computational potential of a digital microscope instead: careful, functional UX design will allow control, both locally and over a network, through a Google Maps-like interface, with background image processing, annotation and stitching, and fully autonomous operation. As a proof of principle, we are also developing automated screening systems on our microscope architecture.
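
As one example of what autonomous operation could look like, the sketch below implements a common contrast-based autofocus routine: sweep the focus axis, score each frame by the variance of its Laplacian, and return to the sharpest position. This is a generic approach, not necessarily the team's algorithm, and the serial "MZ" focus command reuses the placeholder protocol from the stage-control sketch above.

# Sketch of contrast-based autofocus: step the focus axis through a small range,
# score each frame for sharpness (higher Laplacian variance = sharper), and return
# to the best position. The "MZ <steps>" command is a placeholder protocol.
import time

import cv2
import serial
from picamera import PiCamera
from picamera.array import PiRGBArray


def sharpness(camera):
    """Grab one frame and return a focus score (variance of the Laplacian)."""
    raw = PiRGBArray(camera)
    camera.capture(raw, format="bgr", use_video_port=True)
    gray = cv2.cvtColor(raw.array, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def move_focus(ser, steps):
    """Hypothetical focus move: relative step count sent to the stage firmware."""
    ser.write("MZ {}\n".format(int(steps)).encode("ascii"))
    ser.readline()  # wait for the firmware to acknowledge


def autofocus(camera, ser, span=200, step=20):
    """Sweep +/- span steps around the current position and settle on the best."""
    move_focus(ser, -span)
    best_score, best_offset = -1.0, 0
    for offset in range(-span, span + 1, step):
        score = sharpness(camera)
        if score > best_score:
            best_score, best_offset = score, offset
        move_focus(ser, step)
    move_focus(ser, best_offset - (span + step))  # go back to the sharpest plane
    return best_offset, best_score


if __name__ == "__main__":
    cam = PiCamera()
    cam.resolution = (640, 480)
    time.sleep(2)  # camera warm-up
    stage = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
    print(autofocus(cam, stage))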