Difference between revisions of "Team:Cambridge-JIC"
Simonhkswan (Talk | contribs)
<section style="background-color: #ffa8a3">
<div class="slide" style="background-image:url(//2015.igem.org/wiki/images/b/b5/CamJIC-Panel-3.png)">
<div style="width: 78%; margin: 50px; font-size: 20px;">
<p>The mechanics of the microscope will be 3D printable, and all other parts will be cheap and readily available. The printed parts implement a novel method (developed by Dr Richard Bowman, Cambridge) for <span class="hl_2">precise positioning and control</span> that exploits their flexibility. The microscope will also use the Cambridge-developed Raspberry Pi board and camera module for image capture. Ultimately we are aiming for <span class="hl_2">&lt;10 micron resolution</span> in both brightfield and fluorescence modes.</p>
<p>Furthermore, the software used to control commercial microscopes is largely focused on translating the physical experience of using a microscope onto a computer. We aim to leverage the full computational potential of a digital microscope, <span class="hl_2">carefully considering functional UX design</span> to allow control via a Google Maps-like interface, and implementing <span class="hl_2">background image processing, annotation and stitching</span> as well as <span class="hl_2">fully autonomous operation</span>.</p>
</div>
</div>
</section>
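A "Google Maps-like interface" for a stitched microscope mosaic typically means serving the image as a pyramid of fixed-size tiles at several zoom levels. The sketch below shows only that tiling arithmetic; the 256-pixel tile size and the function names are illustrative assumptions, not part of the team's software:

```python
import math

TILE = 256  # tile edge in pixels, as in web-map ("slippy map") tiling (assumed value)

def zoom_levels(width, height, tile=TILE):
    """Number of zoom levels needed so the whole mosaic fits in one tile at zoom 0."""
    max_dim = max(width, height)
    return max(0, math.ceil(math.log2(max_dim / tile))) + 1

def tile_for_pixel(x, y, zoom, max_zoom, tile=TILE):
    """Tile (column, row) containing full-resolution pixel (x, y) at a given zoom.

    At max_zoom the mosaic is shown 1:1; each step down halves the resolution,
    so one displayed pixel covers 2**(max_zoom - zoom) mosaic pixels.
    """
    scale = 2 ** (max_zoom - zoom)
    return (x // (tile * scale), y // (tile * scale))
```

For example, a 10000 x 8000 pixel mosaic needs seven zoom levels (0 to 6), and the viewer fetches only the handful of tiles covering the current viewport, which is what keeps panning and zooming responsive on low-cost hardware.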
Revision as of 16:27, 29 July 2015