Team:Cambridge-JIC/Description
<div style="width: 100%; padding: 0% 10%; margin: 30px 0px;color:#000"> | <div style="width: 100%; padding: 0% 10%; margin: 30px 0px;color:#000"> | ||
<p><b><span style="font-size:200%">The chassis</span></b> is 3D printed, allowing simple modification. The plastic is cheap, biodegradable and flexible. Stage translation, based on work by Dr Richard Bowman, makes use of the flexibility to give fine control. </p> | <p><b><span style="font-size:200%">The chassis</span></b> is 3D printed, allowing simple modification. The plastic is cheap, biodegradable and flexible. Stage translation, based on work by Dr Richard Bowman, makes use of the flexibility to give fine control. </p> | ||
<p><b><span style="font-size:200%">The mechanics</span></b> of the stage can be automated using stepper motors. The user has remote control of the microscope, and can introduce tailor-made programs to facilitate their experiments.</p>
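<p>As a rough illustration of how such automation can be scripted, the sketch below half-steps a single stage motor from a Raspberry Pi using the RPi.GPIO library. The pin numbers and the coil-drive sequence (for a small geared stepper behind a ULN2003-style driver) are assumptions for the example, not OpenScope's actual wiring or firmware.</p>
<pre>
# Minimal sketch: half-stepping one stage motor from a Raspberry Pi.
# Pin numbers and the coil sequence are assumptions, not OpenScope's wiring.
import time
import RPi.GPIO as GPIO

PINS = [17, 18, 27, 22]  # hypothetical BCM pins driving the motor coils
HALF_STEP = [
    [1, 0, 0, 0], [1, 1, 0, 0], [0, 1, 0, 0], [0, 1, 1, 0],
    [0, 0, 1, 0], [0, 0, 1, 1], [0, 0, 0, 1], [1, 0, 0, 1],
]

GPIO.setmode(GPIO.BCM)
for pin in PINS:
    GPIO.setup(pin, GPIO.OUT, initial=0)

def move(steps, delay=0.002):
    """Advance the stage by the given number of half-steps."""
    for i in range(steps):
        for pin, level in zip(PINS, HALF_STEP[i % 8]):
            GPIO.output(pin, level)
        time.sleep(delay)

move(512)      # nudge the stage a fraction of a motor revolution
GPIO.cleanup()
</pre>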
<p><b><span style="font-size:200%">The optics</span></b> are low-cost, low-energy and modular. LED illumination reduces power consumption and cost. A Raspberry Pi camera makes the microscope digital, and an epi-fluorescence cube makes imaging GFP a reality. With sub-micrometer resolution in brightfield and darkfield modes, you are ready to image single cells or whole tissues.</p>
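<p>To give a concrete feel for the digital imaging side, the sketch below grabs a single still from the Raspberry Pi camera with the picamera library. The resolution, warm-up delay and output filename are illustrative choices, not settings prescribed by OpenScope.</p>
<pre>
# Minimal sketch: capturing a still image from the Raspberry Pi camera.
# Resolution and filename are illustrative, not OpenScope defaults.
import time
from picamera import PiCamera

camera = PiCamera(resolution=(1024, 768))
camera.start_preview()
time.sleep(2)                      # let exposure and gain settle
camera.capture('brightfield.jpg')  # hypothetical output filename
camera.stop_preview()
camera.close()
</pre>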
<p><b><span style="font-size:200%">The software</span></b> uses OpenCV and forms a core part of the project. The Webshell gives you real-time control over the microscope live-stream, from time-lapse to scale bars. MicroMaps uses image stitching and sample recognition algorithms to give you the whole sample field in a single view, ready for annotation and screening. Autofocus capabilities allow automation of OpenScope’s motors, letting you image dynamic samples without supervision.</p>
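<p>One way an autofocus routine can work is by scanning the focus axis and maximising a sharpness score computed with OpenCV. The sketch below uses the variance of the Laplacian, a standard focus measure; the frame-grabbing and motor-stepping callbacks are hypothetical placeholders, not the actual OpenScope software interface.</p>
<pre>
# Minimal sketch of a focus score an autofocus loop could maximise:
# the variance of the Laplacian, a standard sharpness measure in OpenCV.
import cv2

def focus_score(frame):
    """Higher values indicate a sharper (better focused) image."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(grab_frame, step_motor, positions):
    """Scan the given focus positions and return the sharpest one."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        step_motor(pos)                    # hypothetical stage callback
        score = focus_score(grab_frame())  # hypothetical camera callback
        if score > best_score:
            best_pos, best_score = pos, score
    step_motor(best_pos)
    return best_pos
</pre>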
<p><b>The documentation</b> is comprehensive, non-proprietary and easy to access. And it's licensed to make sure it stays that way.</p>
<p><b>The community</b> of ‘makers’ is free to develop, modify and redistribute the documentation. OpenScope can evolve, improve and adapt to different needs.</p>