Team:Cambridge-JIC/TestHome (latest revision as of 02:18, 4 August 2015)

 
 
width: 256px;
height: 256px;
background: url(//2015.igem.org/wiki/images/b/be/CamJIC-Downarrow.png);
z-index: 1;
position: relative;
}
#sidebar {
    position: fixed;
    top: 130px;
    right: 20px;
    width: 23px;
    z-index: 1;
}

.sidebar-item {
    width: 20px;
    height: 20px;
    float: left;
    border-radius: 50%;
    border: #555 solid;
    margin-bottom: 5px;
    cursor: pointer;
}
 
</style>
  
 
<script>
$(document).ready(function(){

    $(".downarrow").each(function(){
        if($(this).attr("id") !== "overlay") {
            $(this).on("click", function(){
                // Smooth-scroll to the top of the next section down the page.
                $("html, body").animate({ scrollTop: $(this).parents("section").next().offset().top }, 1000);
            });
            $(this).css("cursor", "pointer");
        }
    });

    $(window).on("resize", function(){
        // jQuery's offset() only reports top/left, so the sidebar's distance
        // from the right edge of the viewport has to be derived manually.
        var $col = $(".cam-container section:first-of-type div");
        var colRight = $(window).width() - ($col.offset().left + $col.outerWidth());
        $("#sidebar").css("right", (colRight - 30) + "px");

        // Show the scroll hints only while the expanded navbar is visible,
        // i.e. hide them on small/mobile screens.
        if($(".navbar-collapse").is(":visible")) {
            $(".downarrow").show();
            $("#sidebar").show();
        } else {
            $(".downarrow").hide();
            $("#sidebar").hide();
        }
    }).resize();

    // Build one navigation dot per section (the footer gets none);
    // the first dot starts highlighted.
    var ix = 0;
    $("section").each(function(){
        if($(this).attr("id") != "footer-sec") {
            $("#sidebar").append("<div onclick=\"scrollsection("+ix+")\" class=\"sidebar-item\""+(ix==0?" style=\"background: #555\"":"")+"></div>");
        }
        ix++;
    });

    // Highlight the dot belonging to the section currently scrolled into view.
    $(window).on("scroll", function(){
        var i = 0;
        $("section").each(function(){
            if($(this).offset().top <= $(window).scrollTop()) {
                $(".sidebar-item").css("background", "none");
                $($(".sidebar-item").get(i)).css("background", "#555");
            }
            i++;
        });
    });
 
$(window).on("scroll", function(){
 
$(window).on("scroll", function(){
  
 
$('.downarrow').each(function(){
 
$('.downarrow').each(function(){
var position = $(this).offset().top - $(window).scrollTop();
+
var position = $(this).offset().top +$(this).height()/2 - $(window).scrollTop();
 
$(this).css("opacity", position/$(window).height())
 
$(this).css("opacity", position/$(window).height())
 
})
 
})
Line 20: Line 86:
 
})
 
})
  
$(document).ready(function(){
    // Size and place the overlay guide so it exactly covers the intro section.
    $("#overlay").width($("#intro").width());
    $("#overlay").height($("#intro").height());
    $("#overlay").css({
        position: "absolute",
        top: $("#intro").position().top,
        left: $("#intro").position().left
    });
    // todo: remove on mobile
});

// Scroll to the section with the given index (used by the sidebar navigation dots).
function scrollsection(id) {
    $("html, body").animate({ scrollTop: $($("section").get(id)).offset().top }, 1000);
}
 
</script>

<div id="sidebar"></div>
  
         <div class="downarrow" id="overlay" style="background:url(//2015.igem.org/wiki/images/6/60/CamJIC-Overlay_guide.png); background-size: 100% 100%;" />
+
         <div class="downarrow" id="overlay" style="background:url(//2015.igem.org/wiki/images/6/60/CamJIC-Overlay_guide.png); background-size: 100% 100%;width:100%;height:27%;position:absolute;z-index:0"></div>
<section id="intro" style="background-color: #fff">
+
<section id="intro" style="background-color: #fff; padding-top:0">
 
     <div class="slide" style="background-image: url(//2015.igem.org/wiki/images/f/f8/CamJIC-Panel-Main.png)" data-mobimg="url(//2015.igem.org/wiki/images/f/f8/CamJIC-Logo2.png)">
 
     <div class="slide" style="background-image: url(//2015.igem.org/wiki/images/f/f8/CamJIC-Panel-Main.png)" data-mobimg="url(//2015.igem.org/wiki/images/f/f8/CamJIC-Logo2.png)">
 
         <div style="width: 40%; margin: 450px 530px;"></div>
 
         <div style="width: 40%; margin: 450px 530px;"></div>
        <div style="width: 78%; margin: 50px; font-size: 20px;">
            <h2>Abstract</h2>
            <p>Fluorescence microscopy has become a ubiquitous part of biological research and synthetic biology, but the hardware is often <span class="hl_1">large and prohibitively expensive</span>. This is particularly true for labs with small budgets, including those in the DIY Bio community and in developing countries. Queuing systems imposed in labs that share a few expensive microscopes can make research even more <span class="hl_1">laborious and time-consuming</span> than it needs to be. Such sharing also makes it almost impossible to perform time-lapse imaging, or imaging in environments such as inside an incubator or a fume hood.</p>
            <p>We aim to provide a <span class="hl_1">well documented, physically compact, easily modifiable and high quality fluorescence microscope</span> that addresses all of these problems. We are designing it in a modular fashion, so that it can be used standalone or be incorporated into larger frameworks, with various pluggable stages.</p>
  
  
        <div style="padding-right: 50px; padding-left: 170px; padding-top: 60px; font-size: 20px;" class="padleft">
 
            <p>The mechanics of the microscope will be 3D printable, and all other parts will be cheap and accessible. The printed parts introduce a novel method (developed by Dr Richard Bowman, Cambridge) for <span class="hl_2">precise positioning and control</span> which exploits their flexibility. The microscope will also use the Cambridge-developed Raspberry Pi board and its camera module for image capture. Ultimately we are aiming for <span class="hl_2">4 micron resolution</span>, in both brightfield and fluorescence modes.</p>
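            <!-- Illustrative sketch only, not the team's implementation: if the
                 Raspberry Pi serves single JPEG frames over HTTP, a browser-side
                 live preview needs little more than re-requesting a frame on a
                 timer. The /camera/snapshot endpoint and the #preview <img>
                 element are hypothetical names. -->
            <script>
            // Poll a (hypothetical) single-frame JPEG endpoint into an <img>;
            // the timestamp query string defeats the browser cache between frames.
            function startPreview(imgSelector, snapshotUrl, intervalMs) {
                setInterval(function(){
                    $(imgSelector).attr("src", snapshotUrl + "?t=" + Date.now());
                }, intervalMs);
            }
            // e.g. startPreview("#preview", "http://raspberrypi.local/camera/snapshot", 500);
            </script>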
            <p>Furthermore, the software used to control commercial microscopes is largely focused on translating the physical experience of using a microscope onto a computer. We aim to leverage the full computational potential of a digital microscope, <span class="hl_2">carefully considering functional UX design</span> to allow control (locally and over a network) via a Google Maps-like interface, and implementing <span class="hl_2">background image processing</span>, <span class="hl_2">annotation</span> and <span class="hl_2">stitching</span>, as well as allowing <span class="hl_2">fully autonomous operation</span>. As a proof of principle, we are also developing automated screening systems on our microscope architecture.</p>
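            <!-- Illustrative sketch only: one way to get a Google Maps-like
                 pan/zoom view of a stitched scan is the Leaflet library with its
                 pixel-based CRS.Simple coordinate system. The #viewer element,
                 image URL and dimensions below are hypothetical, and Leaflet
                 itself would need to be loaded on the page. -->
            <script>
            function showStitchedScan(containerId, imageUrl, widthPx, heightPx) {
                // CRS.Simple treats coordinates as a flat pixel plane rather than
                // a map projection, which suits a stitched microscope image.
                var map = L.map(containerId, { crs: L.CRS.Simple, minZoom: -4 });
                var bounds = [[0, 0], [heightPx, widthPx]];   // Leaflet bounds are [y, x]
                L.imageOverlay(imageUrl, bounds).addTo(map);
                map.fitBounds(bounds);
            }
            // e.g. showStitchedScan("viewer", "/images/stitched-scan.png", 6000, 4000);
            </script>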
 
        </div>
    </div>
