Difference between revisions of "Team:Cambridge-JIC/TestHome"

 
z-index: 1;
position: relative;
}

#sidebar {
    position: fixed;
    top: 130px;
    right: 20px;
    width: 23px;
    z-index: 1;
}

.sidebar-item {
    width: 20px;
    height: 20px;
    float: left;
    border-radius: 50%;
    border: #555 solid;
    margin-bottom: 5px;
    cursor: pointer;
}
</style>
        $(this).css("cursor", "pointer")
    }
})

$(window).on("resize", function(){
    // Align the sidebar 30px inside the right edge of the content column.
    // (jQuery's .offset() has no "right" property, so position via "left".)
    var column = $(".cam-container section:first-of-type div").first();
    if (column.length) {
        $("#sidebar").css("left", column.offset().left + column.outerWidth() - 30);
    }
    if ($(".navbar-collapse").is(":visible")) {
        $(".downarrow").show()
        $("#sidebar").show()
    } else {
        $(".downarrow").hide()
        $("#sidebar").hide()
    }
}).resize();

// Build one sidebar dot per section, skipping the footer; the first dot
// starts highlighted.
var ix = 0;
$("section").each(function(){
    if ($(this).attr("id") != "footer-sec") {
        $("#sidebar").append("<div onclick=\"scrollsection("+ix+")\" class=\"sidebar-item\""+(ix==0?" style=\"background: #555\"":"")+"></div>")
    }
    ix++
})

// Highlight the dot for the section currently scrolled into view.
$(window).on("scroll", function(){
    var i = 0;
    $("section").each(function(){
        if ($(this).offset().top <= $(window).scrollTop()) {
            $(".sidebar-item").css("background", "none")
            $($(".sidebar-item").get(i)).css("background", "#555")
        }
        i++;
    })
})

});

// Smooth-scroll to the section whose sidebar dot was clicked.
function scrollsection(id) {
    $("html, body").animate({ scrollTop: $($("section").get(id)).offset().top }, 1000);
}
</script>

<div id="sidebar"></div>

<div class="downarrow" id="overlay" style="background:url(//2015.igem.org/wiki/images/6/60/CamJIC-Overlay_guide.png); background-size: 100% 100%;width:100%;height:27%;position:absolute;z-index:0"></div>

Latest revision as of 02:18, 4 August 2015

Abstract

Fluorescence microscopy has become a ubiquitous part of biological research and synthetic biology, but the hardware is often bulky and prohibitively expensive. This hits labs with small budgets hardest, including those in the DIY Bio community and in developing countries. Queuing for time on a handful of expensive shared microscopes makes research more laborious and time-consuming than it needs to be, and reliance on shared equipment also makes time-lapse imaging, or imaging in environments such as an incubator or a fume hood, almost impossible.

We aim to address all of these problems with a well-documented, physically compact, easily modifiable and high-quality fluorescence microscope. We are designing it in a modular fashion, so that it can be used standalone or incorporated into larger frameworks with various pluggable stages.

The mechanics of the microscope will be 3D-printable, and all other parts will be cheap and accessible. The printed parts incorporate a novel method for precise positioning and control (developed by Dr Richard Bowman, Cambridge) that exploits the flexibility of the printed plastic. The microscope will also use the Cambridge-developed Raspberry Pi board and camera module for image capture. Ultimately we are aiming for 4-micron resolution, in both brightfield and fluorescence modes.
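To put the 4-micron target in perspective, the Rayleigh criterion relates lateral resolution to the numerical aperture (NA) of the optics. The back-of-the-envelope check below is our illustration only (the 550 nm wavelength is an assumption, not a project specification); it shows how modest an NA the target demands:

```javascript
// Rayleigh criterion: minimum resolvable separation d = 0.61 * lambda / NA.
// Rearranged to give the NA needed for a target resolution.
function requiredNA(resolutionUm, wavelengthUm) {
  return 0.61 * wavelengthUm / resolutionUm;
}

// 4 micron target from the abstract, assuming ~550 nm (green) light.
const na = requiredNA(4, 0.55);
console.log(na.toFixed(3)); // prints "0.084"
```

An NA of roughly 0.08 is well within reach of simple, low-cost lenses, which is consistent with aiming for this resolution on a 3D-printed instrument.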

Furthermore, the software used to control commercial microscopes is largely focused on translating the physical experience of using a microscope onto a computer. We aim instead to leverage the full computational potential of a digital microscope. With careful attention to functional UX design, the instrument can be controlled, locally or over a network, through a Google Maps-like interface, with background image processing, annotation and stitching, as well as fully autonomous operation. As a proof of principle, we are also developing automated screening systems on our microscope architecture.
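Stitching adjacent fields of view comes down to estimating the offset at which neighbouring tiles overlap. The sketch below is our illustration, not the project's code: a 1-D signal stands in for an image row, and the best shift is found by minimising the mean squared difference over the overlap.

```javascript
// Find the shift of b relative to a (0..maxShift) that minimises the
// mean squared difference over the overlapping samples.
function bestShift(a, b, maxShift) {
  let best = 0, bestErr = Infinity;
  for (let s = 0; s <= maxShift; s++) {
    let err = 0, n = 0;
    for (let i = s; i < a.length && i - s < b.length; i++) {
      const d = a[i] - b[i - s];
      err += d * d;
      n++;
    }
    err /= n; // normalise so longer overlaps are not penalised
    if (err < bestErr) { bestErr = err; best = s; }
  }
  return best;
}

// b is a copy of a starting 3 samples in, so the true offset is 3:
const a = [0, 1, 4, 9, 16, 25, 36, 49];
const b = a.slice(3);
console.log(bestShift(a, b, 5)); // prints 3
```

Real stitching works on 2-D tiles and typically uses normalised cross-correlation (often via an FFT) rather than a brute-force search, but the principle of registering tiles by their overlap is the same.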