Tag Archives: biology

Growth Monitor pi: an open monitoring system for plant science

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/growth-monitor-pi-an-open-monitoring-system-for-plant-science/

Plant scientists and agronomists use growth chambers to provide consistent growing conditions for the plants they study. This reduces confounding variables – inconsistent temperature or light levels, for example – that could render the results of their experiments less meaningful. To make sure that conditions really are consistent both within and between growth chambers, which minimises experimental bias and ensures that experiments are reproducible, it’s helpful to monitor and record environmental variables in the chambers.

A neat grid of small leafy plants on a black plastic tray. Metal housing and tubing is visible to the sides.

Arabidopsis thaliana in a growth chamber on the International Space Station. Many experimental plants are less well monitored than these ones.
(“Arabidopsis thaliana plants […]” by Rawpixel Ltd (original by NASA) / CC BY 2.0)

In a recent paper in Applications in Plant Sciences, Brandin Grindstaff and colleagues at the universities of Missouri and Arizona describe how they developed Growth Monitor pi, or GMpi: an affordable growth chamber monitor that provides wider functionality than other devices. As well as sensing growth conditions, it sends the gathered data to cloud storage, captures images, and generates alerts to inform scientists when conditions drift outside of an acceptable range.

The authors emphasise – and we heartily agree – that you don’t need expertise with software and computing to build, use, and adapt a system like this. They’ve written a detailed protocol and made available all the necessary software for any researcher to build GMpi, and they note that commercial solutions with similar functionality range in price from $10,000 to $1,000,000 – something of an incentive to give the DIY approach a go.

GMpi uses a Raspberry Pi Model 3B+, to which are connected temperature-humidity and light sensors from our friends at Adafruit, as well as a Raspberry Pi Camera Module.

The team used the open-source app Rclone to upload sensor data to a cloud service, choosing Google Drive since it’s available for free. To alert users when growing conditions fall outside a set range, they use Slack’s Incoming Webhooks app to generate notifications in a Slack channel. Sensor operation, data gathering, and remote monitoring are supported by a combination of software that’s available for free from the open-source community and software the authors developed themselves. Their package GMPi_Pack is available on GitHub.
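The alerting logic is straightforward to sketch: compare each reading against an acceptable range, and post to a Slack incoming webhook when something drifts. Here's a minimal illustration in Python — the ranges, field names, and helper functions are hypothetical, not taken from GMPi_Pack:

```python
import json
import urllib.request

# Hypothetical acceptable ranges for the chamber (not GMpi's actual values)
RANGES = {"temperature_c": (20.0, 26.0), "humidity_pct": (50.0, 70.0)}

def out_of_range(reading):
    """Names of any monitored variables that have drifted outside their range."""
    return [name for name, (lo, hi) in RANGES.items()
            if name in reading and not lo <= reading[name] <= hi]

def slack_alert(webhook_url, message):
    """Post an alert message to a Slack channel via an incoming webhook."""
    payload = json.dumps({"text": message}).encode()
    req = urllib.request.Request(webhook_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

reading = {"temperature_c": 28.3, "humidity_pct": 61.0}  # would come from the sensors
drifted = out_of_range(reading)
if drifted:
    print("ALERT: out of range:", ", ".join(drifted))
    # slack_alert("https://hooks.slack.com/services/...", "Chamber alert!")
```

In the real system the reading would come from the Adafruit temperature-humidity and light sensors, and the check would run on a schedule alongside the Rclone upload.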

With a bill of materials amounting to something in the region of $200, GMpi is another excellent example of affordable, accessible, customisable open labware that’s available to researchers and students. If you want to find out how to build GMpi for your lab, or just for your greenhouse, Affordable remote monitoring of plant growth in facilities using Raspberry Pi computers by Brandin et al. is available on PubMed Central, and it includes appendices with clear and detailed set-up instructions for the whole system.

The post Growth Monitor pi: an open monitoring system for plant science appeared first on Raspberry Pi.

A low-cost, open-source, computer-assisted microscope

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/a-low-cost-open-source-computer-assisted-microscope/

Low-cost open labware is a good thing in the world, and I was particularly pleased when micropalaeontologist Martin Tetard got in touch about the Raspberry Pi-based microscope he is developing. The project is called microscoPI (what else?), and it can capture, process, and store images and image analysis results. Martin is engaged in climate research: he uses microscopy to study tiny fossil remains, from which he gleans information about the environmental conditions that prevailed in the far-distant past.

microscoPI: a microcomputer-assisted microscope

microscoPI is a project that aims to design a multipurpose, open-source, and inexpensive microcomputer-assisted microscope (based on a Raspberry Pi 3). This microscope can automatically take images, process them, and save them, together with the results of image analyses, on a flash drive. It is multipurpose, as it can be used on various kinds of images (e.g.

Martin repurposed an old microscope with a Z-axis adjustable stage for accurate focusing, and sourced an inexpensive X/Y movable stage to allow more accurate horizontal positioning of samples under the camera. He emptied the head of the scope to install a Raspberry Pi Camera Module, and he uses an M12 lens adapter to attach lenses suitable for single-specimen close-ups or for imaging several specimens at once. A Raspberry Pi 3B sits above the head of the microscope, and a 3.5-inch TFT touchscreen mounted on top of the Raspberry Pi allows the user to check images as they are captured and processed.

The Raspberry Pi runs our free operating system, Raspbian, and free image-processing software ImageJ. Martin and his colleagues use a number of plugins, some developed themselves and some by others, to support the specific requirements of their research. With this software, microscoPI can capture and analyse microfossil images automatically: it can count particles, including tiny specimens that are touching, analyse their shape and size, and save images and results before prompting the user for the name of the next sample.
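At its core, the count-particles step is connected-component labelling over a binary image. A toy pure-Python version gives the flavour — the actual ImageJ plugins are far more capable, for instance separating touching specimens with watershed-style segmentation, which this naive blob count cannot do:

```python
def count_particles(image):
    """Count 4-connected blobs of 1s in a binary image (list of lists)."""
    seen = set()

    def flood(r, c):
        # Iterative flood fill marking every pixel in one blob as seen
        stack = [(r, c)]
        while stack:
            y, x = stack.pop()
            if (y, x) in seen or not (0 <= y < len(image) and 0 <= x < len(image[0])):
                continue
            if image[y][x] == 0:
                continue
            seen.add((y, x))
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    count = 0
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if value and (r, c) not in seen:
                count += 1
                flood(r, c)
    return count

image = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1]]
print(count_particles(image))  # → 2
```

In microscoPI itself, this kind of analysis runs in ImageJ on thresholded camera captures rather than on hand-written arrays.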

microscoPI is compact – less than 30cm in height – and it’s powered by a battery bank secured under the base of the microscope, so it’s easily portable. The entire build comes in at under 160 Euros. You can find out more, and get in touch with Martin, on the microscoPI website.

The post A low-cost, open-source, computer-assisted microscope appeared first on Raspberry Pi.

Saving biologists’ time with Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/saving-biologists-time-with-raspberry-pi/

In an effort to save themselves and fellow biologists hours of time each week, Team IoHeat are currently prototyping a device that allows solutions to be heated while they are still in cold storage.

The IoHeat team didn’t provide any photos with their project writeup, so here’s a picture of a bored biologist that I found online

Saving time in the lab

As they explain in their prototype write-up:

As scientists working with living organisms (from single cells to tissue samples), we are often required to return to work outside of normal hours to maintain our specimens. In many cases, the compounds and solutions we are using in our line of work are stored at 4°C and need to reach 37°C before they can be used. So far, in order to do this we need to return to our workplace early, incubate our solutions at 37°C for 1–2h, depending on the required volume, and then use them in processes that often take a few minutes. It is clear that there is a lot of room here to improve our efficiency.

Controlling temperatures with Raspberry Pi

These hours wasted on waiting for solutions to heat up could be better spent elsewhere, so the team is building a Raspberry Pi–powered device that will allow them to control the heating process remotely.

We are aiming to build a small incubator that we can store in a cold room/fridge, and that can be activated remotely to warm up to a defined temperature. This incubator will enable us to safely store our reagents at low temperature and warm them up remotely before we need to use them, saving an estimated 12h per week per user.

This is a great project idea, and they’ve already prototyped it using a Raspberry Pi, heating element, and fan. Temperature and humidity sensors connected to the Raspberry Pi monitor conditions inside the incubator, and the prototype can be controlled via Telegram.
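The control problem here is a classic bang-bang thermostat: switch the heating element on below the target temperature, off above it, with a little hysteresis so the relay doesn't chatter around the setpoint. A logic-only sketch — the target and deadband values are illustrative, as the write-up doesn't specify them:

```python
TARGET_C = 37.0    # the solutions need to reach 37°C before use
HYSTERESIS = 0.5   # degrees of slack to avoid rapid on/off switching

def heater_should_run(current_c, heater_on, target=TARGET_C, slack=HYSTERESIS):
    """Simple bang-bang thermostat with hysteresis."""
    if current_c < target - slack:
        return True
    if current_c > target + slack:
        return False
    return heater_on  # inside the deadband: keep the current state

print(heater_should_run(4.0, False))   # → True: fridge-cold, start heating
print(heater_should_run(37.6, True))   # → False: warm enough, stop
```

In the prototype, the decision would drive the heating element and fan via GPIO, with the temperature readings coming from the sensors inside the incubator.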

Find out more about the project on Hackster.

We’ve got more than one biologist on the Raspberry Pi staff, so we have a personal appreciation for the effort behind this project, and we look forward to seeing how IoHeat progresses in the future.

The post Saving biologists’ time with Raspberry Pi appeared first on Raspberry Pi.

Friday Squid Blogging: Do Cephalopods Contain Alien DNA?

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2018/06/friday_squid_bl_627.html

Maybe not DNA, but biological somethings.

“Cause of Cambrian explosion — Terrestrial or Cosmic?”:

Abstract: We review the salient evidence consistent with or predicted by the Hoyle-Wickramasinghe (H-W) thesis of Cometary (Cosmic) Biology. Much of this physical and biological evidence is multifactorial. One particular focus are the recent studies which date the emergence of the complex retroviruses of vertebrate lines at or just before the Cambrian Explosion of ~500 Ma. Such viruses are known to be plausibly associated with major evolutionary genomic processes. We believe this coincidence is not fortuitous but is consistent with a key prediction of H-W theory whereby major extinction-diversification evolutionary boundaries coincide with virus-bearing cometary-bolide bombardment events. A second focus is the remarkable evolution of intelligent complexity (Cephalopods) culminating in the emergence of the Octopus. A third focus concerns the micro-organism fossil evidence contained within meteorites as well as the detection in the upper atmosphere of apparent incoming life-bearing particles from space. In our view the totality of the multifactorial data and critical analyses assembled by Fred Hoyle, Chandra Wickramasinghe and their many colleagues since the 1960s leads to a very plausible conclusion — life may have been seeded here on Earth by life-bearing comets as soon as conditions on Earth allowed it to flourish (about or just before 4.1 Billion years ago); and living organisms such as space-resistant and space-hardy bacteria, viruses, more complex eukaryotic cells, fertilised ova and seeds have been continuously delivered ever since to Earth so being one important driver of further terrestrial evolution which has resulted in considerable genetic diversity and which has led to the emergence of mankind.

Two commentaries.

This is almost certainly not true.

As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.

Read my blog posting guidelines here.

Recording lost seconds with the Augenblick blink camera

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/augenblick-camera/

Warning: a GIF used in today’s blog contains flashing images.

Students at the University of Bremen, Germany, have built a wearable camera that records the seconds of vision lost when you blink. Augenblick uses a Raspberry Pi Zero and Camera Module alongside muscle sensors to record footage whenever you close your eyes, producing a rather disjointed film of the sights you miss out on.

Augenblick blink camera recording using a Raspberry Pi Zero

Blink and you’ll miss it

The average person blinks up to five times a minute, with each blink lasting 0.5 to 0.8 seconds. These half-seconds add up to about 30 minutes a day. What sights are we losing during these minutes? That is the question asked by students Manasse Pinsuwan and René Henrich when they set out to design Augenblick.

Blinking is a highly invasive mechanism for our eyesight. Every day we close our eyes thousands of times without noticing it. Our mind manages to never let us wonder what exactly happens in the moments that we miss.

Capturing lost moments

For Augenblick, the wearer sticks MyoWare Muscle Sensor pads to their face, and these detect the electrical impulses that trigger blinking.

Augenblick blink camera recording using a Raspberry Pi Zero

Two pads are applied over the orbicularis oculi muscle that forms a ring around the eye socket, while the third pad is attached to the cheek as a neutral point.

Biology fact: there are two muscles responsible for blinking. The orbicularis oculi muscle closes the eye, while the levator palpebrae superioris muscle opens it — and yes, they both sound like the names of Harry Potter spells.

The sensor is read 25 times a second. Whenever it detects that the orbicularis oculi is active, the Camera Module records video footage.
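In outline, the sampling loop reduces to thresholding the muscle-sensor signal at 25 Hz. A logic-only sketch — the threshold value and function names are illustrative, not taken from the students' code:

```python
SAMPLE_HZ = 25    # the sensor is polled 25 times per second
THRESHOLD = 0.6   # normalised level treated as "orbicularis oculi active" (illustrative)

def recording_states(samples, threshold=THRESHOLD):
    """Map a stream of normalised sensor readings to camera on/off states."""
    return [sample >= threshold for sample in samples]

# Simulated readings at 25 Hz: eye open, a blink, eye open again
print(recording_states([0.1, 0.2, 0.8, 0.9, 0.2]))  # [False, False, True, True, False]
```

On the device itself, each transition to the active state would start the Camera Module recording, and the transition back would stop it.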

Augenblick blink recording using a Raspberry Pi Zero

Pressing a button on the side of the Augenblick glasses sets the code running. An LED lights up whenever the camera is recording; it also serves to confirm the correct placement of the sensor pads.

Augenblick blink camera recording using a Raspberry Pi Zero

The Pi Zero saves the footage so that it can be stitched together later to form a continuous, if disjointed, film.

Learn more about the Augenblick blink camera

You can find more information on the conception, design, and build process of Augenblick here in German, with a shorter explanation including lots of photos here in English.

And if you’re keen to recreate this project, our free project resource for a wearable Pi Zero time-lapse camera will come in handy as a starting point.

The post Recording lost seconds with the Augenblick blink camera appeared first on Raspberry Pi.

Amazon Transcribe Now Generally Available

Post Syndicated from Randall Hunt original https://aws.amazon.com/blogs/aws/amazon-transcribe-now-generally-available/

At AWS re:Invent 2017 we launched Amazon Transcribe in private preview. Today we’re excited to make Amazon Transcribe generally available for all developers. Amazon Transcribe is an automatic speech recognition (ASR) service that makes it easy for developers to add speech-to-text capabilities to their applications. We’ve iterated on customer feedback in the preview to make a number of enhancements to Amazon Transcribe.

New Amazon Transcribe Features in GA

To start off, we’ve made the SampleRate parameter optional, which means you only need to know the file type of your media and the input language. We’ve added two new features: the ability to differentiate multiple speakers in the audio to provide more intelligible transcripts (“who spoke when”), and a custom vocabulary to improve the accuracy of speech recognition for product names, industry-specific terminology, or names of individuals. To refresh our memories on how Amazon Transcribe works, let’s look at a quick example. I’ll convert this audio in my S3 bucket.

import boto3

transcribe = boto3.client("transcribe")
transcribe.start_transcription_job(
    TranscriptionJobName="reinvent",
    LanguageCode="en-US",
    MediaFormat="mp3",
    Media={"MediaFileUri": "https://s3.amazonaws.com/randhunt-transcribe-demo-us-east-1/out.mp3"},
    Settings={"ShowSpeakerLabels": True, "MaxSpeakerLabels": 2},
)

This will output JSON similar to this (I’ve stripped out most of the response) with individual speakers identified:

{
  "jobName": "reinvent",
  "accountId": "1234",
  "results": {
    "transcripts": [
      {
        "transcript": "Hi, everybody, i'm randall ..."
      }
    ],
    "speaker_labels": {
      "speakers": 2,
      "segments": [
        {
          "start_time": "0.000000",
          "speaker_label": "spk_0",
          "end_time": "0.010",
          "items": []
        },
        {
          "start_time": "0.010000",
          "speaker_label": "spk_1",
          "end_time": "4.990",
          "items": [
            {
              "start_time": "1.000",
              "speaker_label": "spk_1",
              "end_time": "1.190"
            },
            {
              "start_time": "1.190",
              "speaker_label": "spk_1",
              "end_time": "1.700"
            }
          ]
        }
      ]
    },
    "items": [
      {
        "start_time": "1.000",
        "end_time": "1.190",
        "alternatives": [
          {
            "confidence": "0.9971",
            "content": "Hi"
          }
        ],
        "type": "pronunciation"
      },
      {
        "alternatives": [
          {
            "content": ","
          }
        ],
        "type": "punctuation"
      },
      {
        "start_time": "1.190",
        "end_time": "1.700",
        "alternatives": [
          {
            "confidence": "1.0000",
            "content": "everybody"
          }
        ],
        "type": "pronunciation"
      }
    ]
  },
  "status": "COMPLETED"
}
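Pulling the “who spoke when” information back out of a response like this takes only a few lines of Python. A sketch against a trimmed copy of the structure above — the field names match the response, but the parsing helper is my own:

```python
import json

# A trimmed Transcribe response, matching the shape shown above
response = """{"results": {"speaker_labels": {"speakers": 2, "segments": [
  {"start_time": "0.010000", "speaker_label": "spk_1", "end_time": "4.990", "items": []}
]}}}"""

def speaker_turns(doc):
    """Return (speaker, start, end) tuples from a Transcribe speaker_labels block."""
    segments = json.loads(doc)["results"]["speaker_labels"]["segments"]
    return [(s["speaker_label"], float(s["start_time"]), float(s["end_time"]))
            for s in segments]

print(speaker_turns(response))  # [('spk_1', 0.01, 4.99)]
```

Joining these turns against the word-level "items" list would give per-speaker transcripts.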

Custom Vocabulary

Now if I needed to have a more complex technical discussion with a colleague I could create a custom vocabulary. A custom vocabulary is specified as an array of strings passed to the CreateVocabulary API, and you can include your custom vocabulary in a transcription job by passing in the name as part of the Settings in a StartTranscriptionJob API call. An individual vocabulary can be as large as 50KB and each phrase must be less than 256 characters. If I wanted to transcribe the recordings of my high school AP Biology class I could create a custom vocabulary in Python like this:

import boto3

transcribe = boto3.client("transcribe")
transcribe.create_vocabulary(
    VocabularyName="APBiology",
    LanguageCode="en-US",
    Phrases=["mitochondria", "cytokinesis", "phospholipid"],  # example phrases
)

I can refer to this vocabulary later on by the name APBiology and update it programmatically based on any errors I find in the transcriptions.

Available Now

Amazon Transcribe is available now in US East (N. Virginia), US West (Oregon), US East (Ohio) and EU (Ireland). Transcribe’s free tier gives you 60 minutes of transcription for free per month for the first 12 months with a pay-as-you-go model of $0.0004 per second of transcribed audio after that, with a minimum charge of 15 seconds.

When combined with other tools and services, I think Transcribe opens up entirely new opportunities for application development. I’m excited to see what technologies developers build with this new service.


[$] DIY biology

Post Syndicated from jake original https://lwn.net/Articles/747187/rss

A scientist with a rather unusual name, Meow-Ludo Meow-Meow, gave a talk at
linux.conf.au 2018
about the current trends in “do it yourself” (DIY) biology or
“biohacking”. He is perhaps most famous for being
prosecuted for implanting an Opal card RFID chip
into his hand; the
Opal card is used for public transportation fares in Sydney. He gave more
details about his implant as well as describing some other biohacking
projects in an engaging presentation.

Astro Pi celebrates anniversary of ISS Columbus module

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/astro-pi-celebrates-anniversary/

Right now, 400km above the Earth aboard the International Space Station, are two very special Raspberry Pi computers. They were launched into space on 6 December 2015 and are, most assuredly, the farthest-travelled Raspberry Pi computers in existence. Each year they run experiments that school students create in the European Astro Pi Challenge.

Raspberry Astro Pi units on the International Space Station

Left: Astro Pi Vis (Ed); right: Astro Pi IR (Izzy). Image credit: ESA.

The European Columbus module

Today marks the tenth anniversary of the launch of the European Columbus module. The Columbus module is the European Space Agency’s largest single contribution to the ISS, and it supports research in many scientific disciplines, from astrobiology and solar science to metallurgy and psychology. More than 225 experiments have been carried out inside it during the past decade. It’s also home to our Astro Pi computers.

Here’s a video from 7 February 2008, when Space Shuttle Atlantis went skywards carrying the Columbus module in its cargo bay.

STS-122 Launch NASA TV Coverage

NASA TV coverage of the STS-122 launch on 7 February 2008, when Atlantis lifted off at 2:45:30 p.m. ET carrying the Columbus module; coverage begins one hour before launch.

Today, coincidentally, is also the deadline for the European Astro Pi Challenge: Mission Space Lab. Participating teams have until midnight tonight to submit their experiments.

Anniversary celebrations

At 16:30 GMT today there will be a live event on NASA TV for the Columbus module anniversary with NASA flight engineers Joe Acaba and Mark Vande Hei.

Our Astro Pi computers will be joining in the celebrations by displaying a digital birthday candle that the crew can blow out. It works by detecting an increase in humidity when someone blows on it. The video below demonstrates the concept.

AstroPi candle

Uploaded by Effi Edmonton on 2018-01-17.

Do try this at home

The exact Astro Pi code that will run on the ISS today is available for you to download and run on your own Raspberry Pi and Sense HAT. You’ll notice that the program includes code to make it stop automatically when the date changes to 8 February. This is just to save time for the ground control team.

If you have a Raspberry Pi and a Sense HAT, you can use the terminal commands below to download and run the code yourself:

wget http://rpf.io/colbday -O birthday.py
chmod +x birthday.py
./birthday.py

When you see a blank blue screen with the brightness increasing, the Sense HAT is measuring the baseline humidity. It does this every 15 minutes so it can recalibrate to take account of natural changes in background humidity. A humidity increase of 2% is needed to blow out the candle, so if the background humidity changes by more than 2% in 15 minutes, it’s possible to get a false positive. Press Ctrl + C to quit.
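The trigger logic is easy to reproduce: keep a humidity baseline and call the candle blown out when the current reading rises far enough above it. A logic-only sketch — the function and variable names are mine, and the real program also drives the Sense HAT's LED matrix and re-measures the baseline every 15 minutes:

```python
BLOW_THRESHOLD = 2.0        # % humidity rise that counts as a "blow"
RECALIBRATE_SECS = 15 * 60  # the baseline is re-measured every 15 minutes

def candle_blown(baseline_pct, current_pct, threshold=BLOW_THRESHOLD):
    """True when humidity has risen enough above the baseline to blow out the candle."""
    return current_pct - baseline_pct >= threshold

print(candle_blown(45.0, 47.5))  # → True: a 2.5% rise blows out the candle
print(candle_blown(45.0, 46.0))  # → False: a 1% rise is below the threshold
```

This also makes the false-positive case in the paragraph above concrete: if background humidity itself climbs more than 2% within one 15-minute recalibration window, the candle goes out on its own.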

Please tweet pictures of your candles to @astro_pi – we might share yours! And if we’re lucky, we might catch a glimpse of the candle on the ISS during the NASA TV event at 16:30 GMT today.

The post Astro Pi celebrates anniversary of ISS Columbus module appeared first on Raspberry Pi.

Virtual Forest

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/virtual-forest/

The RICOH THETA S is a fairly affordable consumer 360° camera, which allows users to capture interesting locations and events for viewing through VR headsets and mobile-equipped Google Cardboard. When set up alongside a Raspberry Pi acting as a controller, plus a protective bubble, various cables, and good ol’ Mother Nature, the camera becomes a gateway to a serene escape from city life.

Virtual Forest

Ecologist Koen Hufkens, from the Richardson Lab in the Department of Organismic and Evolutionary Biology at Harvard University, decided to do exactly that, creating the Virtual Forest with the aim of “showing people how the forest changes throughout the seasons…and the beauty of the forest”.

The camera takes a still photograph every 15 minutes, uploading it for our viewing pleasure. The setup currently only supports daylight viewing, as the camera is not equipped for night vision, so check your watch first.

one autumn day

360 view of a day in a North-Eastern Hardwood forest during autumn

The build cost somewhere in the region of $500; Hufkens provides a complete ingredients list here, with supporting code on GitHub. He also aims to improve the setup by using the new Nikon KeyMission, which can record video at 4K ultra-HD resolution.

The Virtual Forest has been placed deep within the heart of Harvard Forest, a university-owned plot of land used both by researchers and by the general public. If you live nearby, you could go look at it and possibly even appear in a photo. Please resist the urge to photobomb, though, because that would totally defeat the peopleless zen tranquility that we’re feeling here in Pi Towers.

The post Virtual Forest appeared first on Raspberry Pi.

Detecting landmines – with spinach

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/detecting-landmines-with-spinach/

Forget sniffer dogs…we need to talk about spinach.

The team at MIT (Massachusetts Institute of Technology) have been working to transform spinach plants into a means of detection in the fight against buried munitions such as landmines.

Plant-to-human communication

MIT engineers have transformed spinach plants into sensors that can detect explosives and wirelessly relay that information to a handheld device similar to a smartphone. (Learn more: http://news.mit.edu/2016/nanobionic-spinach-plants-detect-explosives-1031)

Nanoparticles, plus tiny tubes called carbon nanotubes, are embedded into the spinach leaves where they pick up nitro-aromatics, chemicals found in the hidden munitions.

It takes the spinach approximately ten minutes to absorb water from the ground, including the nitro-aromatics, which then bind to the polymer material wrapped around the nanotube.

But where does the Pi come into this?

The MIT team shine a laser onto the leaves, detecting the altered fluorescence of the light emitted by the newly bonded tubes. This light is then read by a Raspberry Pi fitted with an infrared camera, resulting in a precise map of where hidden landmines are located. This signal can currently be picked up within a one-mile radius, with plans to increase the reach in future.

detecting landmines with spinach

You can also physically hack a smartphone to replace the Raspberry Pi… but why would you want to do that?

The team at MIT have already used the tech to detect hydrogen peroxide, TNT, and sarin, while co-author Prof. Michael Strano advises that the same setup can be used to detect “virtually anything”.

“The plants could be used for defence applications, but also to monitor public spaces for terrorism-related activities, since we show both water and airborne detection.”

More information on the paper can be found at the MIT website.

The post Detecting landmines – with spinach appeared first on Raspberry Pi.