Tag Archives: film

Recording lost seconds with the Augenblick blink camera

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/augenblick-camera/

Warning: a GIF used in today’s blog contains flashing images.

Students at the University of Bremen, Germany, have built a wearable camera that records the seconds of vision lost when you blink. Augenblick uses a Raspberry Pi Zero and Camera Module alongside muscle sensors to record footage whenever you close your eyes, producing a rather disjointed film of the sights you miss out on.

Augenblick blink camera recording using a Raspberry Pi Zero

Blink and you’ll miss it

The average person blinks up to five times a minute, with each blink lasting 0.5 to 0.8 seconds. These half-seconds add up to about 30 minutes a day. What sights are we losing during these minutes? That is the question asked by students Manasse Pinsuwan and René Henrich when they set out to design Augenblick.

Blinking is a highly invasive mechanism for our eyesight. Every day we close our eyes thousands of times without noticing it. Our mind manages to never let us wonder what exactly happens in the moments that we miss.

Capturing lost moments

For Augenblick, the wearer sticks MyoWare Muscle Sensor pads to their face, and these detect the electrical impulses that trigger blinking.

Augenblick blink camera recording using a Raspberry Pi Zero

Two pads are applied over the orbicularis oculi muscle that forms a ring around the eye socket, while the third pad is attached to the cheek as a neutral point.

Biology fact: there are two muscles responsible for blinking. The orbicularis oculi muscle closes the eye, while the levator palpebrae superioris muscle opens it — and yes, they both sound like the names of Harry Potter spells.

The sensor is read 25 times a second. Whenever it detects that the orbicularis oculi is active, the Camera Module records video footage.
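
The students' own code isn't published in this post, but the trigger logic can be sketched in a few lines of Python. The sketch below assumes the MyoWare board's analogue output is read through an MCP3008 ADC and that the Camera Module is driven with the picamera library; the wiring, file paths, and activation threshold are placeholders to illustrate the idea rather than the project's actual values.

from time import sleep, time
from gpiozero import MCP3008
from picamera import PiCamera

SAMPLE_INTERVAL = 1 / 25   # the sensor is polled 25 times a second
THRESHOLD = 0.6            # hypothetical activation level; tune for your sensor placement

emg = MCP3008(channel=0)   # MyoWare analogue output read via an MCP3008 ADC on SPI
camera = PiCamera()

recording = False
while True:
    eye_closing = emg.value > THRESHOLD   # muscle impulse detected
    if eye_closing and not recording:
        camera.start_recording('/home/pi/blink-%d.h264' % int(time()))
        recording = True
    elif not eye_closing and recording:
        camera.stop_recording()           # eye open again, stop the clip
        recording = False
    sleep(SAMPLE_INTERVAL)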

Augenblick blink recording using a Raspberry Pi Zero

Pressing a button on the side of the Augenblick glasses sets the code running. An LED lights up whenever the camera is recording and also serves to confirm the correct placement of the sensor pads.

Augenblick blink camera recording using a Raspberry Pi Zero

The Pi Zero saves the footage so that it can be stitched together later to form a continuous, if disjointed, film.

Learn more about the Augenblick blink camera

You can find more information on the conception, design, and build process of Augenblick here in German, with a shorter explanation including lots of photos here in English.

And if you’re keen to recreate this project, our free project resource for a wearable Pi Zero time-lapse camera will come in handy as a starting point.

The post Recording lost seconds with the Augenblick blink camera appeared first on Raspberry Pi.

Austria: the regulator does not allow ORF its own YouTube channel or a new ORF film service

Post Syndicated from nellyo original https://nellyo.wordpress.com/2018/05/14/orf-3/

News from Austria on how the public broadcaster is supervised by the national regulator in fulfilling its public service remit.

The regulator has not approved two ORF proposals: an ORF channel of its own on YouTube, and a new service offering shows and films from ORF's programmes on a subscription basis.

Evidently the Austrian regulator plays an ex ante role with a broader scope, which works in the audience's favour. BNT (the Bulgarian public broadcaster), by contrast, is on YouTube; at least an "Official BNT channel" is visible there. And this is not the only illustration of the limited role of the Bulgarian regulator: it is unclear, for example, what is happening with the general-interest channel BNT2, nor can one read a monitoring report on how BNT fulfils its licences.

*

Here is what is happening in Austria:

According to a press release issued today, the Austrian media authority KommAustria has rejected the public broadcaster ORF's plans to create its own YouTube channel.

ORF had applied for permission to add a YouTube channel to its media activities. The channel was mainly to offer ORF programmes that, owing to legal restrictions, currently cannot be made available for more than 7 days via the catch-up service ORF TVthek.

KommAustria argues that an exclusive cooperation between ORF and YouTube would discriminate against other, comparable companies.

The regulator also takes existing services into account when approving new ORF services. KommAustria expects interest in ORF TVthek to decline if an ORF YouTube channel were created. It further argues that the overall period for which programmes are available on ORF TVthek could be extended beyond 7 days by revising the legal framework.

Also rejected was ORF's request to turn Flimmit into an Austrian Netflix-style service offering shows that have already been broadcast on its television channels or are intended for broadcast. ORF owns Flimmit through a subsidiary.

According to KommAustria, ORF is not in principle barred from offering a subscription service. In this particular case, however, the request was justified neither economically nor legally: for example, it remains "entirely unclear" how large the licence-fee share for this service would be; as I understand it, the proportion between subscription revenue and public funding has not been clarified.

ORF may appeal.

 

This is a really lovely Raspberry Pi tricorder

Post Syndicated from Helen Lynn original https://www.raspberrypi.org/blog/raspberry-pi-tricorder-prop/

At the moment I’m spending my evenings watching all of Star Trek in order. Yes, I have watched it before (but with some really big gaps). Yes, including the animated series (I’m up to The Terratin Incident). So I’m gratified to find this beautiful The Original Series–style tricorder build.

Star Trek Tricorder with Working Display!

At this year’s Replica Prop Forum showcase, we meet up once again with Brian Mix, who brought his new Star Trek TOS Tricorder. This beautiful replica captures the weight and finish of the filming hand prop, and Brian has taken it one step further with some modern-day electronics!

A what now?

If you don’t know what a tricorder is, which I guess is faintly possible, the easiest way I can explain is to steal words that Liz wrote when Recantha made one back in 2013. It’s “a made-up thing used by the crew of the Enterprise to measure stuff, store data, and scout ahead remotely when exploring strange new worlds, seeking out new life and new civilisations, and all that jazz.”

A brief history of Picorders

We’ve seen other Raspberry Pi–based realisations of this iconic device. Recantha’s LEGO-cased tricorder delivered some authentic functionality, including temperature sensors, an ultrasonic distance sensor, a photosensor, and a magnetometer. Michael Hahn’s tricorder for element14’s Sci-Fi Your Pi competition in 2015 packed some similar functions, along with Original Series audio effects, into a neat (albeit non-canon) enclosure.

Brian Mix’s Original Series tricorder

Brian Mix’s tricorder, seen in the video above from Tested at this year’s Replica Prop Forum showcase, is based on a high-quality kit into which, he discovered, a Raspberry Pi just fits. He explains that the kit is the work of the late Steve Horch, a special effects professional who provided props for later Star Trek series, including the classic Deep Space Nine episode Trials and Tribble-ations.

A still from an episode of Star Trek: Deep Space Nine: Jadzia Dax, holding an Original Series–style tricorder, speaks with Benjamin Sisko

Dax, equipped for time travel

This episode’s plot required sets and props — including tricorders — replicating the USS Enterprise of The Original Series, and Steve Horch provided many of these. Thus, a tricorder kit from him is about as close to authentic as you can possibly find unless you can get your hands on a screen-used prop. The Pi allows Brian to drive a real display and a speaker: “Being the geek that I am,” he explains, “I set it up to run every single Original Series Star Trek episode.”

Even more wonderful hypothetical tricorders that I would like someone to make

This tricorder is beautiful, and it makes me think how amazing it would be to squeeze in some of the sensor functionality of the devices depicted in the show. Space in the case is tight, but it looks like there might be a little bit of depth to spare — enough for an IMU, maybe, or a temperature sensor. I’m certain the future will bring more Pi tricorder builds, and I, for one, can’t wait. Please tell us in the comments if you’re planning something along these lines, and, well, I suppose some other sci-fi franchises have decent Pi project potential too, so we could probably stand to hear about those.

If you’re commenting, no spoilers please past The Animated Series S1 E11. Thanks.

The post This is a really lovely Raspberry Pi tricorder appeared first on Raspberry Pi.

Raspberry Pi in your favourite films and TV shows

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/raspberry-pi-films-tv/

If, like us, you’ve been bingeflixing your way through Netflix’s new show, Lost in Space, you may have noticed a Raspberry Pi being used as futuristic space tech.

Raspberry Pi Netflix Lost in Space

Danger, Will Robinson, that probably won’t work

This isn’t the first time a Pi has been used as a film or television prop. From Mr. Robot and Disney Pixar’s Big Hero 6 to Mr. Robot, Sense8, and Mr. Robot, our humble little computer has become quite the celeb.

Raspberry Pi Charlie Brooker Election Wipe
Raspberry Pi Big Hero 6
Raspberry Pi Netflix

Raspberry Pi Spy has been working hard to locate and document the appearances of the Raspberry Pi in some of our favourite shows and movies. He’s created this video covering 2012-2017:

Raspberry Pi TV and Film Appearances 2012-2017

Since 2012 the Raspberry Pi single board computer has appeared in a number of movies and TV shows. This video is a run through of those appearances where the Pi has been used as a prop.

For 2018 appearances and beyond, you can find a full list on the Raspberry Pi Spy website. If you’ve spotted an appearance that’s not on the list, tell us in the comments!

The post Raspberry Pi in your favourite films and TV shows appeared first on Raspberry Pi.

Converting a Kodak Box Brownie into a digital camera

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/kodak-brownie-camera/

In this article from The MagPi issue 69, David Crookes explains how Daniel Berrangé took an old Kodak Brownie from the 1950s and turned it into a quirky digital camera. Get your copy of The MagPi magazine in stores now, or download it as a free PDF here.

Daniel Berrangé Kodak Brownie Raspberry Pi Camera

The Kodak Box Brownie

When Kodak unveiled its Box Brownie in 1900, it did so with the slogan ‘You press the button, we do the rest.’ The words referred to the ease-of-use of what was the world’s first mass-produced camera. But it could equally apply to Daniel Berrangé’s philosophy when modifying it for the 21st century. “I wanted to use the Box Brownie’s shutter button to trigger image capture, and make it simple to use,” he tells us.

Daniel Berrangé Kodak Brownie Raspberry Pi Camera

Daniel’s project grew from a previous effort in which he placed a pinhole webcam inside a ladies’ powder compact case. “The Box Brownie project is essentially a repeat of that design but with a normal lens instead of a pinhole, a real camera case, and improved software to enable a shutter button. Ideally, it would look unchanged from when it was shooting film.”

Webcam woes

At first, Daniel looked for a cheap webcam, intending to spend no more than the price of a Pi Zero. This didn’t work out too well. “The low-light performance of the webcam was not sufficient to make a pinhole camera so I just decided to make a ‘normal’ digital camera instead,” he reveals.

To that end, he began removing some internal components from the Box Brownie. “With the original lens removed, the task was to position the webcam’s electronic light sensor (the CCD) and lens as close to the front of the camera as possible,” Daniel explains. “In the end, the CCD was about 15 mm away from the front aperture of the camera, giving a field of view that was approximately the same as the unmodified camera would achieve.”

Daniel Berrangé Kodak Brownie Raspberry Pi Camera
Daniel Berrangé Kodak Brownie Raspberry Pi Camera
Daniel Berrangé Kodak Brownie Raspberry Pi Camera

It was then time for him to insert the Raspberry Pi, upon which was a custom ‘init’ binary that loads a couple of kernel modules to run the webcam, mount the microSD file system, and launch the application binary. Here, Daniel found he was in luck. “I’d noticed that the size of a 620 film spool (63 mm) was effectively the same as the width of a Raspberry Pi Zero (65 mm), so it could be held in place between the film spool grips,” he recalls. “It was almost as if it was designed with this in mind.”

Shutter success

In order to operate the camera, Daniel had to work on the shutter button. “The Box Brownie’s shutter button is entirely mechanical, driven by a handful of levers and springs,” Daniel explains. “First, the Pi Zero needs to know when the shutter button is pressed and second, the physical shutter has to be open while the webcam is capturing the image. Rather than try to synchronise image capture with the fraction of a second that the physical shutter is open, a bit of electrical tape was used on the shutter mechanism to keep it permanently open.”

Daniel Berrangé Kodak Brownie Raspberry Pi Camera

Daniel made use of the Pi Zero’s GPIO pins to detect the pressing of the shutter button, reading each pin as either low or high. “My thought was that I could set a GPIO pin high to 5 V, and then use the action of the shutter button to short it to ground, and detect this change in level from software.”

This initially involved using a pair of bare wires and some conductive paint, although the paint was later replaced by a piece of tinfoil. But with the button pressed, the GPIO pin level goes to zero and the device constantly captures still images until the button is released. All that’s left to do is smile and take the perfect snap.
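
Daniel's own software isn't reproduced in the article, but the button-and-capture loop he describes might look something like this minimal Python sketch, assuming the gpiozero and picamera libraries; the GPIO pin number and output path are made up for illustration.

from itertools import count
from gpiozero import Button
from picamera import PiCamera

# The shutter mechanism shorts a GPIO pin (held high by its internal pull-up) to ground.
shutter = Button(17)        # BCM pin 17 is an assumption, not Daniel's actual wiring
camera = PiCamera()
frame = count()

while True:
    shutter.wait_for_press()
    # Capture stills continuously for as long as the button is held down.
    while shutter.is_pressed:
        camera.capture('/home/pi/brownie-%04d.jpg' % next(frame))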

The post Converting a Kodak Box Brownie into a digital camera appeared first on Raspberry Pi.

Using AWS Lambda and Amazon Comprehend for sentiment analysis

Post Syndicated from Chris Munns original https://aws.amazon.com/blogs/compute/using-aws-lambda-and-amazon-comprehend-for-sentiment-analysis/

This post courtesy of Giedrius Praspaliauskas, AWS Solutions Architect

Even with the best IVR systems, customers get frustrated. What if you knew that 10 callers in your Amazon Connect contact flow were likely to say “Agent!” in frustration in the next 30 seconds? Would you like to get to them before that happens? What if your bot was smart enough to admit, “I’m sorry this isn’t helping. Let me find someone for you.”?

In this post, I show you how to use AWS Lambda and Amazon Comprehend for sentiment analysis to make your Amazon Lex bots in Amazon Connect more sympathetic.

Setting up a Lambda function for sentiment analysis

There are multiple natural language and text processing frameworks or services available to use with Lambda, including but not limited to Amazon Comprehend, TextBlob, Pattern, and NLTK. Pick one based on the nature of your system:  the type of interaction, languages supported, and so on. For this post, I picked Amazon Comprehend, which uses natural language processing (NLP) to extract insights and relationships in text.

The walkthrough in this post is just an example. In a full-scale implementation, you would likely implement a more nuanced approach. For example, you could keep the overall sentiment score through the conversation and act only when it reaches a certain threshold. It is worth noting that this Lambda function is not called for missed utterances, so there may be a gap between what is being analyzed and what was actually said.

The Lambda function is straightforward. It analyses the inputTranscript field of the Amazon Lex event. Based on the overall sentiment value, it generates a response message with next-step instructions. When the sentiment is neutral, positive, or mixed, the response leaves it to Amazon Lex to decide what the next steps should be, and the function adds the overall sentiment value to the response as an additional session attribute, along with the slot values received as input.

When the overall sentiment is negative, the function returns the dialog action, pointing to an escalation intent (specified in the environment variable ESCALATION_INTENT_NAME) or returns the fulfillment closure action with a failure state when the intent is not specified. In addition to actions or intents, the function returns a message, or prompt, to be provided to the customer before taking the next step. Based on the returned action, Amazon Connect can select the appropriate next step in a contact flow.

For this walkthrough, you create a Lambda function using the AWS Management Console:

  1. Open the Lambda console.
  2. Choose Create Function.
  3. Choose Author from scratch (no blueprint).
  4. For Runtime, choose Python 3.6.
  5. For Role, choose Create a custom role. The custom execution role allows the function to detect sentiments, create a log group, stream log events, and store the log events.
  6. Enter the following values:
    • For Role Description, enter Lambda execution role permissions.
    • For IAM Role, choose Create an IAM role.
    • For Role Name, enter LexSentimentAnalysisLambdaRole.
    • For Policy, use the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Action": [
                "comprehend:DetectDominantLanguage",
                "comprehend:DetectSentiment"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
  7. Choose Create function.
  8. Copy/paste the following code into the editor window:
import os, boto3

ESCALATION_INTENT_MESSAGE="Seems that you are having troubles with our service. Would you like to be transferred to the associate?"
FULFILMENT_CLOSURE_MESSAGE="Seems that you are having troubles with our service. Let me transfer you to the associate."

escalation_intent_name = os.getenv('ESCALATION_INTENT_NAME', None)

client = boto3.client('comprehend')

def lambda_handler(event, context):
    sentiment=client.detect_sentiment(Text=event['inputTranscript'],LanguageCode='en')['Sentiment']
    if sentiment=='NEGATIVE':
        if escalation_intent_name:
            result = {
                "sessionAttributes": {
                    "sentiment": sentiment
                },
                "dialogAction": {
                    "type": "ConfirmIntent",
                    "message": {
                        "contentType": "PlainText",
                        "content": ESCALATION_INTENT_MESSAGE
                    },
                    "intentName": escalation_intent_name
                }
            }
        else:
            result = {
                "sessionAttributes": {
                    "sentiment": sentiment
                },
                "dialogAction": {
                    "type": "Close",
                    "fulfillmentState": "Failed",
                    "message": {
                            "contentType": "PlainText",
                            "content": FULFILMENT_CLOSURE_MESSAGE
                    }
                }
            }

    else:
        result = {
            "sessionAttributes": {
                "sentiment": sentiment
            },
            "dialogAction": {
                "type": "Delegate",
                "slots" : event["currentIntent"]["slots"]
            }
        }
    return result
  9. Below the code editor, specify the environment variable ESCALATION_INTENT_NAME with a value of Escalate.

  10. Click Save in the top right of the console.

Now you can test your function.

  1. Click Test at the top of the console.
  2. Configure a new test event using the following test event JSON:
{
  "messageVersion": "1.0",
  "invocationSource": "DialogCodeHook",
  "userId": "1234567890",
  "sessionAttributes": {},
  "bot": {
    "name": "BookSomething",
    "alias": "None",
    "version": "$LATEST"
  },
  "outputDialogMode": "Text",
  "currentIntent": {
    "name": "BookSomething",
    "slots": {
      "slot1": "None",
      "slot2": "None"
    },
    "confirmationStatus": "None"
  },
  "inputTranscript": "I want something"
}
  3. Click Create.
  4. Click Test on the console.

This message should return a response from Lambda with a sentiment session attribute of NEUTRAL.
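
Based on the function code above, the response for this neutral utterance should look roughly like the following (the slot values simply echo those in the test event):

{
    "sessionAttributes": {
        "sentiment": "NEUTRAL"
    },
    "dialogAction": {
        "type": "Delegate",
        "slots": {
            "slot1": "None",
            "slot2": "None"
        }
    }
}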

However, if you change the input to “This is garbage!”, Lambda changes the dialog action to the escalation intent specified in the environment variable ESCALATION_INTENT_NAME.

Setting up Amazon Lex

Now that you have your Lambda function running, it is time to create the Amazon Lex bot. Use the BookTrip sample bot and call it BookSomething. The IAM role is automatically created on your behalf. Indicate that this bot is not subject to COPPA (the Children’s Online Privacy Protection Act), and choose Create. A few minutes later, the bot is ready.

Make the following changes to the default configuration of the bot:

  1. Add an intent with no associated slots. Name it Escalate.
  2. Specify the Lambda function for initialization and validation in the existing two intents (“BookCar” and “BookHotel”), at the same time giving Amazon Lex permission to invoke it.
  3. Leave the other configuration settings as they are and save the intents.

You are ready to build and publish this bot. Set a new alias, BookSomethingWithSentimentAnalysis. When the build finishes, test it.

As you see, sentiment analysis works!

Setting up Amazon Connect

Next, provision an Amazon Connect instance.

After the instance is created, you need to integrate the Amazon Lex bot created in the previous step. For more information, see the Amazon Lex section in the Configuring Your Amazon Connect Instance topic.  You may also want to look at the excellent post by Randall Hunt, New – Amazon Connect and Amazon Lex Integration.

Create a new contact flow, “Sentiment analysis walkthrough”:

  1. Log in to the Amazon Connect instance.
  2. Choose Create contact flow, Create transfer to agent flow.
  3. Add a Get customer input block, open the icon in the top left corner, and specify your Amazon Lex bot and its intents.
  4. Select the Text to speech audio prompt type and enter text for Amazon Connect to play at the beginning of the dialog.
  5. Choose Amazon Lex, enter your Amazon Lex bot name and the alias.
  6. Specify the intents to be used as dialog branches that a customer can choose: BookHotel, BookCar, or Escalate.
  7. Add two Play prompt blocks and connect them to the customer input block.
    • If a hotel or car booking intent is returned from the bot flow, play the corresponding prompt (“OK, will book it for you”) and initiate booking (in this walkthrough, just hang up after the prompt).
    • However, if escalation intent is returned (caused by the sentiment analysis results in the bot), play the prompt (“OK, transferring to an agent”) and initiate the transfer.
  8. Save and publish the contact flow.

As a result, you have a contact flow with a single customer input step and a text-to-speech prompt that uses the Amazon Lex bot. You expect one of the three intents (BookHotel, BookCar, or Escalate) to be returned.

Edit the phone number to associate it with the contact flow that you just created. It is now ready for testing: call the phone number and check how your contact flow works.

Cleanup

Don’t forget to delete all the resources created during this walkthrough to avoid incurring any more costs:

  • Amazon Connect instance
  • Amazon Lex bot
  • Lambda function
  • IAM role LexSentimentAnalysisLambdaRole

Summary

In this walkthrough, you implemented sentiment analysis with a Lambda function. The function can be integrated into Amazon Lex and, as a result, into Amazon Connect. This approach gives you the flexibility to analyze user input and then act. You may find the following potential use cases of this approach to be of interest:

  • Extend the Lambda function to identify “hot” topics in the user input even if the sentiment is not negative and take action proactively. For example, switch to an escalation intent if a user mentioned “where is my order,” which may signal potential frustration.
  • Use Amazon Connect Streams to provide agents with the sentiment analysis results along with the call transfer, enabling service tailored to particular customer needs and sentiments.
  • Route calls to agents based on both skill set and sentiment.
  • Prioritize calls based on sentiment using multiple Amazon Connect queues instead of transferring directly to an agent.
  • Monitor quality and flag for review contact flows that result in high overall negative sentiment.
  • Implement sentiment and AI/ML based call analysis, such as a real-time recommendation engine. For more details, see Machine Learning on AWS.

If you have questions or suggestions, please comment below.

American Public Television Embraces the Cloud — And the Future

Post Syndicated from Andy Klein original https://www.backblaze.com/blog/american-public-television-embraces-the-cloud-and-the-future/

American Public Television website

American Public Television was like many organizations that have been around for a while. They were entrenched using an older technology — in their case, tape storage and distribution — that once met their needs but was limiting their productivity and preventing them from effectively collaborating with their many media partners. APT’s VP of Technology knew that he needed to move into the future and embrace cloud storage to keep APT ahead of the game.

Since 1961, American Public Television (APT) has been a leading distributor of groundbreaking, high-quality, top-rated programming to the nation’s public television stations. Gerry Field is the Vice President of Technology at APT and is responsible for delivering their extensive program catalog to 350+ public television stations nationwide.

In the time since Gerry  joined APT in 2007, the industry has been in digital overdrive. During that time APT has continued to acquire and distribute the best in public television programming to their technically diverse subscribers.

This created two challenges for Gerry. First, new technology and format proliferation were driving dramatic increases in digital storage. Second, many of APT’s subscribers struggled to keep up with the rapidly changing industry. While some subscribers had state-of-the-art satellite systems to receive programming, others had to wait for the post office to drop off programs recorded on tape weeks earlier. With no slowdown on the horizon of innovation in the industry, Gerry knew that his storage and distribution systems would reach a crossroads in no time at all.

American Public Television logo

Living the tape paradigm

The digital media industry is only a few years removed from its film, and later videotape, roots. Tape was the input and the output of the industry for many years. As a consequence, the tools and workflows used by the industry were built and designed to work with tape. Over time, the “file” slowly replaced the tape as the object to be captured, edited, stored and distributed. Trouble was, many of the systems and more importantly workflows were based on processing tape, and these have proven to be hard to change.

At APT, Gerry realized the limits of the tape paradigm and began looking for technologies and solutions that enabled workflows based on file and object based storage and distribution.

Thinking file based storage and distribution

For data (digital media) storage, APT, like everyone else, started by installing onsite storage servers. As the amount of digital data grew, more storage was added. In addition, APT was expanding its distribution footprint by creating or partnering with distribution channels such as CreateTV and APT Worldwide. This dramatically increased the number of programming formats and the amount of data that had to be stored. As a consequence, updating, maintaining, and managing the APT storage systems was becoming a major challenge and a major resource hog.

APT Online

Knowing that his in-house storage system was only going to cost more time and money, Gerry decided it was time to look at cloud storage. But that wasn’t the only reason he looked at the cloud. While most people think of cloud storage as just a place to back up and archive files, Gerry was envisioning how the ubiquity of the cloud could help solve his distribution challenges. The trouble was that the price of cloud storage from services like Amazon S3 and Microsoft Azure was a non-starter, especially for a non-profit. Then Gerry came across Backblaze. The B2 Cloud Storage service met all of his performance requirements, and at $0.005/GB/month for storage and $0.01/GB for downloads it was nearly 75% less than S3 or Azure.

Gerry did the math and found that he could economically incorporate B2 Cloud Storage into his IT portfolio, using it for both program submission and for active storage and archiving of the APT programs. In addition, B2 now gives him the foundation necessary to receive and distribute programming content over the Internet. This is especially useful for organizations that can’t conveniently access satellite distribution systems. Not to mention downloading from the cloud is much faster than sending a tape through the mail.

Adding B2 Cloud Storage to their infrastructure has helped American Public Television address two key challenges. First, they now have “unlimited” storage in the cloud without having to add any hardware. In addition, with B2, they only pay for the storage they use. That means they don’t have to buy storage upfront trying to match the maximum amount of storage they’ll ever need. Second, by using B2 as a distribution source for their programming APT subscribers, especially the smaller and remote ones, can get content faster and more reliably without having to perform costly upgrades to their infrastructure.

The road ahead

As APT gets used to their file-based infrastructure and workflow, there are a number of cost-saving and income-generating ideas they are now considering. Here are a few:

Program Submissions — New content can be uploaded from anywhere using a web browser, an Internet connection, and a login. For example, a producer in Cambodia can upload their film to B2. From there the film is downloaded to an in-house system, where it is processed and transcoded. The finished film is added to the APT catalog and uploaded back to B2. Once there, the program is instantly available for subscribers to order and download.
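
As a rough illustration of that submission step, here is what an upload to B2 could look like using Backblaze's Python SDK (b2sdk); the credentials, bucket name, and file names are placeholders rather than anything from APT's actual setup.

from b2sdk.v2 import InMemoryAccountInfo, B2Api

APPLICATION_KEY_ID = 'your-application-key-id'   # placeholder credentials
APPLICATION_KEY = 'your-application-key'

b2_api = B2Api(InMemoryAccountInfo())
b2_api.authorize_account('production', APPLICATION_KEY_ID, APPLICATION_KEY)

# Hypothetical bucket that receives producer submissions.
bucket = b2_api.get_bucket_by_name('program-submissions')
bucket.upload_local_file(
    local_file='finished-program.mp4',
    file_name='submissions/finished-program.mp4',
)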

“The affordability and performance of Backblaze B2 is what allowed us to make the B2 cloud part of the APT data storage and distribution strategy into the future.” — Gerry Field

Easier Previews — At any time, work in process or finished programs can be made available for download from the B2 cloud. One place this could be useful is where a subscriber needs to review a program to comply with local policies and practices before airing. In the old system, each “one-off” was a time consuming manual process.

Instant Subscriptions — There are many organizations such as schools and businesses that want to use just one episode of a desired show. With an e-commerce based website, current or even archived programming kept in B2 could be available to download or stream for a minimal charge.

At APT there were multiple technologies needed to make their file-based infrastructure work, but as Gerry notes, having an affordable, trustworthy, cloud storage service like B2 is one of the critical building blocks needed to make everything work together.

The post American Public Television Embraces the Cloud — And the Future appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

LED cubes and how to map them

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/led-panel-cubes-and-how-to-build-them/

Taking inspiration from a cube he had filmed at the 34th Chaos Communication Congress in Leipzig, Germany, polyfloyd gathered friends Sebastius and Boekenwuurm together to create their own.

The build

As polyfloyd’s blog post for the project notes, Sebastius led the way with the hardware portion of the build. The cube is made from six LED panels driven by a Raspberry Pi, and uses a breakout board to support the panels, which are connected in pairs:

The displays are connected in 3 chains, the maximum number of parallel chains the board supports, of 2 panels each. Having a higher degree of parallelization increases the refresh rate which in turn improves the overall image quality.

The first two chains make up the 4 sides. The remaining chain makes up the top and bottom of the cube.

Sebastius removed the plastic frames that come as standard on the panels, in order to allow them to fit together snugly as a cube. He designed and laser-cut a custom frame from plywood to support the panels instead.

Raspberry Pi LED Cube

Software

The team used hzeller’s software to drive the panels, and polyfloyd wrote their own program to “shove the pixels around”. polyfloyd used Ledcat, software they had made to drive previous LED projects, and adapted this interface so programs written for Ledcat would also work with hzeller’s library.

The full code for the project can be found on polyfloyd’s GitHub profile. It includes the ability to render animations to gzipped files, and to stream animations in real time via SSH.

Mapping 2D and spherical images with shaders

“One of the programs that could work with my LED-panels through [Unix] pipes was Shady,” observes polyfloyd, explaining the use of shaders with the cube. “The program works by rendering OpenGL fragment shaders to an RGB24 format which could then be piped to wherever needed. These shaders are small programs that can render an image by calculating the color for each pixel on the screen individually.”

The team programmed a shader to map the two-dimensional position of pixels in an image to the three-dimensional space of the cube. This then allowed the team to apply the mapping to spherical images, such as the globe in the video below:
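
The team's shader itself isn't listed in the post, but the core mapping it performs can be sketched outside of OpenGL. The Python below shows one way to turn a pixel on a cube face into a 3D direction and then into a lookup position in an equirectangular (spherical) source image; the face size and face orientations are illustrative assumptions, not polyfloyd's exact layout.

import math

FACE = 64   # hypothetical resolution of one cube face, in pixels

def face_pixel_to_direction(face, u, v):
    # Map pixel (u, v) on a face to a direction vector from the cube's centre.
    a = 2.0 * (u + 0.5) / FACE - 1.0
    b = 2.0 * (v + 0.5) / FACE - 1.0
    return {
        'front':  (a, -b, 1.0),
        'back':   (-a, -b, -1.0),
        'left':   (-1.0, -b, a),
        'right':  (1.0, -b, -a),
        'top':    (a, 1.0, b),
        'bottom': (a, -1.0, -b),
    }[face]

def direction_to_equirect(x, y, z, width, height):
    # Convert the direction to longitude/latitude, then to pixel coordinates
    # in an equirectangular source image such as a globe texture.
    lon = math.atan2(x, z)                                 # -pi .. pi
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # -pi/2 .. pi/2
    px = int((lon / (2 * math.pi) + 0.5) * (width - 1))
    py = int((0.5 - lat / math.pi) * (height - 1))
    return px, py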

The team has interesting plans for the cube moving forward, including the addition of an accelerometer and batteries. Follow their progress on the polyfloyd blog.

Fun with LED panels

The internet is full of amazing Raspberry Pi projects that use LED panels. This recent project available on Instructables shows how to assemble and set up a particle generator, while this one, featured on this blog last year, tracks emojis used on the Chelsea Handler: Gotta Go! app.

The post LED cubes and how to map them appeared first on Raspberry Pi.

HDD vs SSD: What Does the Future for Storage Hold?

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/ssd-vs-hdd-future-of-storage/

SSD 60 TB drive

This is part one of a series. Use the Join button above to receive notification of future posts on this and other topics.

Customers frequently ask us whether and when we plan to move our cloud backup and data storage to SSDs (Solid-State Drives). That’s not a surprising question considering the many advantages SSDs have over magnetic platter type drives, also known as HDDs (Hard-Disk Drives).

We’re a large user of HDDs in our data centers (currently 100,000 hard drives holding over 500 petabytes of data). We want to provide the best performance, reliability, and economy for our cloud backup and cloud storage services, so we continually evaluate which drives to use for operations and in our data centers. While we use SSDs for some applications, which we’ll describe below, there are reasons why HDDs will continue to be the primary drives of choice for us and other cloud providers for the foreseeable future.

HDDs vs SSDs

HDD vs SSD

The laptop computer I am writing this on has a single 512GB SSD, which has become a common feature in higher-end laptops. An SSD’s advantages in a laptop are easy to understand: it is smaller than an HDD, faster, quieter, lasts longer, and is not susceptible to vibration and magnetic fields. It also has much lower latency and access times.

Today’s typical online price for a 2.5” 512GB SSD is $140 to $170. The typical online price for a 3.5” 512 GB HDD is $44 to $65. That’s a pretty significant difference in price, but since the SSD helps make the laptop lighter, enables it to be more resistant to the inevitable shocks and jolts it will experience in daily use, and adds the benefits of faster booting, faster waking from sleep, and faster launching of applications and handling of big files, the extra cost for the SSD in this case is worth it.

Some of these SSD advantages, chiefly speed, also will apply to a desktop computer, so desktops are increasingly outfitted with SSDs, particularly to hold the operating system, applications, and data that is accessed frequently. Replacing a boot drive with an SSD has become a popular upgrade option to breathe new life into a computer, especially one that seems to take forever to boot or is used for notoriously slow-loading applications such as Photoshop.

We covered upgrading your computer with an SSD in our blog post SSD 101: How to Upgrade Your Computer With An SSD.

Data centers are an entirely different kettle of fish. The primary concerns for data center storage are reliability, storage density, and cost. While SSDs are strong in the first two areas, it’s the third where they are not yet competitive. At Backblaze we adopt higher density HDDs as they become available — we’re currently using both 10TB and 12TB drives (among other capacities) in our data centers. Higher density drives provide greater storage density per Storage Pod and Vault and reduce our overhead cost through less required maintenance and lower total power requirements. Comparable SSDs in those sizes would cost roughly $1,000 per terabyte, considerably higher than the corresponding HDD. Simply put, SSDs are not yet in the price range to make their use economical for the benefits they provide, which is the reason why we expect to be using HDDs as our primary storage media for the foreseeable future.

What Are HDDs?

HDDs have been around for over 60 years, since IBM introduced them in 1956. The first disk drive was the size of a car, stored a mere 3.75 megabytes, and cost $300,000 in today’s dollars.

IBM 350 Disk Storage System — 3.75MB in 1956

The 350 Disk Storage System was a major component of the IBM 305 RAMAC (Random Access Method of Accounting and Control) system, which was introduced in September 1956. It consisted of 40 platters and a dual read/write head on a single arm that moved up and down the stack of magnetic disk platters.

The basic mechanism of an HDD remains unchanged since then, though it has undergone continual refinement. An HDD uses magnetism to store data on a rotating platter. A read/write head is affixed to an arm that floats above the spinning platter reading and writing data. The faster the platter spins, the faster an HDD can perform. Typical laptop drives today spin at either 5400 RPM (revolutions per minute) or 7200 RPM, though some server-based platters spin at even higher speeds.

Exploded drawing of a hard drive

The platters inside the drives are coated with a magnetically sensitive film consisting of tiny magnetic grains. Data is recorded when a magnetic write-head flies just above the spinning disk; the write head rapidly flips the magnetization of one magnetic region of grains so that its magnetic pole points up or down, to encode a 1 or a 0 in binary code. If all this sounds like an HDD is vulnerable to shocks and vibration, you’d be right. They also are vulnerable to magnets, which is one way to destroy the data on an HDD if you’re getting rid of it.

The major advantage of an HDD is that it can store lots of data cheaply. One and two terabyte (1,024 and 2,048 gigabytes) hard drives are not unusual for a laptop these days, and 10TB and 12TB drives are now available for desktops and servers. Densities and rotation speeds continue to grow. However, if you compare the cost of common HDDs vs SSDs for sale online, the SSDs are roughly 3-5x the cost per gigabyte. So if you want cheap storage and lots of it, using a standard hard drive is definitely the more economical way to go.

What are the best uses for HDDs?

  • Disk arrays (NAS, RAID, etc.) where high capacity is needed
  • Desktops when low cost is priority
  • Media storage (photos, videos, audio not currently being worked on)
  • Drives with an extreme number of reads and writes

What Are SSDs?

SSDs go back almost as far as HDDs, with the first semiconductor storage device compatible with a hard drive interface introduced in 1978, the StorageTek 4305.

Storage Technology 4305 SSD

The StorageTek was an SSD aimed at the IBM mainframe compatible market. The STC 4305 was seven times faster than IBM’s popular 2305 HDD system (and also about half the price). It consisted of a cabinet full of charge-coupled devices and cost $400,000 for 45MB capacity with throughput speeds up to 1.5 MB/sec.

SSDs are based on a type of non-volatile memory called NAND (named for the Boolean operator “NOT AND,” and one of two main types of flash memory). Flash memory stores data in individual memory cells, which are made of floating-gate transistors. Though they are semiconductor-based memory, they retain their information when no power is applied to them — a feature that’s obviously a necessity for permanent data storage.

Samsung SSD

Samsung SSD 850 Pro

Compared to an HDD, SSDs have higher data-transfer rates, higher areal storage density, better reliability, and much lower latency and access times. For most users, it’s the speed of an SSD that primarily attracts them. When discussing the speed of drives, what we are referring to is the speed at which they can read and write data.

For HDDs, the speed at which the platters spin strongly determines the read/write times. When data on an HDD is accessed, the read/write head must physically move to the location where the data was encoded on a magnetic section on the platter. If the file being read was written sequentially to the disk, it will be read quickly. As more data is written to the disk, however, it’s likely that the file will be written across multiple sections, resulting in fragmentation of the data. Fragmented data takes longer to read with an HDD as the read head has to move to different areas of the platter(s) to completely read all the data requested.

Because SSDs have no moving parts, they can operate at speeds far above those of a typical HDD. Fragmentation is not an issue for SSDs. Files can be written anywhere with little impact on read/write times, resulting in read times far faster than any HDD, regardless of fragmentation.

Samsung SSD 850 Pro (back)

Due to the way data is written to and read from the drive, however, SSD cells can wear out over time. Writing to an SSD cell pushes electrons through a gate to set its state; this wears on the cell and over time reduces its performance until the SSD wears out. This effect takes a long time, and SSDs have mechanisms to minimize it, such as the TRIM command. Flash memory writes an entire block of storage no matter how few pages within the block are updated. This requires reading and caching the existing data, erasing the block, and rewriting it. If an empty block is available, a write operation is much faster. The TRIM command, which must be supported by both the OS and the SSD, enables the OS to inform the drive which blocks are no longer needed, allowing the drive to erase those blocks ahead of time so that empty blocks are available for subsequent writes.

The effect of repeated reading and erasing on an SSD is cumulative and an SSD can slow down and even display errors with age. It’s more likely, however, that the system using the SSD will be discarded for obsolescence before the SSD begins to display read/write errors. Hard drives eventually wear out from constant use as well, since they use physical recording methods, so most users won’t base their selection of an HDD or SSD drive based on expected longevity.

SSD internals

SSD circuit board

Overall, SSDs are considered far more durable than HDDs due to a lack of mechanical parts. The moving mechanisms within an HDD are susceptible to not only wear and tear over time, but to damage due to movement or forceful contact. If one were to drop a laptop with an HDD, there is a high likelihood that all those moving parts will collide, resulting in potential data loss and even destructive physical damage that could kill the HDD outright. SSDs have no moving parts so, while they hold the risk of a potentially shorter life span due to high use, they can survive the rigors we impose upon our portable devices and laptops.

What are the best uses for SSDs?

  • Notebooks and laptops, where performance, light weight, storage density, shock resistance, and general ruggedness are desirable
  • Boot drives holding operating system and applications, which will speed up booting and application launching
  • Working files (media that is being edited: photos, video, audio, etc.)
  • Swap drives where SSD will speed up disk paging
  • Cache drives
  • Database servers
  • Revitalizing an older computer. If you’ve got a computer that seems slow to start up and slow to load applications and files, updating the boot drive with an SSD could make it seem, if not new, at least as if it just came back refreshed from spending some time on the beach.

Stay Tuned for Part 2 of HDD vs SSD

That’s it for part 1. In our second part we’ll take a deeper look at the differences between HDDs and SSDs, how both HDD and SSD technologies are evolving, and how Backblaze takes advantage of SSDs in our operations and data centers.

Here’s a tip on finding all the posts tagged with SSD on our blog: just follow https://www.backblaze.com/blog/tag/ssd/.

Don’t miss future posts on HDDs, SSDs, and other topics, including hard drive stats, cloud storage, and tips and tricks for backing up to the cloud. Use the Join button above to receive notification of future posts on our blog.

The post HDD vs SSD: What Does the Future for Storage Hold? appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Containers Will Not Fix Your Broken Culture (and Other Hard Truths) (ACMQueue)

Post Syndicated from jake original https://lwn.net/Articles/747020/rss

In ACMQueue magazine, Bridget Kromhout writes about containers and why they are not the solution to every problem. The article is subtitled:
“Complex socio-technical systems are hard;
film at 11.”
“Don’t get me wrong—containers are delightful! But let’s be real: we’re unlikely to solve the vast majority of problems in a given organization via the judicious application of kernel features. If you have contention between your ops team and your dev team(s)—and maybe they’re all facing off with some ill-considered DevOps silo inexplicably stuck between them—then cgroups and namespaces won’t have a prayer of solving that.

Development teams love the idea of shipping their dependencies bundled with their apps, imagining limitless portability. Someone in security is weeping for the unpatched CVEs, but feature velocity is so desirable that security’s pleas go unheard. Platform operators are happy (well, less surly) knowing they can upgrade the underlying infrastructure without affecting the dependencies for any applications, until they realize the heavyweight app containers shipping a full operating system aren’t being maintained at all.”

[$] Containers from user space

Post Syndicated from corbet original https://lwn.net/Articles/745820/rss

In a linux.conf.au 2018 keynote called “Containers from user space” — an
explicit reference to the cult film “Plan 9 from Outer Space” — Jessie
Frazelle took the audience on a fast-moving tour of the past, present, and
possible future of container technology. Describing the container craze as
“amazing”, she covered topics like the definition of a container, security,
runtimes, container concepts in programming languages, multi-tenancy, and
more.

Spiegelbilder Studio’s giant CRT video walls

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/crt-video-walls/

After Matvey Fridman of Germany-based production company Spiegelbilder Studio got in touch to share their latest build, we invited him to write a guest blog post about the CRT video walls they created for the band STRANDKØNZERT.

STRANDKØNZERT – TAGTRAUMER – OFFICIAL VIDEO

GERMAN DJENT RAP / EST. 2017. COMPLETE DIY-PROJECT.

CRT video wall

About a year ago, we had the idea of building a huge video wall out of old TVs to use in a music video. It took some time, but half a year later we found ourselves in a studio actually building this thing using 30 connected computers, 24 of which were Raspberry Pis.

STRANDKØNZERT CRT video wall Raspberry Pi

How we did it

After weeks and months of preproduction and testing, we set aside two consecutive days to build the wall, create the underlying IP network, run a few tests, and then film the artists’ performance in front of it. We had 32 Pis (a mixed bag of first, second, and third generation models) and even more TVs ready to go, since we didn’t know what the final build would actually look like. We ended up using 29 separate screens of various sizes hooked up to 24 separate Pis — the remaining five TVs got a daisy-chained video signal out of other monitors for a cool effect. Each Pi had to run free software called PiWall.

STRANDKØNZERT CRT video wall Raspberry Pi

Since the TVs only had analogue video inputs, we had to get special composite breakout cables and then adapt the RCA connectors to either SCART, S-Video, or BNC.

STRANDKØNZERT CRT video wall Raspberry Pi

As soon as we had all of that running, we connected every Pi to a 48-port network switch that we’d hooked up to a Windows PC acting as a DHCP server to automatically assign IP addresses and handle the multicast addressing. To make remote control of the Raspberry Pis easier, a separate master Linux PC and two MacBook laptops, each with SSH enabled and a Samba server running, joined the network as well.

STRANDKØNZERT CRT video wall Raspberry Pi

The MacBook laptops were used to drop two files containing the settings on each Pi. The .pitile file was unique to every Pi and contained their respective IDs. The .piwall file contained the same info for all Pis: the measurements and positions of every single screen to help the software split up the video signal coming in through the network. After every Pi got the command to start the PiWall software, which specifies the UDP multicast address and settings to be used to receive the video stream, the master Linux PC was tasked with streaming the video file to these UDP addresses. Now every TV was showing its section of the video, and we could begin filming.
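
The exact syntax of the .pitile and .piwall files is documented by the PiWall project, so it isn't repeated here, but the distribution step the laptops handled is easy to picture. Here is a small Python sketch of it, with made-up tile IDs, IP addresses, and a local pitiles/ folder holding one pre-written .pitile per Pi:

import subprocess

# Hypothetical inventory: tile ID -> address of the Raspberry Pi driving that panel pair.
TILES = {
    'tile01': '192.168.1.101',
    'tile02': '192.168.1.102',
    # ...one entry per Pi in the wall
}

for tile_id, host in TILES.items():
    # Every Pi gets the same .piwall (screen sizes and positions for the whole wall)
    # and its own .pitile (which names the tile that this particular Pi drives).
    subprocess.check_call(['scp', '.piwall', 'pi@%s:/home/pi/.piwall' % host])
    subprocess.check_call(['scp', 'pitiles/%s' % tile_id, 'pi@%s:/home/pi/.pitile' % host])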

STRANDKØNZERT CRT video wall Raspberry Pi

The whole process and the contents of the files and commands are summarised in the infographic below. A lot of trial and error was involved in the making of this project, but it all worked out well in the end. We hope you enjoy the craft behind the music video even though the music is not for everybody 😉

PiWall_Infographic

You can follow Spiegelbilder Studio on Facebook, Twitter, and Instagram. And if you enjoyed the music video, be sure to follow STRANDKØNZERT too.

The post Spiegelbilder Studio’s giant CRT video walls appeared first on Raspberry Pi.

I am Beemo, a little living boy: Adventure Time prop build

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/adventure-time-bmo/

Bob Herzberg, BMO builder and blogger at BYOBMO.com, fills us in on the whys and hows and even the Pen Wards of creating interactive Adventure Time BMO props with the Raspberry Pi.

A Conversation With BMO

A conversation with BMO showing off some voice recognition capabilities. There is no interaction for BMO’s responses other than voice commands. There is a small microphone inside BMO (right behind the blue dot) and the voice commands are processed by Google voice API over WiFi.

Finding BMO

My first BMO began as a cosplay prop for my daughter. She and her friends are huge fans of Adventure Time and made their costumes for Princess Bubblegum, Marceline, and Finn. It was my job to come up with a BMO.

Raspberry Pi BMO Laura Herzberg Bob Herzberg

Bob as Banana Guard, daughter Laura as Princess Bubblegum, and son Steven as Finn

I wanted something electronic, and also interactive if possible. And it had to run on battery power. There was only one option that I found that would work: the Raspberry Pi.

Building a living little boy

BMO’s basic internals consist of the Raspberry Pi, an 8” HDMI monitor, and a USB battery pack. The body is made from laser-cut MDF wood, which I sanded, sealed, and painted. I added 3D-printed arms and legs along with some vinyl lettering to complete the look. There is also a small wireless keyboard that works as a remote control.

Adventure Time BMO prop
Adventure Time BMO prop
Adventure Time BMO prop
Adventure Time BMO prop

To make the front panel button function, I created a custom PCB, mounted laser-cut acrylic buttons on it, and connected it to the Pi’s IO header.

Inside BMO - Raspberry Pi BMO Laura Herzberg Bob Herzberg

Custom-made PCBs control BMO’s gaming buttons and USB input.

The USB jack is extended with another custom PCB, which gives BMO USB ports on the front panel. His battery life is an impressive 8 hours of continuous use.

The main brain game frame

Most of BMO’s personality comes from custom animations that my daughter created and that were then turned into MP4 video files. The animations are triggered by the remote keyboard. Some versions of BMO have an internal microphone, and the Google Voice API is used to translate the user’s voice and map it to an appropriate response, so it’s possible to have a conversation with BMO.
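
Bob's own control code isn't included here, but the keyboard-triggered playback could be sketched along these lines in Python, using omxplayer (the Pi's command-line video player) for full-screen output; the key bindings and file names are invented for the example.

import subprocess
import sys
import termios
import tty

# Hypothetical key-to-animation mapping; the real BMO uses Laura's custom MP4s.
ANIMATIONS = {
    'h': '/home/pi/bmo/hello.mp4',
    'd': '/home/pi/bmo/dance.mp4',
    'i': '/home/pi/bmo/idle_face.mp4',
}

def read_key():
    # Read one keypress from the wireless keyboard without waiting for Enter.
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

while True:
    key = read_key()
    if key == 'q':
        break
    video = ANIMATIONS.get(key)
    if video:
        subprocess.call(['omxplayer', video])   # plays the clip on BMO's HDMI screen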

The final components of Raspberry Pi BMO Laura Herzberg Bob Herzberg

The Raspberry Pi Camera Module was also put to use. Some BMOs have a servo that can pop up a camera, called GoMO, which takes pictures. Although some people mistake it for ghost detecting equipment, BMO just likes taking nice pictures.

Who wants to play video games?

Playing games on BMO is as simple as loading one of the emulators supported by Raspbian.

BMO connected to SNES controllers - Raspberry Pi BMO Laura Herzberg Bob Herzberg

I’m partial to the Atari 800 emulator, since I used to write games for that platform when I was just starting to learn programming. The front-panel USB ports are used for connecting gamepads, or his front-panel buttons and D-Pad can be used.

Adventure time

BMO has been a lot of fun to bring to conventions. He makes it to ComicCon San Diego each year and has been as far away as DragonCon in Atlanta, where he finally got to meet the voice of BMO, Niki Yang.

BMO's back panel - Raspberry Pi BMO Laura Herzberg Bob Herzberg

BMO’s back panel, autographed by Niki Yang

One day, I received an email from the producer of Adventure Time, Kelly Crews, with a very special request. Kelly was looking for a birthday present for the show’s creator, Pendleton Ward. It was either luck or coincidence that I just was finishing up the latest version of BMO. Niki Yang added some custom greetings just for Pen.

BMO Wishes Pendleton Ward a Happy Birthday!

Happy birthday to Pendleton Ward, the creator of, well, you know what. We were asked to build Pen his very own BMO and with help from Niki Yang and the Adventure Time crew here is the result.

We added a few more items inside, including a 3D-printed heart, a medal, and a certificate which come from the famous Be More episode that explains BMO’s origins.

Back of Adventure Time BMO prop
Adventure Time BMO prop
Adventure Time BMO prop
Adventure Time BMO prop

BMO was quite a challenge to create. Fabricating the enclosure required several different techniques and materials. Fortunately, bringing him to life was quite simple once he had a Raspberry Pi inside!

Find out more

Be sure to follow Bob’s adventures with BMO at the Build Your Own BMO blog. And if you’ve built your own prop from television or film using a Raspberry Pi, be sure to share it with us in the comments below or on our social media channels.

 

All images c/o Bob and Laura Herzberg

The post I am Beemo, a little living boy: Adventure Time prop build appeared first on Raspberry Pi.

What is HAMR and How Does It Enable the High-Capacity Needs of the Future?

Post Syndicated from Andy Klein original https://www.backblaze.com/blog/hamr-hard-drives/

HAMR drive illustration

During Q4, Backblaze deployed 100 petabytes’ worth of Seagate hard drives to our data centers. The newly deployed Seagate 10 and 12 TB drives are doing well and will help us meet our near-term storage needs, but we know we’re going to need more drives — with higher capacities. That’s why the success of new hard drive technologies like Heat-Assisted Magnetic Recording (HAMR) from Seagate is very relevant to us here at Backblaze and to the storage industry in general. In today’s guest post, we are pleased to have Mark Re, CTO at Seagate, give us an insider’s look behind the hard drive curtain to tell us how Seagate engineers are developing the HAMR technology and making it market-ready starting in late 2018.

What is HAMR and How Does It Enable the High-Capacity Needs of the Future?

Guest Blog Post by Mark Re, Seagate Senior Vice President and Chief Technology Officer

Earlier this year, Seagate announced plans to make the first hard drives using Heat-Assisted Magnetic Recording, or HAMR, available by the end of 2018 in pilot volumes. Even as today’s market has embraced 10TB+ drives, the need for 20TB+ drives remains imperative in the relatively near term. HAMR is the Seagate research team’s next major advance in hard drive technology.

HAMR is a technology that over time will enable a big increase in the amount of data that can be stored on a disk. A small laser is attached to a recording head, designed to heat a tiny spot on the disk where the data will be written. This allows a smaller bit cell to be written as either a 0 or a 1. The smaller bit cell size enables more bits to be crammed into a given surface area — increasing the areal density of data, and increasing drive capacity.

It sounds almost simple, but the science and engineering expertise required to perfect this technology, spanning research, experimentation, lab development, and product development, has been enormous. Below is an overview of the HAMR technology; you can dig into the details in our technical brief, which provides a point-by-point rundown of several key advances enabling the HAMR design.

For all the time and resources committed to developing HAMR, the need for its increased data density is indisputable. Demand for data storage keeps growing. Businesses’ ability to manage and leverage more capacity is a competitive necessity, and IT spending on capacity continues to increase.

History of Increasing Storage Capacity

For the last 50 years, areal density in the hard disk drive has been growing faster than Moore’s law, which is a very good thing. After all, customers, from data centers and cloud service providers to creative professionals and game enthusiasts, rarely go shopping for a hard drive just like the one they bought two years ago. The demands that growing data volumes place on storage capacity inevitably increase, so the technology constantly evolves.

According to the Advanced Storage Technology Consortium, HAMR will be the next significant storage technology innovation to increase the amount of data that can be stored in a given area of disk surface, also called the disk’s “areal density.” We believe this boost in areal density will help fuel hard drive product development and growth through the next decade.

Why do we Need to Develop Higher-Capacity Hard Drives? Can’t Current Technologies do the Job?

Why is HAMR’s increased data density so important?

Data has become critical to all aspects of human life, changing how we’re educated and entertained. It affects and informs the ways we experience each other and interact with businesses and the wider world. IDC research shows the datasphere — all the data generated by the world’s businesses and billions of consumer endpoints — will continue to double in size every two years. IDC forecasts that by 2025 the global datasphere will grow to 163 zettabytes (a zettabyte is a trillion gigabytes). That’s ten times the 16.1 ZB of data generated in 2016. IDC cites five key trends intensifying the role of data in changing our world: embedded systems and the Internet of Things (IoT), instantly available mobile and real-time data, cognitive artificial intelligence (AI) systems, increased security data requirements, and, critically, the evolution of data from playing a background role in business to playing a life-critical role.

Consumers use the cloud to manage everything from family photos and videos to data about their health and exercise routines. Real-time data created by connected devices — everything from Fitbit, Alexa and smartphones to home security systems, solar systems and autonomous cars — is fueling the emerging Data Age. On top of the obvious business and consumer data growth, our critical infrastructure like power grids, water systems, hospitals, road infrastructure and public transportation all demand and add to the growth of real-time data. Data is now a vital element in the smooth operation of all aspects of daily life.

Behind the scenes, all of this entails a significant infrastructure cost to feed the insatiable global appetite for data storage. While a variety of storage technologies will continue to advance in data density (Seagate announced the first 60TB 3.5-inch SSD unit, for example), high-capacity hard drives remain the foundation of our interconnected, cloud- and IoT-based dependence on data.

HAMR Hard Drive Technology

Seagate has been working on heat-assisted magnetic recording (HAMR) in one form or another since the late 1990s. During this time we’ve made many breakthroughs: making reliable near-field transducers, developing special high-capacity HAMR media, and figuring out how to put a laser no larger than a grain of salt on each and every head.

The development of HAMR has required Seagate to consider and overcome a myriad of scientific and technical challenges including new kinds of magnetic media, nano-plasmonic device design and fabrication, laser integration, high-temperature head-disk interactions, and thermal regulation.

A typical hard drive inside any computer or server contains one or more rigid disks coated with a magnetically sensitive film consisting of tiny magnetic grains. Data is recorded when a magnetic write-head flies just above the spinning disk; the write head rapidly flips the magnetization of one magnetic region of grains so that its magnetic pole points up or down, to encode a 1 or a 0 in binary code.

Increasing the amount of data you can store on a disk requires cramming magnetic regions closer together, which means the grains need to be smaller so they won’t interfere with each other.

Heat Assisted Magnetic Recording (HAMR) is the next step to enable us to increase the density of grains — or bit density. Current projections are that HAMR can achieve 5 Tbpsi (Terabits per square inch) on conventional HAMR media, and in the future will be able to achieve 10 Tbpsi or higher with bit patterned media (in which discrete dots are predefined on the media in regular, efficient, very dense patterns). These technologies will enable hard drives with capacities higher than 100 TB before 2030.
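
The post’s own numbers make for an easy sanity check. Taking the projected 5 Tb per square inch together with some assumed platter geometry (my rough figures, not Seagate’s), a back-of-envelope calculation lands close to that 100 TB mark:

```python
# Back-of-envelope capacity check for the projections above.
# The areal density comes from the post; the platter geometry is assumed.
TBITS_PER_SQ_IN = 5          # projected HAMR areal density (terabits per square inch)
AREA_SQ_IN = 9               # assumed recordable area per platter surface
SURFACES_PER_PLATTER = 2
PLATTERS = 9                 # assumed platter count for a high-capacity drive

tb_per_surface = TBITS_PER_SQ_IN * AREA_SQ_IN / 8    # terabits -> terabytes
capacity_tb = tb_per_surface * SURFACES_PER_PLATTER * PLATTERS
print(f"~{capacity_tb:.0f} TB per drive")            # prints "~101 TB per drive"
```

This ignores formatting overhead and spare capacity, so treat it as an order-of-magnitude illustration rather than a product specification.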

The major problem with packing bits so closely together is that if you do that on conventional magnetic media, the bits (and the data they represent) become thermally unstable, and may flip. So, to make the grains maintain their stability — their ability to store bits over a long period of time — we need to develop recording media with higher coercivity. That means the media is magnetically more stable during storage, but it is more difficult to change its magnetic characteristics when writing (harder to flip a grain from a 0 to a 1 or vice versa).
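
Some background the post doesn’t spell out, so treat the exact threshold as a rule of thumb from the magnetic recording literature rather than a Seagate figure: for recorded bits to survive for years, each grain’s anisotropy energy barrier has to dwarf the thermal energy trying to flip it, roughly

$$\frac{K_u V}{k_B T} \gtrsim 60$$

where $K_u$ is the anisotropy energy density of the medium, $V$ the grain volume, $k_B$ Boltzmann’s constant, and $T$ the operating temperature. Shrinking grains to raise areal density shrinks $V$, so $K_u$ has to rise to compensate. That is exactly the high-coercivity trade-off described above: stable enough to store, but too “hard” to write without heat.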

That’s why HAMR’s first key hardware advance required developing a new recording media that keeps bits stable — using high anisotropy (or “hard”) magnetic materials such as iron-platinum alloy (FePt), which resist magnetic change at normal temperatures. Over years of HAMR development, Seagate researchers have tested and proven out a variety of FePt granular media films, with varying alloy composition and chemical ordering.

In fact the new media is so “hard” that conventional recording heads won’t be able to flip the bits, or write new data, under normal temperatures. If you add heat to the tiny spot on which you want to write data, you can make the media’s coercive field lower than the magnetic field provided by the recording head — in other words, enable the write head to flip that bit.

So, a challenge with HAMR has been to replace conventional perpendicular magnetic recording (PMR), in which the write head operates at room temperature, with a write technology that heats the thin-film recording medium on the disk platter to temperatures above 400 °C. The basic principle is to heat a tiny region of several magnetic grains for a very short time (~1 nanosecond) to a temperature high enough to make the media’s coercive field lower than the write head’s magnetic field. Immediately after the heat pulse, the region quickly cools down and the bit’s magnetic orientation is frozen in place.

Applying this dynamic nano-heating is where HAMR’s famous “laser” comes in. A plasmonic near-field transducer (NFT) has been integrated into the recording head to heat the media and enable magnetic change at a specific point. Plasmonic NFTs are used to focus and confine light energy to regions smaller than the wavelength of light. This enables us to heat an extremely small region, measured in nanometers, on the disk media to reduce its magnetic coercivity.

Moving HAMR Forward

HAMR write head

As always in advanced engineering, the devil — or many devils — is in the details. As noted earlier, our technical brief provides a point-by-point short illustrated summary of HAMR’s key changes.

Although hard work remains, we believe this technology is nearly ready for commercialization. Seagate has the best engineers in the world working towards a goal of a 20 Terabyte drive by 2019. We hope we’ve given you a glimpse into the amount of engineering that goes into a hard drive. Keeping up with the world’s insatiable appetite to create, capture, store, secure, manage, analyze, rapidly access and share data is a challenge we work on every day.

With thousands of HAMR drives already being made in our manufacturing facilities, our internal and external supply chain is solidly in place, and volume manufacturing tools are online. This year we began shipping initial units for customer tests, and production units will ship to key customers by the end of 2018. Prepare for breakthrough capacities.

The post What is HAMR and How Does It Enable the High-Capacity Needs of the Future? appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

The Raspberry Pi Christmas shopping list 2017

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/christmas-shopping-list-2017/

Looking for the perfect Christmas gift for a beloved maker in your life? Maybe you’d like to give a relative or friend a taste of the world of coding and Raspberry Pi? Whatever you’re looking for, the Raspberry Pi Christmas shopping list will point you in the right direction.

An ice-skating Raspberry Pi - The Raspberry Pi Christmas Shopping List 2017

For those getting started

Thinking about introducing someone special to the wonders of Raspberry Pi during the holidays? Although you can set up your Pi with peripherals from around your home, such as a mobile phone charger, your PC’s keyboard, and the old mouse dwelling in an office drawer, a starter kit is a nice all-in-one package for the budding coder.



Check out the starter kits from Raspberry Pi Approved Resellers such as Pimoroni, The Pi Hut, ModMyPi, Adafruit, CanaKit…the list is pretty long. Our products page will direct you to your closest reseller, or you can head to element14 to pick up the official Raspberry Pi Starter Kit.



You can also buy the Raspberry Pi Press’s brand-new Raspberry Pi Beginner’s Book, which includes a Raspberry Pi Zero W, a case, a ready-made SD card, and adapter cables.

Once you’ve presented a lucky person with their first Raspberry Pi, it’s time for them to spread their maker wings and learn some new skills.

MagPi Essentials books - The Raspberry Pi Christmas Shopping List 2017

To help them along, you could pick your favourite from among the Official Projects Book volume 3, The MagPi Essentials guides, and the brand-new third edition of Carrie Anne Philbin’s Adventures in Raspberry Pi. (She is super excited about this new edition!)

And you can always add a link to our free resources on the gift tag.

For the maker in your life

If you’re looking for something for a confident digital maker, you can’t go wrong with adding to their arsenal of electric and electronic bits and bobs that are no doubt cluttering drawers and boxes throughout their house.



Components such as servomotors, displays, and sensors are staples of the maker world. And when it comes to jumper wires, buttons, and LEDs, one can never have enough.



You could also consider getting your person a soldering iron, some helping hands, or small tools such as a Dremel or screwdriver set.

And to make their life a little less messy, pop it all inside a Really Useful Box…because they’re really useful.



For kit makers

While some people like to dive into making head-first and to build whatever comes to mind, others enjoy working with kits.



The Naturebytes kit allows you to record the animal visitors of your garden with the help of a camera and a motion sensor. Footage of your local badgers, birds, deer, and more will be saved to an SD card, or tweeted or emailed to you if it’s in range of WiFi.

Cortec Tiny 4WD - The Raspberry Pi Christmas Shopping List 2017

Coretec’s Tiny 4WD is a kit for assembling a Pi Zero–powered remote-controlled robot at home. Not only is the robot adorable, building it is also a great introduction to motors and wireless control.



Bare Conductive’s Touch Board Pro Kit offers everything you need to create interactive electronics projects using conductive paint.

Pi Hut Arcade Kit - The Raspberry Pi Christmas Shopping List 2017

Finally, why not help your favourite maker create their own gaming arcade using the Arcade Building Kit from The Pi Hut?

For the reader

For those who like to curl up with a good read, or spend too much of their day on public transport, a book or magazine subscription is the perfect treat.

For makers, hackers, and those interested in new technologies, our brand-new HackSpace magazine and the ever-popular community magazine The MagPi are ideal. Both are available via a physical or digital subscription, and new subscribers to The MagPi also receive a free Raspberry Pi Zero W plus case.

Cover of CoderDojo Nano Make your own game

Marc Scott Beginner's Guide to Coding Book

You can also check out other publications from the Raspberry Pi family, including CoderDojo’s new CoderDojo Nano: Make Your Own Game, Eben Upton and Gareth Halfacree’s Raspberry Pi User Guide, and Marc Scott’s A Beginner’s Guide to Coding. And have I mentioned Carrie Anne’s Adventures in Raspberry Pi yet?

Stocking fillers for everyone

Looking for something small to keep your loved ones occupied on Christmas morning? Or do you have to buy a Secret Santa gift for the office tech? Here are some wonderful stocking fillers to fill your boots with this season.

Pi Hut 3D Christmas Tree - The Raspberry Pi Christmas Shopping List 2017

The Pi Hut 3D Xmas Tree: available as both a pre-soldered and a DIY version, this gadget will work with any 40-pin Raspberry Pi and allows you to create your own mini light show.



Google AIY Voice kit: build your own home assistant using a Raspberry Pi, the MagPi Essentials guide, and this brand-new kit. “Google, play Mariah Carey again…”



Pimoroni’s Raspberry Pi Zero W Project Kits offer everything you need, including the Pi, to make your own time-lapse cameras, music players, and more.



The official Raspberry Pi Sense HAT, Camera Module, and cases for the Pi 3 and Pi Zero will complete the collection of any Raspberry Pi owner, while also opening up exciting project opportunities.

STEAM gifts that everyone will love

Awesome Astronauts | Building LEGO’s Women of NASA!

LEGO Ideas brought out this amazing ‘Women of NASA’ set, and I thought it would be fun to build, play and learn from these inspiring women! First up, let’s discover a little more about Sally Ride and Mae Jemison, two AWESOME ASTRONAUTS!

Treat the kids, and big kids, in your life to the newest LEGO Ideas set, the Women of NASA — starring Nancy Grace Roman, Margaret Hamilton, Sally Ride, and Mae Jemison!



Explore the world of wearables with Pimoroni’s sewable, hackable, wearable, adorable Bearables kits.



Add lights and motors to paper creations with the Activating Origami Kit, available from The Pi Hut.




We all loved Hidden Figures, and the STEAM enthusiast you know will love it too. The film’s available on DVD, and you can also buy the original book, along with other fascinating non-fiction such as Rebecca Skloot’s The Immortal Life of Henrietta Lacks, Rachel Ignotofsky’s Women in Science, and Sydney Padua’s (mostly true) The Thrilling Adventures of Lovelace and Babbage.

Have we missed anything?

With so many amazing kits, HATs, and books available from members of the Raspberry Pi community, it’s hard to only pick a few. Have you found something splendid for the maker in your life? Maybe you’ve created your own kit that uses the Raspberry Pi? Share your favourites with us in the comments below or via our social media accounts.

The post The Raspberry Pi Christmas shopping list 2017 appeared first on Raspberry Pi.

Community Profile: Matthew Timmons-Brown

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/community-profile-matthew-timmons-brown/

This column is from The MagPi issue 57. You can download a PDF of the full issue for free, or subscribe to receive the print edition in your mailbox or the digital edition on your tablet. All proceeds from the print and digital editions help the Raspberry Pi Foundation achieve its charitable goals.

“I first set up my YouTube channel because I noticed a massive lack of video tutorials for the Raspberry Pi,” explains Matthew Timmons-Brown, known to many as The Raspberry Pi Guy. At 18 years old, the Cambridge-based student has more than 60 000 subscribers to his channel, making his account the most successful Raspberry Pi–specific YouTube account to date.

Matthew Timmons-Brown

Matt gives a talk at the Raspberry Pi 5th Birthday weekend event

The Raspberry Pi Guy

If you’ve attended a Raspberry Pi event, there’s a good chance you’ve already met Matt. And if not, you’ll have no doubt come across one or more of his tutorials and builds online. On more than one occasion, his work has featured on the Raspberry Pi blog, with his yearly Raspberry Pi roundup videos being a staple of the birthday celebrations.

Matthew Timmons-Brown

With his website, Matt aimed to collect together “the many strands of The Raspberry Pi Guy” into one, neat, cohesive resource — and it works. From newcomers to the credit card-sized computer to hardened Pi veterans, The Raspberry Pi Guy offers aid and inspiration for many. Looking for a review of the Raspberry Pi Zero W? He’s filmed one. Looking for a step-by-step guide to building a Pi-powered Amazon Alexa? No problem, there’s one of those too.

Make your Raspberry Pi artificially intelligent! – Amazon Alexa Personal Assistant Tutorial

Artificial Intelligence. A hefty topic that has dominated the field since computers were first conceived. What if I told you that you could put an artificial intelligence service on your own $30 computer?! That’s right! In this tutorial I will show you how to create your own artificially intelligent personal assistant, using Amazon’s Alexa voice recognition and information service!

Raspberry Pi electric skateboard

Last summer, Matt introduced the world to his Raspberry Pi-controlled electric skateboard, soon finding himself plastered over local press as well as the BBC and tech sites like Adafruit and geek.com. And there’s no question as to why the build was so popular. With YouTubers such as Casey Neistat increasing the demand for electric skateboards on a near-daily basis, the call for a cheaper, home-brew version has quickly grown.

DIY 30KM/H ELECTRIC SKATEBOARD – RASPBERRY PI/WIIMOTE POWERED

Over the summer, I made my own electric skateboard using a £4 Raspberry Pi Zero. Controlled with a Nintendo Wiimote, capable of going 30km/h, and with a range of over 10km, this project has been pretty darn fun. In this video, you see me racing around Cambridge and I explain the ins and outs of this project.

Using a Raspberry Pi Zero, a Nintendo Wii Remote, and a little help from members of the Cambridge Makespace community, Matt built a board capable of reaching 30km/h, with a battery range of 10km per charge. Alongside Neistat, Matt attributes the project inspiration to Australian student Tim Maier, whose build we previously covered in The MagPi.

Matthew Timmons-Brown and Eben Upton standing in a car park looking at a smartphone

LiDAR

Despite the success and the fun of the electric skateboard (including convincing Raspberry Pi Trading CEO Eben Upton to have a go for local television news coverage), the project Matt is most proud of is his wireless LiDAR system for theoretical use on the Mars rovers.

Matthew Timmons-Brown's LiDAR project for scanning terrains with lasers

Using a tablet app to define the angles, Matt’s A Level coursework LiDAR build scans the surrounding area, returning the results to the touchscreen, where they can be manipulated by the user. With his passion for the cosmos and the International Space Station, it’s no wonder that this is Matt’s proudest build.

Built for his A Level Computer Science coursework, the build demonstrates Matt’s passion for space and physics. Used as a means of surveying terrain, LiDAR uses laser light to measure distance, allowing users to create 3D-scanned, high-resolution maps of a specific area. It is a perfect technology for exploring unknown worlds.
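
The maths underneath a pulsed, time-of-flight LiDAR is refreshingly simple: distance is half the pulse’s round trip multiplied by the speed of light. A tiny illustrative snippet (not Matt’s coursework code):

```python
# Time-of-flight distance calculation used by pulsed LiDAR (illustrative only).
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_m(round_trip_seconds: float) -> float:
    """Half the round trip, because the pulse travels out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 200 nanoseconds came from roughly 30 metres away.
print(f"{distance_m(200e-9):.1f} m")   # ~30.0 m
```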

Matthew Timmons-Brown and two other young people at a reception in the Houses of Parliament

Matt was invited to St James’s Palace and the Houses of Parliament as part of the Raspberry Pi community celebrations in 2016

Joining the community

In a recent interview at Hills Road Sixth Form College, where he is studying mathematics, further mathematics, physics, and computer science, Matt revealed where his love of electronics and computer science started. “I originally became interested in computer science in 2012, when I read a tiny magazine article about a computer that I would be able to buy with pocket money. This was a pretty exciting thing for a 12-year-old! Your own computer… for less than £30?!” He went on to explain how it became his mission to learn all he could on the subject and how, months later, his YouTube channel came to life, cementing him firmly into the Raspberry Pi community.

The post Community Profile: Matthew Timmons-Brown appeared first on Raspberry Pi.

Astro Pi upgrades on the International Space Station

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/astro-pi-upgrades/

In 2015, The Raspberry Pi Foundation built two space-hardened Raspberry Pi units, or Astro Pis, to run student code on board the International Space Station (ISS).

Astro Pi

A space-hardened Raspberry Pi

Astro Pi upgrades

Each school year we run an Astro Pi challenge to find the next generation of space scientists to program them. After the students have their code run in space, any output files are downloaded to ground and returned to them for analysis.

That download process was originally accomplished by an astronaut shutting down the Astro Pi, moving its micro SD card to a crew laptop and copying over the files manually. This used about 20 minutes of precious crew time.


Last year, we passed the qualification to allow the Astro Pi computers to be connected to the Local Area Network (LAN) on board the ISS. This allows us to remotely access them from the ground, upload student code and download the results without having to involve the crew.

This year, we have been preparing a new payload to upgrade the operational capabilities of the Astro Pi units.

The payload consists of the following items:

  • 2 × USB WiFi dongles
  • 5 × optical filters
  • 4 × 32GB micro SD cards

Before anyone asks – no, we’re not going outside into the vacuum of space!

USB WiFi dongle

Currently both Astro Pi units are located in the European Columbus module. They’re even visible on Google Street View (pan down and right)! You can see that we’ve created a bit of a bird’s nest of wires behind them.

Astro Pi

The D-Link DWA-171

The decision to add WiFi capability is partly to clean up the cabling situation, but mainly so that the Astro Pi units can be deployed in ISS locations other than the Columbus module, where we won’t have access to an Ethernet switch.

The Raspberry Pi used in the Astro Pi flight units is the B+ (released in 2014), which does not have any built-in wireless connectivity, so we need to use a USB dongle. This particular D-Link dongle was recommended by the European Space Agency (ESA) because a number of other payloads are already using it.

Astro Pi

An Astro Pi unit with WiFi dongle installed

Plans have been made for one of the Astro Pi units to be deployed on an Earth-facing window, to allow Earth-observation student experiments. This is where WiFi connectivity will be required to maintain LAN access for ground control.

Optical filters

With Earth-observation experiments in mind, we are also sending some flexible film optical filters. These are made from the same material as the blue square which is shipped with the Pi NoIR camera module, as noted in this post from when the product was launched. You can find the data sheet here.

Astro Pi

Rosco Roscalux #2007 Storaro Blue

To permit the filter to be easily attached to the Astro Pi unit, the film is laser-cut to friction-fit onto the 12 inner heatsink pins on the base, so that the camera aperture is covered.

Astro Pi

Laser cutting at Makespace

The laser-cutting work was done right here in Cambridge at Makespace by our own Alex Bate, and local artist Diana Probst.

Astro Pi

An Astro Pi with the optical filter installed

32GB micro SD cards

A consequence of running Earth observation experiments is a dramatic increase in the amount of disk space needed. To avoid a high frequency of commanding windows to download imagery to ground, we’re also flying some larger 32GB micro SD cards to replace the current 8GB cards.

Astro Pi

The Samsung Evo MB-MP32DA/EU

This particular type of micro SD card is X-ray proof, waterproof, and resistant to magnetism and heat. Operationally speaking there is no difference, other than the additional available disk space.

Astro Pi

An Astro Pi unit with the new micro SD card installed

The micro SD cards will be flown with a security-hardened version of Raspbian pre-installed.

Crew activities

We have several crew activities planned for when this payload arrives on the ISS. These include the installation of the upgrade items on both Astro Pi units; moving one of the units from Columbus to an Earth-facing window (possibly in Node 2); and then moving it back a few weeks later.

Currently it is expected that these activities will be carried out by German ESA astronaut Alexander Gerst, who launches to the ISS in November (and will also be the ISS commander for Expedition 57).

Payload launch

We are targeting a January 2018 launch date for the payload. The exact launch vehicle is yet to be determined, but it could be SpaceX CRS 14. We will update you closer to the time.

Questions?

If you have any questions about this payload, how an item works, or why that specific model was chosen, please post them in the comments below, and we’ll try to answer them.

The post Astro Pi upgrades on the International Space Station appeared first on Raspberry Pi.

Digitising film reels with Pi Film Capture

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/digitising-reels-pi-film-capture/

Joe Herman’s Pi Film Capture project combines old projectors and a stepper motor with a Raspberry Pi and a Raspberry Pi Camera Module, to transform his grandfather’s 8- and 16-mm home movies into glorious digital films.

We chatted to him about his Pi Film Capture build at Maker Faire New York 2016:

Film to Digital Conversion at Maker Faire New York 2016

Uploaded by Raspberry Pi on 2017-08-25.

What inspired Pi Film Capture?

Joe’s grandfather, Leo Willmott, loved recording home movies of his family of eight children and their grandchildren. He passed away when Joe was five, but in 2013 Joe found a way to connect with his legacy: while moving house, a family member uncovered a box of more than a hundred of Leo’s film reels. These covered decades of family history, and some dated back as far as 1939.

Super 8 film reels

Kodachrome film reels of the type Leo used

This provided an unexpected opportunity for Leo’s family to restore some of their shared history. Joe immediately made plans to digitise the material, knowing that the members of his extensive family tree would provide an eager audience.

Building Pi Film Capture

After a failed attempt with a DSLR camera, Joe realised he couldn’t simply re-film the movies — instead, he would have to capture each frame individually. He combined a Raspberry Pi with an old Super 8 projector, and set about rigging up something to do just that.

He went through numerous stages of prototyping, and his final hardware setup works very well. A NEMA 17 stepper motor moves the film reel forward in the projector. A magnetic reed switch triggers the Camera Module each time the reel moves on to the next frame. Joe hacked the Camera Module so that it has a different focal distance, and he also added a magnifying lens. Moreover, he realised it would be useful to have a diffuser to ‘smooth’ some of the faults in the aged film reel material. To do this, he mounted “a bit of translucent white plastic from an old ceiling fixture” parallel with the film.
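
Joe’s capture loop isn’t reproduced in the post, but the reed-switch-plus-camera idea can be sketched in a few lines of Python using gpiozero and picamera. The GPIO pin, resolution, and file naming below are assumptions, not details from Joe’s build; the stepper motor advancing the film would be driven separately:

```python
# Sketch of frame-by-frame capture triggered by the projector's reed switch.
# Pin number, resolution, and paths are assumptions, not Joe's actual setup.
from itertools import count
from signal import pause

from gpiozero import Button
from picamera import PiCamera

reed_switch = Button(4)            # the switch closes once per frame as the reel advances
camera = PiCamera(resolution=(1640, 1232))
frame_number = count()

def capture_frame():
    camera.capture(f"/home/pi/reel/frame_{next(frame_number):05d}.jpg")

reed_switch.when_pressed = capture_frame
pause()                            # wait for the projector to feed frames past the lens
```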

Pi Film Capture device by Joe Herman

Joe’s 16-mm projector, with embedded Raspberry Pi hardware

Software solutions

In addition to capturing every single frame (sometimes with multiple exposure settings), Joe found that he needed intensive post-processing to restore some of the films. He settled on sending the images from the Pi to a more powerful Linux machine. To enable processing of the raw data, he had to write Python scripts that tie together several open-source software packages. For example, to deal with the varying quality of the film reels more easily, Joe implemented a GUI (written with the help of PyQt), which he uses to change the capture parameters. This was a demanding job, as he was relatively new to using these tools.

Top half of GUI for Pi Film Capture Joe Herman

The top half of Joe’s GUI, because the whole thing is really long and really thin and would have looked weird on the blog…

If a frame is particularly damaged, Joe can capture multiple instances of the image at different settings. These are then merged to achieve a good-quality image using OpenCV functionality. Joe uses FFmpeg to stitch the captured images back together into a film. Some of his grandfather’s reels were badly degraded, but luckily Joe found scripts written by other people to perform advanced digital restoration of film with AviSynth. He provides code he has written for the project on his GitHub account.
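
To give a flavour of that pipeline, here is a minimal sketch, with hypothetical file names and settings, of merging bracketed exposures using OpenCV’s Mertens exposure fusion and then calling FFmpeg to assemble the frames. Joe’s real scripts on GitHub are considerably more involved:

```python
# Sketch of the merge-and-stitch stage.
# File names, frame rate, and codec settings are assumptions, not Joe's values.
import subprocess
import cv2

# Merge several exposures of the same damaged frame into one balanced image.
exposures = [cv2.imread(p) for p in ("frame_0001_a.png",
                                     "frame_0001_b.png",
                                     "frame_0001_c.png")]
fused = cv2.createMergeMertens().process(exposures)     # float image scaled to [0, 1]
cv2.imwrite("merged_0001.png", (fused * 255).astype("uint8"))

# Stitch the merged frames back into a film; old 8mm reels typically run at 16-18 fps.
subprocess.run([
    "ffmpeg", "-framerate", "18",
    "-i", "merged_%04d.png",
    "-c:v", "libx264", "-pix_fmt", "yuv420p",
    "restored_reel.mp4",
])
```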

For an account of the project in his own words, check out Joe’s guest post on the IEEE Spectrum website. He also described some of the issues he encountered, and how he resolved them, in The MagPi.

What does Pi Film Capture deliver?

Joe provides videos related to Pi Film Capture on two sites: on his YouTube channel, you’ll find videos in which he has documented the build process of his digitising project. Final results of the project live on Joe’s Vimeo channel, where so far he has uploaded 55 digitised home videos.

m093a: Tom Herman Wedding, Detroit 8/10/63

Shot on 8mm by Leo Willmott, captured and restored by Joe Herman (Not a Wozniak film, but placed in that folder b/c it may be of interest to Hermans)

We’re beyond pleased that our tech is part of this amazing project, helping to reconnect the entire Herman/Willmott clan with their past. And it was great to be able to catch up with Joe, and talk about his build at Maker Faire last year!

Maker Faire New York 2017

We’ll be at Maker Faire New York again on the 23-24 September, and we can’t wait to see the amazing makes the Raspberry Pi community will be presenting there!

Are you going to be at MFNY to show off your awesome Pi-powered project? Tweet us, so we can meet up, check it out and share your achievements!

The post Digitising film reels with Pi Film Capture appeared first on Raspberry Pi.