Tag Archives: zero

Derek Woodroffe’s steampunk tentacle hat

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/steampunk-tentacle-hat/

Halloween: that glorious time of year when you’re officially allowed to make your friends jump out of their skin with your pranks. For those among us who enjoy dressing up, Halloween is also the occasion to go all out with costumes. And so, dear reader, we present to you: a steampunk tentacle hat, created by Derek Woodroffe.

Finished tentacle hat

Extreme Electronics

Derek is an engineer who loves all things electronics. He’s part of Extreme Kits, and he runs the website Extreme Electronics. Raspberry Pi Zero-controlled Tesla coils are Derek’s speciality — he’s even been on one of the Royal Institution’s Christmas Lectures with them! Skip ahead to 15:06 in this video to see Derek in action:

Let There Be Light! // 2016 CHRISTMAS LECTURES with Saiful Islam – Lecture 1

The first Lecture from Professor Saiful Islam’s 2016 series of CHRISTMAS LECTURES, ‘Supercharged: Fuelling the future’. Watch all three Lectures here: http://richannel.org/christmas-lectures 2016 marked the 80th anniversary since the BBC first broadcast the Christmas Lectures on TV. To celebrate, chemist Professor Saiful Islam explores a subject that the lectures’ founder – Michael Faraday – addressed in the very first Christmas Lectures – energy.

Wearables

Wearables are electronically augmented items you can wear. They might take the form of spy eyeglasses, clothes with integrated sensors, or, in this case, headgear adorned with mechanised tentacles.

Why did Derek make this? We’re not entirely sure, but we suspect he’s a fan of the Cthulhu mythos. In any case, we were a little astounded by his project. This is how we reacted when Derek tweeted us about it:

Raspberry Pi on Twitter

@ExtElec @extkits This is beyond incredible and completely unexpected.

In fact, we had to recover from a fit of laughter before we actually managed to type this answer.

Making a steampunk tentacle hat

Derek made the ‘skeleton’ of each tentacle out of a net curtain spring, acrylic rings, and four lengths of fishing line. Each of two servomotors connects to two of the fishing lines and pulls on them to move the tentacle.

net curtain spring and acrylic rings forming a mechanical tentacle skeleton - steampunk tentacle hat by Derek Woodroffe
Two servos connecting to lengths of fishing line - steampunk tentacle hat by Derek Woodroffe

Then he covered the tentacles with nylon stockings and liquid latex, glued suckers cut out of MDF onto them, and mounted them on an acrylic base. The eight motors connect to a Raspberry Pi via an I2C 8-port PWM controller board.

artificial tentacles - steampunk tentacle hat by Derek Woodroffe
8 servomotors connected to a controller board and a Raspberry Pi - steampunk tentacle hat by Derek Woodroffe

The Pi makes the servos pull the tentacles so that they move in sine waves in both the x and y directions, seemingly of their own accord. Derek cut open the top of a hat to insert the mounted tentacles, and he used more liquid latex to give the whole thing a slimy-looking finish.
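For a sense of how that kind of motion can be driven, here is a minimal Python sketch. This is not Derek’s actual code, and the post doesn’t name the controller board, so set_channel_pulse below is a hypothetical stand-in for whatever call the board’s library provides:

```python
# A rough sketch of sine-wave servo motion; set_channel_pulse is a
# hypothetical stand-in for the I2C PWM controller library's call.
import math
import time

def set_channel_pulse(channel: int, position: float) -> None:
    """Stand-in for the PWM board call; position runs from 0.0 to 1.0."""
    print(f"servo {channel}: {position:.2f}")

while True:
    t = time.time()
    for servo in range(8):                  # eight servos, two per tentacle
        phase = servo * math.pi / 4         # offset so the waves don't sync
        set_channel_pulse(servo, 0.5 + 0.5 * math.sin(t + phase))
    time.sleep(0.05)
```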

steampunk tentacle hat by Derek Woodroffe

Iä! Iä! Cthulhu fhtagn!

You can read more about Derek’s steampunk tentacle hat here. He will be at the Beeston Raspberry Jam in November to show off his build, so if you’re in the Nottingham area, why not drop by?

Wearables for Halloween

This build is already pretty creepy, but just imagine it with a sensor- or camera-powered upgrade that makes the tentacles reach for people nearby. You’d have nightmare fodder for weeks.

With the help of the Raspberry Pi, any Halloween costume can be taken to the next level. How could Pi technology help you to win that coveted ‘Scariest costume’ prize this year? Tell us your ideas in the comments, and be sure to share pictures of you in your get-up with us on Twitter, Facebook, or Instagram.

The post Derek Woodroffe’s steampunk tentacle hat appeared first on Raspberry Pi.

N O D E’s Handheld Linux Terminal

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/n-o-d-es-handheld-linux-terminal/

Fit an entire Raspberry Pi-based laptop into your pocket with N O D E’s latest Handheld Linux Terminal build.

The Handheld Linux Terminal Version 3 (Portable Pi 3)

Hey everyone. Today I want to show you the new version 3 of the Handheld Linux Terminal. It’s taken a long time, but I’m finally finished. This one takes all the things I’ve learned so far, and improves on many of the features from the previous iterations.

N O D E

With interests in modding tech, exploring the boundaries of the digital world, and open source, YouTuber N O D E has become one to watch within the digital maker world. He maintains a channel focused on “the transformative power of technology.”

“Understanding that electronics isn’t voodoo is really powerful”, he explains in his Patreon video. “And learning how to build your own stuff opens up so many possibilities.”

NODE Youtube channel logo - Handheld Linux Terminal v3

The topics of his videos range from stripped-down devices, upgraded tech, and security upgrades, to the philosophy behind technology. He also provides weekly roundups of, and discussions about, new releases.

Essentially, if you like technology, you’ll like N O D E.

Handheld Linux Terminal v3

Subscribers to N O D E’s YouTube channel, of whom there are currently over 44,000, will have seen him documenting variations of this handheld build over the last year. By stripping down a Raspberry Pi 3 and incorporating a Zero W, he’s been able to create interesting projects while always putting functionality first.

Handheld Linux Terminal v3

With the third version of his terminal, N O D E has taken experiences gained from previous builds to create something of which he’s obviously extremely proud. And so he should be. The v3 handheld is impressively small considering he managed to incorporate a fully functional keyboard with mouse, a 3.5″ screen, and a fan within the 3D-printed body.

Handheld Linux Terminal v3

“The software side of things is where it really shines though, and the Pi 3 is more than capable of performing most non-intensive tasks,” N O D E goes on to explain. He demonstrates various applications running on Raspbian, plus other operating systems he has pre-loaded onto additional SD cards:

“I have also installed Exagear Desktop, which allows it to run x86 apps too, and this works great. I have x86 apps such as Sublime Text and Spotify running without any problems, and it’s technically possible to use Wine to also run Windows apps on the device.”

We think this is an incredibly neat build, and we can’t wait to see where N O D E takes it next!

The post N O D E’s Handheld Linux Terminal appeared first on Raspberry Pi.

IoT Cybersecurity: What’s Plan B?

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2017/10/iot_cybersecuri.html

In August, four US Senators introduced a bill designed to improve Internet of Things (IoT) security. The IoT Cybersecurity Improvement Act of 2017 is a modest piece of legislation. It doesn’t regulate the IoT market. It doesn’t single out any industries for particular attention, or force any companies to do anything. It doesn’t even modify the liability laws for embedded software. Companies can continue to sell IoT devices with whatever lousy security they want.

What the bill does do is leverage the government’s buying power to nudge the market: any IoT product that the government buys must meet minimum security standards. It requires vendors to ensure that devices can not only be patched, but are patched in an authenticated and timely manner; don’t have unchangeable default passwords; and are free from known vulnerabilities. It’s about as low a security bar as you can set, and that it will considerably improve security speaks volumes about the current state of IoT security. (Full disclosure: I helped draft some of the bill’s security requirements.)

The bill would also modify the Computer Fraud and Abuse and the Digital Millennium Copyright Acts to allow security researchers to study the security of IoT devices purchased by the government. It’s a far narrower exemption than our industry needs. But it’s a good first step, which is probably the best thing you can say about this legislation.

However, it’s unlikely this first step will even be taken. I am writing this column in August, and have no doubt that the bill will have gone nowhere by the time you read it in October or later. If hearings are held, they won’t matter. The bill won’t have been voted on by any committee, and it won’t be on any legislative calendar. The odds of this bill becoming law are zero. And that’s not just because of current politics — I’d be equally pessimistic under the Obama administration.

But the situation is critical. The Internet is dangerous — and the IoT gives it not just eyes and ears, but also hands and feet. Security vulnerabilities, exploits, and attacks that once affected only bits and bytes now affect flesh and blood.

Markets, as we’ve repeatedly learned over the past century, are terrible mechanisms for improving the safety of products and services. It was true for automobile, food, restaurant, airplane, fire, and financial-instrument safety. The reasons are complicated, but basically, sellers don’t compete on safety features because buyers can’t efficiently differentiate products based on safety considerations. The race-to-the-bottom mechanism that markets use to minimize prices also minimizes quality. Without government intervention, the IoT remains dangerously insecure.

The US government has no appetite for intervention, so we won’t see serious safety and security regulations, a new federal agency, or better liability laws. We might have a better chance in the EU. Depending on how the General Data Protection Regulation on data privacy pans out, the EU might pass a similar security law in 5 years. No other country has a large enough market share to make a difference.

Sometimes we can opt out of the IoT, but that option is becoming increasingly rare. Last year, I tried and failed to purchase a new car without an Internet connection. In a few years, it’s going to be nearly impossible to not be multiply connected to the IoT. And our biggest IoT security risks will stem not from devices we have a market relationship with, but from everyone else’s cars, cameras, routers, drones, and so on.

We can try to shop our ideals and demand more security, but companies don’t compete on IoT safety — and we security experts aren’t a large enough market force to make a difference.

We need a Plan B, although I’m not sure what that is. E-mail me if you have any ideas.

This essay previously appeared in the September/October issue of IEEE Security & Privacy.

What’s new in HiveMQ 3.3

Post Syndicated from The HiveMQ Team original https://www.hivemq.com/whats-new-in-hivemq-3-3

We are pleased to announce the release of HiveMQ 3.3. This version of HiveMQ is the most advanced and user-friendly version of HiveMQ ever. A broker is the heart of every MQTT deployment, and it’s key to monitor and understand how healthy your system and your connected clients are. Version 3.3 of HiveMQ focuses on observability, usability, and advanced administration features, and introduces a brand-new Web UI. This version is a drop-in replacement for HiveMQ 3.2 and of course supports rolling upgrades with zero downtime.

HiveMQ 3.3 brings many features that your users, administrators and plugin developers are going to love. These are the highlights:

Web UI

Web UI
The new HiveMQ version has a built-in Web UI for advanced analysis and administrative tasks. A powerful dashboard shows important data about the health of the broker cluster and an overview of the whole MQTT deployment.
With the new Web UI, administrators are able to drill down to specific client information and can perform administrative actions like disconnecting a client. Advanced analytics functionality allows identifying clients with irregular behavior. It’s easy to spot message-dropping clients, as HiveMQ shows detailed statistics for such misbehaving MQTT participants.
Of course all Web UI features work at scale with more than a million connected MQTT clients. Learn more about the Web UI in the documentation.

Time To Live

TTL
HiveMQ introduces Time to Live (TTL) on various levels of the MQTT lifecycle. Automatic cleanup of expired messages is supported, as is the wiping of abandoned persistent MQTT sessions. In particular, version 3.3 implements the following TTL features:

  • MQTT client session expiration
  • Retained Message expiration
  • MQTT PUBLISH message expiration

Configuring a TTL for MQTT client sessions and retained messages frees system resources as soon as the data is no longer needed, without any manual administrative intervention.
Besides the global configuration, individual MQTT PUBLISH messages can have their own TTLs based on application-specific characteristics. It’s a breeze to change the TTL of particular messages with the HiveMQ plugin system. As soon as a message’s TTL expires, the broker won’t send the message out anymore, even if it was previously queued or in flight. This can save precious bandwidth on mobile connections, as unnecessary traffic for expired messages is avoided.
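Conceptually, the expiry check itself is simple. Here is a language-agnostic sketch in Python of the idea described above; it illustrates the concept only and is not HiveMQ’s plugin API:

```python
# Concept sketch: an expired message is dropped instead of delivered,
# even if it was already queued. Not HiveMQ's actual API.
import time

def should_deliver(enqueued_at: float, ttl_seconds: float) -> bool:
    return time.time() - enqueued_at <= ttl_seconds

# (enqueued timestamp, per-message TTL in seconds)
queued = [(time.time() - 120, 60), (time.time() - 10, 60)]
for enqueued_at, ttl in queued:
    print("deliver" if should_deliver(enqueued_at, ttl) else "drop (expired)")
```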

Trace Recordings

Trace Recordings
Debugging specific MQTT clients or groups of MQTT clients can be challenging at scale. HiveMQ 3.3 introduces an innovative Trace Recording mechanism that creates detailed recordings of all interactions from clients matching given filters.
It’s possible to filter based on client identifiers, MQTT message types, and topics. And best of all: you can use regular expressions to select multiple MQTT clients at once, as well as topics with complex structures. Getting detailed information about the behavior of specific MQTT clients for debugging complex issues was never easier.

Native SSL

Native SSL
The new native SSL integration of HiveMQ brings a performance boost of more than 40% for SSL handshakes (in terms of CPU usage) by integrating with BoringSSL, Google’s fork of OpenSSL that is also used in Google Chrome and Android. Besides the CPU savings and substantial memory optimizations (up to 60% less Java heap), HiveMQ now supports additional state-of-the-art cipher suites that are not directly available in Java (like ChaCha20-Poly1305).
Most HiveMQ deployments on Linux systems should see decreased CPU load for TLS handshakes and substantially lower memory usage with the native SSL integration.

New Plugin System Features

New Plugin System Features
The popular and powerful plugin system has received additional services and callbacks which are useful for many existing and future plugins.
Plugin developers can now use a ConnectionAttributeStore and a SessionAttributeStore for storing arbitrary data for the lifetime of a single MQTT connection of a client or for the whole session of a client. The new ClientGroupService allows grouping different MQTT client identifiers by the same key, so it’s easy to address multiple MQTT clients (with the same group) at once.

A new callback was introduced which notifies a plugin when a HiveMQ instance is ready, meaning the instance is part of the cluster and all listeners have started successfully. Developers can also react, via a dedicated callback, when an MQTT client session is ready and usable in the cluster.

Some use cases require modifying an MQTT PUBLISH packet before it’s sent out to a client. This is now possible with a new callback that allows modifying a PUBLISH before it is sent to an individual client.
The offline queue size for persistent clients, as well as the queue discard strategy, is now also configurable for individual clients.

Additional Features

Additional Features
HiveMQ 3.3 has many additional features designed for power users and professional MQTT deployments. The new version also has the following highlights:

  • OCSP Stapling
  • Event Log for MQTT client connects, disconnects, and unusual events (e.g. messages discarded due to slow consumption on the client side)
  • Throttling of concurrent TLS handshakes
  • Connect Packet overload protection
  • Configuration of Socket send and receive buffer sizes
  • Global System Information like the HiveMQ Home folder can now be set via Environment Variables without changing the run script
  • The internal HTTP server of HiveMQ is now exposed to the holistic monitoring subsystem
  • Many additional useful metrics were exposed to HiveMQ’s monitoring subsystem


In order to upgrade to HiveMQ 3.3 from HiveMQ 3.2 or older versions, take a look at our Upgrade Guide.
Don’t forget to learn more about all the new features with our HiveMQ User Guide.

Download HiveMQ 3.3 now

Security Flaw in Infineon Smart Cards and TPMs

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2017/10/security_flaw_i_1.html

A security flaw in Infineon smart cards and TPMs allows an attacker to recover private keys from the public keys. Basically, the key generation algorithm sometimes creates public keys that are vulnerable to Coppersmith’s attack:

While all keys generated with the library are much weaker than they should be, it’s not currently practical to factorize all of them. For example, 3072-bit and 4096-bit keys aren’t practically factorable. But oddly enough, the theoretically stronger, longer 4096-bit key is much weaker than the 3072-bit key and may fall within the reach of a practical (although costly) factorization if the researchers’ method improves.

To spare time and cost, attackers can first test a public key to see if it’s vulnerable to the attack. The test is inexpensive, requires less than 1 millisecond, and its creators believe it produces practically zero false positives and zero false negatives. The fingerprinting allows attackers to expend effort only on keys that are practically factorizable.
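For the curious, here is a rough Python sketch of the fingerprinting idea as we understand it from the researchers’ description: affected keys satisfy N ≡ 65537^k (mod M) for a primorial M, so N mod each small prime r dividing M must lie in the subgroup generated by 65537. The prime list below is illustrative only, not the researchers’ actual parameters; use their published tool for real checks:

```python
# Rough sketch of the ROCA fingerprint idea -- illustrative only.
# The prime list is a made-up sample, not the real detection set.
from sympy import n_order

SMALL_PRIMES = [11, 13, 17, 19, 37, 53, 61, 71, 73, 79, 97, 103]

def matches_fingerprint(modulus: int) -> bool:
    for r in SMALL_PRIMES:
        d = n_order(65537, r)          # multiplicative order of 65537 mod r
        # In a cyclic group, x lies in the subgroup <g> iff x^ord(g) == 1,
        # so this tests whether N mod r is a power of 65537 mod r.
        if pow(modulus % r, d, r) != 1:
            return False               # definitely not a fingerprinted key
    return True                        # matches; the key may be vulnerable
```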

This is the flaw in the Estonian national ID card we learned about last month.

The paper isn’t online yet. I’ll post it when it is.

Ouch. This is a bad vulnerability, and it’s in systems — like the Estonian national ID card — that are critical.

Millions of high-security crypto keys crippled by newly discovered flaw (Ars Technica)

Post Syndicated from jake original https://lwn.net/Articles/736520/rss

Ars Technica is reporting on a flaw in the RSA library developed by Infineon that drastically reduces the amount of work needed to discover a private key from its corresponding public key. This flaw, dubbed “ROCA”, mainly affects key pairs that have been generated on keycards. “While all keys generated with the library are much weaker than they should be, it’s not currently practical to factorize all of them. For example, 3072-bit and 4096-bit keys aren’t practically factorable. But oddly enough, the theoretically stronger, longer 4096-bit key is much weaker than the 3072-bit key and may fall within the reach of a practical (although costly) factorization if the researchers’ method improves.

To spare time and cost, attackers can first test a public key to see if it’s vulnerable to the attack. The test is inexpensive, requires less than 1 millisecond, and its creators believe it produces practically zero false positives and zero false negatives. The fingerprinting allows attackers to expend effort only on keys that are practically factorizable. The researchers have already used the method successfully to identify weak keys, and they have provided a tool here to test if a given key was generated using the faulty library. A blog post with more details is here.”

New KRACK Attack Against Wi-Fi Encryption

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2017/10/new_krack_attac.html

Mathy Vanhoef has just published a devastating attack against WPA2, the 14-year-old encryption protocol used by pretty much all wi-fi systems. It’s an interesting attack, where the attacker forces the protocol to reuse a key. The authors call this attack KRACK, for Key Reinstallation Attacks.

This is yet another in a series of marketed attacks, with a cool name, a website, and a logo. The Q&A on the website answers a lot of questions about the attack and its implications. And there’s lots of good information in this Ars Technica article.

There is an academic paper, too:

“Key Reinstallation Attacks: Forcing Nonce Reuse in WPA2,” by Mathy Vanhoef and Frank Piessens.

Abstract: We introduce the key reinstallation attack. This attack abuses design or implementation flaws in cryptographic protocols to reinstall an already-in-use key. This resets the key’s associated parameters such as transmit nonces and receive replay counters. Several types of cryptographic Wi-Fi handshakes are affected by the attack. All protected Wi-Fi networks use the 4-way handshake to generate a fresh session key. So far, this 14-year-old handshake has remained free from attacks, and is even proven secure. However, we show that the 4-way handshake is vulnerable to a key reinstallation attack. Here, the adversary tricks a victim into reinstalling an already-in-use key. This is achieved by manipulating and replaying handshake messages. When reinstalling the key, associated parameters such as the incremental transmit packet number (nonce) and receive packet number (replay counter) are reset to their initial value. Our key reinstallation attack also breaks the PeerKey, group key, and Fast BSS Transition (FT) handshake. The impact depends on the handshake being attacked, and the data-confidentiality protocol in use. Simplified, against AES-CCMP an adversary can replay and decrypt (but not forge) packets. This makes it possible to hijack TCP streams and inject malicious data into them. Against WPA-TKIP and GCMP the impact is catastrophic: packets can be replayed, decrypted, and forged. Because GCMP uses the same authentication key in both communication directions, it is especially affected.

Finally, we confirmed our findings in practice, and found that every Wi-Fi device is vulnerable to some variant of our attacks. Notably, our attack is exceptionally devastating against Android 6.0: it forces the client into using a predictable all-zero encryption key.

I’m just reading about this now, and will post more information as I learn it.

EDITED TO ADD: More news.

EDITED TO ADD: This meets my definition of brilliant. The attack is blindingly obvious once it’s pointed out, but for over a decade no one noticed it.

EDITED TO ADD: Matthew Green has a blog post on what went wrong. The vulnerability is in the interaction between two protocols. At a meta level, he blames the opaque IEEE standards process:

One of the problems with IEEE is that the standards are highly complex and get made via a closed-door process of private meetings. More importantly, even after the fact, they’re hard for ordinary security researchers to access. Go ahead and google for the IETF TLS or IPSec specifications — you’ll find detailed protocol documentation at the top of your Google results. Now go try to Google for the 802.11i standards. I wish you luck.

The IEEE has been making a few small steps to ease this problem, but they’re hyper-timid incrementalist bullshit. There’s an IEEE program called GET that allows researchers to access certain standards (including 802.11) for free, but only after they’ve been public for six months — coincidentally, about the same time it takes for vendors to bake them irrevocably into their hardware and software.

This whole process is dumb and — in this specific case — probably just cost industry tens of millions of dollars. It should stop.

Nicholas Weaver explains why most people shouldn’t worry about this:

So unless your Wi-Fi password looks something like a cat’s hairball (e.g. “:SNEIufeli7rc” — which is not guessable with a few million tries by a computer), a local attacker had the capability to determine the password, decrypt all the traffic, and join the network before KRACK.

KRACK is, however, relevant for enterprise Wi-Fi networks: networks where you needed to accept a cryptographic certificate to join initially and have to provide both a username and password. KRACK represents a new vulnerability for these networks. Depending on some esoteric details, the attacker can decrypt encrypted traffic and, in some cases, inject traffic onto the network.

But in none of these cases can the attacker join the network completely. And the most significant of these attacks affects Linux devices and Android phones; they don’t affect Macs, iPhones, or Windows systems. Even when feasible, these attacks require physical proximity: an attacker on the other side of the planet can’t exploit KRACK, only an attacker in the parking lot can.

Some notes on the KRACK attack

Post Syndicated from Robert Graham original http://blog.erratasec.com/2017/10/some-notes-on-krack-attack.html

This is my interpretation of the KRACK attacks paper that describes a way of decrypting encrypted WiFi traffic with an active attack.

tl;dr: Wow. Everyone needs to be afraid. (Well, worried — not panicked.) It means in practice, attackers can decrypt a lot of wifi traffic, with varying levels of difficulty depending on your precise network setup. My post last July about the DEF CON network being safe was in error.

Details

This is not a crypto bug but a protocol bug (a pretty obvious and trivial protocol bug).
When a client connects to the network, the access-point will at some point send random “key” data to use for encryption. Because this packet may be lost in transmission, it can be repeated many times.
What the hacker does is just repeatedly send this packet, potentially hours later. Each time it’s resent, it resets the “keystream” back to its starting conditions. The obvious patch that device vendors will make is to accept only the first such packet received and ignore all the duplicates.
At this point, the protocol bug becomes a crypto bug. We know how to break crypto when we have two keystreams from the same starting position. It’s not always reliable, but reliable enough that people need to be afraid.
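To see why two keystreams from the same starting position are so dangerous, consider this toy Python example. It’s a generic stream-cipher illustration, not WPA2 itself: XORing two ciphertexts that share a keystream cancels the keystream entirely.

```python
# Toy illustration of keystream reuse (generic stream cipher, not WPA2).
from itertools import cycle

def xor_bytes(data: bytes, keystream: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, cycle(keystream)))

keystream = b"\x13\x37\xc0\xde" * 8      # stand-in for a cipher's keystream
p1 = b"attack at dawn, bring snacks"
p2 = b"the password is hunter2 okay"
c1 = xor_bytes(p1, keystream)
c2 = xor_bytes(p2, keystream)

# The keystream cancels: an eavesdropper learns p1 XOR p2 without any key,
# and any known or guessed plaintext then reveals the other message.
assert xor_bytes(c1, c2) == xor_bytes(p1, p2)
assert xor_bytes(c1, xor_bytes(c2, p2)) == p1
```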
Android, though, is the biggest danger. Rather than simply replaying the packet, a packet with key data of all zeroes can be sent. This allows attackers to set up a fake WiFi access-point and man-in-the-middle all traffic.
In a related case, the access-point/base-station can sometimes also be attacked, affecting the stream sent to the client.
Not only is sniffing possible, but in some limited cases, injection. This allows the traditional attack of adding bad code to the end of HTML pages in order to trick users into installing a virus.

This is an active attack, not a passive attack, so in theory, it’s detectable.

Who is vulnerable?

Everyone, pretty much.
The hacker only needs to be within range of your WiFi. Your neighbor’s teenage kid is going to be downloading and running the tool in order to eavesdrop on your packets.
The hacker doesn’t need to be logged into your network.
It affects all WPA1/WPA2: both the personal version with passwords that we use at home, and the enterprise version with certificates that we use in enterprises.
It can’t defeat SSL/TLS or VPNs. Thus, if you feel your laptop is safe surfing the public WiFi at airports, then your laptop is still safe from this attack. With Android, it does allow running tools like sslstrip, which can fool many users.
Your home network is vulnerable. Many devices will be using SSL/TLS, so they are fine, like your Amazon Echo, which you can continue to use without worrying about this attack. Other devices, like your Philips light bulbs, may not be so protected.

How can I defend myself?

Patch.
More to the point, measure your current vendors by how long it takes them to patch. Throw away gear by those vendors that took a long time to patch and replace it with vendors that took a short time.
High-end access-points that contain “WIPS” (WiFi Intrusion Prevention System) features should be able to detect this and block vulnerable clients from connecting to the network (once the vendor upgrades the systems, of course). Even low-end access-points, like the $30 ones you get for home, can easily be updated to prevent packet sequence numbers from going back to the start (i.e. to prevent the keystream from resetting).
At some point, you’ll need to run the attack against yourself, to make sure all your devices are secure. Since you’ll be constantly allowing random phones to connect to your network, you’ll need to check their vulnerability status before connecting them. You’ll need to continue doing this for several years.
Of course, if you are using SSL/TLS for everything, then your danger is mitigated. This is yet another reason why you should be using SSL/TLS for internal communications.
Most security vendors will add things to their products/services to defend you. While valuable in some cases, it’s not a defense. The defense is patching the devices you know about, and preventing vulnerable devices from attaching to your network.
If I remember correctly, DEF CON uses Aruba. Aruba gear includes WIPS functionality, which means that by the time DEF CON rolls around again next year, they should have the ability to deny vulnerable devices from connecting, and specifically to detect an attack in progress and prevent further communication.
However, for an attacker near an Android device using low-powered WiFi, it’s likely they will be able to conduct a man-in-the-middle attack without any WIPS preventing them.

Coaxing 2D platforming out of Unity

Post Syndicated from Eevee original https://eev.ee/blog/2017/10/13/coaxing-2d-platforming-out-of-unity/

An anonymous donor asked a question that I can’t even begin to figure out how to answer, but they also said anything else is fine, so here’s anything else.

I’ve been avoiding writing about game physics, since I want to save it for ✨ the book I’m writing ✨, but that book will almost certainly not touch on Unity. Here, then, is a brief run through some of the brick walls I ran into while trying to convince Unity to do 2D platforming.

This is fairly high-level — there are no blocks of code or helpful diagrams. I’m just getting this out of my head because it’s interesting. If you want more gritty details, I guess you’ll have to wait for ✨ the book ✨.

The setup

I hadn’t used Unity before. I hadn’t even used a “real” physics engine before. My games so far have mostly used LÖVE, a Lua-based engine. LÖVE includes box2d bindings, but for various reasons (not all of them good), I opted to avoid them and instead write my own physics completely from scratch. (How, you ask? ✨ Book ✨!)

I was invited to work on a Unity project, Chaos Composer, that someone else had already started. It had basic movement already implemented; I taught myself Unity’s physics system by hacking on it. It’s entirely possible that none of this is actually the best way to do anything, since I was really trying to reproduce my own homegrown stuff in Unity, but it’s the best I’ve managed to come up with.

Two recurring snags were that you can’t ask Unity to do multiple physics updates in a row, and sometimes getting the information I wanted was difficult. Working with my own code spoiled me a little, since I could invoke it at any time and ask it anything I wanted; Unity, on the other hand, is someone else’s black box with a rigid interface on top.

Also, wow, Googling for a lot of this was not quite as helpful as expected. A lot of what’s out there is just the first thing that works, and often that’s pretty hacky and imposes severe limits on the game design (e.g., “this won’t work with slopes”). Basic movement and collision are the first thing you do, which seems to me like the worst time to be locking yourself out of a lot of design options. I tried very (very, very, very) hard to minimize those kinds of constraints.

Problem 1: Movement

When I showed up, movement was already working. Problem solved!

Like any good programmer, I immediately set out to un-solve it. Given a “real” physics engine like the one Unity prominently features, you have two options: ⓐ treat the player as a physics object, or ⓑ don’t. The existing code went with option ⓑ, like I’d done myself with LÖVE, and like I’d seen countless people advise. Using a physics sim makes for bad platforming.

But… why? I believed it, but I couldn’t concretely defend it. I had to know for myself. So I started a blank project, drew some physics boxes, and wrote a dozen-line player controller.

Ah! Immediate enlightenment.

If the player was sliding down a wall, and I tried to move them into the wall, they would simply freeze in midair until I let go of the movement key. The trouble is that the physics sim works in terms of forces — moving the player involves giving them a nudge in some direction, like a giant invisible hand pushing them around the level. Surprise! If you press a real object against a real wall with your real hand, you’ll see the same effect — friction will cancel out gravity, and the object will stay in midair.

Platformer movement, as it turns out, doesn’t make any goddamn physical sense. What is air control? What are you pushing against? Nothing, really; we just have it because it’s nice to play with, because not having it is a nightmare.

I looked to see if there were any common solutions to this, and I only really found one: make all your walls frictionless.

Game development is full of hacks like this, and I… don’t like them. I can accept that minor hacks are necessary sometimes, but this one makes an early and widespread change to a fundamental system to “fix” something that was wrong in the first place. It also imposes an “invisible” requirement, something I try to avoid at all costs — if you forget to make a particular wall frictionless, you’ll never know unless you happen to try sliding down it.

And so, I swiftly returned to the existing code. It wasn’t too different from what I’d come up with for LÖVE: it applied gravity by hand, tracked the player’s velocity, computed the intended movement each frame, and moved by that amount. The interesting thing was that it used MovePosition, which schedules a movement for the next physics update and stops the movement if the player hits something solid.

It’s kind of a nice hybrid approach, actually; all the “physics” for conscious actors is done by hand, but the physics engine is still used for collision detection. It’s also used for collision rejection — if the player manages to wedge themselves several pixels into a solid object, for example, the physics engine will try to gently nudge them back out of it with no extra effort required on my part. I still haven’t figured out how to get that to work with my homegrown stuff, which is built to prevent overlap rather than to jiggle things out of it.

But wait, what about…

Our player is a dynamic body with rotation lock and no gravity. Why not just use a kinematic body?

I must be missing something, because I do not understand the point of kinematic bodies. I ran into this with Godot, too, which documented them the same way: as intended for use as players and other manually-moved objects. But by default, they don’t even collide with other kinematic bodies or static geometry. What? There’s a checkbox to turn this on, which I enabled, but then I found out that MovePosition doesn’t stop kinematic bodies when they hit something, so I would’ve had to cast along the intended path of movement to figure out when to stop, thus duplicating the same work the physics engine was about to do.

But that’s impossible anyway! Static geometry generally wants to be made of edge colliders, right? They don’t care about concave/convex. Imagine the player is standing on the ground near a wall and tries to move towards the wall. Both the ground and the wall are different edges from the same edge collider.

If you try to cast the player’s hitbox horizontally, parallel to the ground, you’ll only get one collision: the existing collision with the ground. Casting doesn’t distinguish between touching and hitting. And because Unity only reports one collision per collider, and because the ground will always show up first, you will never find out about the impending wall collision.

So you’re forced to either use raycasts for collision detection or decomposed polygons for world geometry, both of which are slightly worse tools for no real gain.

I ended up sticking with a dynamic body.


Oh, one other thing that doesn’t really fit anywhere else: keep track of units! If you’re adding something called “velocity” directly to something called “position”, something has gone very wrong. Acceleration is distance per time squared; velocity is distance per time; position is distance. You must multiply or divide by time to convert between them.

I never even, say, add a constant directly to position every frame; I always phrase it as velocity and multiply by Δt. It keeps the units consistent: time is always in seconds, not in tics.
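A tiny sketch of what that looks like in practice (plain Python, nothing Unity-specific):

```python
# Units stay consistent: acceleration (units/s^2) * dt -> velocity (units/s),
# velocity (units/s) * dt -> position (units). Time is in seconds, not tics.
def step(pos: float, vel: float, acc: float, dt: float):
    vel += acc * dt
    pos += vel * dt
    return pos, vel

pos, vel = 0.0, 0.0
for _ in range(60):                 # one second of falling at 60 updates/sec
    pos, vel = step(pos, vel, acc=-9.8, dt=1 / 60)
print(pos, vel)                     # roughly -4.98 and -9.8
```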

Problem 2: Slopes

Ah, now we start to get off in the weeds.

A sort of pre-problem here was detecting whether we’re on a slope, which means detecting the ground. The codebase originally used a manual physics query of the area around the player’s feet to check for the ground, which seems to be somewhat common, but that can’t tell me the angle of the detected ground. (It’s also kind of error-prone, since “around the player’s feet” has to be specified by hand and may not stay correct through animations or changes in the hitbox.)

I replaced that with what I’d eventually settled on in LÖVE: detect the ground by detecting collisions, and looking at the normal of the collision. A normal is a vector that points straight out from a surface, so if you’re standing on the ground, the normal points straight up; if you’re on a 10° incline, the normal points 10° away from straight up.

Not all collisions are with the ground, of course, so I assumed something is ground if the normal pointed away from gravity. (I like this definition more than “points upwards”, because it avoids assuming anything about the direction of gravity, which leaves some interesting doors open for later on.) That’s easily detected by taking the dot product — if it’s negative, the collision was with the ground, and I now have the normal of the ground.
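In code, the test is a single dot product. A minimal 2D sketch, in plain Python rather than Unity’s API:

```python
def is_ground(normal, gravity):
    """A contact is ground-like when its normal points away from gravity."""
    nx, ny = normal
    gx, gy = gravity
    return nx * gx + ny * gy < 0     # negative dot product -> ground

assert is_ground((0.0, 1.0), (0.0, -9.8))       # floor: normal up, gravity down
assert not is_ground((0.0, -1.0), (0.0, -9.8))  # ceiling: normal down
assert not is_ground((1.0, 0.0), (0.0, -9.8))   # wall: dot product is zero
```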

Actually doing this in practice was slightly tricky. With my LÖVE engine, I could cram this right into the middle of collision resolution. With Unity, not quite so much. I went through a couple iterations before I really grasped Unity’s execution order, which I guess I will have to briefly recap for this to make sense.

Unity essentially has two update cycles. It performs physics updates at fixed intervals for consistency, and updates everything else just before rendering. Within a single frame, Unity does as many fixed physics updates as it has spare time for (which might be zero, one, or more), then does a regular update, then renders. User code can implement either or both of Update, which runs during a regular update, and FixedUpdate, which runs just before Unity does a physics pass.

So my solution was:

  • At the very end of FixedUpdate, clear the actor’s “on ground” flag and ground normal.

  • During OnCollisionEnter2D and OnCollisionStay2D (which are called from within a physics pass), if there’s a collision that looks like it’s with the ground, set the “on ground” flag and ground normal. (If there are multiple ground collisions, well, good luck figuring out the best way to resolve that! At the moment I’m just taking the first and hoping for the best.)

That means there’s a brief window between the end of FixedUpdate and Unity’s physics pass during which a grounded actor might mistakenly believe it’s not on the ground, which is a bit of a shame, but there are very few good reasons for anything to be happening in that window.

Okay! Now we can do slopes.

Just kidding! First we have to do sliding.

When I first looked at this code, it didn’t apply gravity while the player was on the ground. I think I may have had some problems with detecting the ground as result, since the player was no longer pushing down against it? Either way, it seemed like a silly special case, so I made gravity always apply.

Lo! I was a fool. The player could no longer move.

Why? Because MovePosition does exactly what it promises. If the player collides with something, they’ll stop moving. Applying gravity means that the player is trying to move diagonally downwards into the ground, and so MovePosition stops them immediately.

Hence, sliding. I don’t want the player to actually try to move into the ground. I want them to move the unblocked part of that movement. For flat ground, that means the horizontal part, which is pretty much the same as discarding gravity. For sloped ground, it’s a bit more complicated!

Okay but actually it’s less complicated than you’d think. It can be done with some cross products fairly easily, but Unity makes it even easier with a couple casts. There’s a Vector3.ProjectOnPlane function that projects an arbitrary vector on a plane given by its normal — exactly the thing I want! So I apply that to the attempted movement before passing it along to MovePosition. I do the same thing with the current velocity, to prevent the player from accelerating infinitely downwards while standing on flat ground.
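The projection itself is simple enough to write by hand. Here is a plain-Python sketch in 2D of what ProjectOnPlane computes, assuming the normal is unit length:

```python
import math

def project_on_plane(v, n):
    """Remove from v its component along the unit plane normal n."""
    dot = v[0] * n[0] + v[1] * n[1]
    return (v[0] - dot * n[0], v[1] - dot * n[1])

# Gravity pulling straight down, projected onto a 45-degree slope: half the
# motion survives, directed along the slope instead of into it.
s = math.sin(math.radians(45))
print(project_on_plane((0.0, -1.0), (s, s)))   # roughly (0.5, -0.5)
```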

One other thing: I don’t actually use the detected ground normal for this. The player might be touching two ground surfaces at the same time, and I’d want to project on both of them. Instead, I use the player body’s GetContacts method, which returns contact points (and normals!) for everything the player is currently touching. I believe those contact points are tracked by the physics engine anyway, so asking for them doesn’t require any actual physics work.

(Looking at the code I have, I notice that I still only perform the slide for surfaces facing upwards — but I’d want to slide against sloped ceilings, too. Why did I do this? Maybe I should remove that.)

(Also, I’m pretty sure projecting a vector on a plane is non-commutative, which raises the question of which order the projections should happen in and what difference it makes. I don’t have a good answer.)

(I note that my LÖVE setup does something slightly different: it just tries whatever the movement ought to be, and if there’s a collision, then it projects — and tries again with the remaining movement. But I can’t ask Unity to do multiple moves in one physics update, alas.)

Okay! Now, slopes. But actually, with the above work done, slopes are most of the way there already.

One obvious problem is that the player tries to move horizontally even when on a slope, and the easy fix is to change their movement from speed * Vector2.right to speed * new Vector2(ground.y, -ground.x) while on the ground. That’s the ground normal rotated a quarter-turn clockwise, so for flat ground it still points to the right, and in general it points rightwards along the ground. (Note that it assumes the ground normal is a unit vector, but as far as I’m aware, that’s true for all the normals Unity gives you.)

Another issue is that if the player stands motionless on a slope, gravity will cause them to slowly slide down it — because the movement from gravity will be projected onto the slope, and unlike flat ground, the result is no longer zero. For conscious actors only, I counter this by adding the opposite factor to the player’s velocity as part of adding in their walking speed. This matches how the real world works, to some extent: when you’re standing on a hill, you’re exerting some small amount of effort just to stay in place.

(Note that slope resistance is not the same as friction. Okay, yes, in the real world, virtually all resistance to movement happens as a result of friction, but bracing yourself against the ground isn’t the same as being passively resisted.)

From here there are a lot of things you can do, depending on how you think slopes should be handled. You could make the player unable to walk up slopes that are too steep. You could make walking down a slope faster than walking up it. You could make jumping go along the ground normal, rather than straight up. You could raise the player’s max allowed speed while running downhill. Whatever you want, really. Armed with a normal and awareness of dot products, you can do whatever you want.

But first you might want to fix a few aggravating side effects.

Problem 3: Ground adherence

I don’t know if there’s a better name for this. I rarely even see anyone talk about it, which surprises me; it seems like it should be a very common problem.

The problem is: if the player runs up a slope which then abruptly changes to flat ground, their momentum will carry them into the air. For very fast players going off the top of very steep slopes, this makes sense, but it becomes visible even for relatively gentle slopes. It was a mild nightmare in the original release of our game Lunar Depot 38, which has very “rough” ground made up of lots of shallow slopes — so the player was very frequently slightly off the ground, which meant they couldn’t jump, for seemingly no reason. (I even had code to fix this, but I disabled it because of a silly visual side effect that I never got around to fixing.)

Anyway! The reason this is a problem is that game protagonists are generally not boxes sliding around — they have legs. We don’t go flying off the top of real-world hilltops because we put our foot down until it touches the ground.

Simulating this footfall is surprisingly fiddly to get right, especially with someone else’s physics engine. It’s made somewhat easier by Cast, which casts the entire hitbox — no matter what shape it is — in a particular direction, as if it had moved, and tells you all the hypothetical collisions in order.

So I cast the player in the direction of gravity by some distance. If the cast hits something solid with a ground-like collision normal, then the player must be close to the ground, and I move them down to touch it (and set that ground as the new ground normal).

There are some wrinkles.

Wrinkle 1: I only want to do this if the player is off the ground now, but was on the ground last frame, and is not deliberately moving upwards. That latter condition means I want to skip this logic if the player jumps, for example, but also if the player is thrust upwards by a spring or abducted by a UFO or whatever. As long as external code goes through some interface and doesn’t mess with the player’s velocity directly, that shouldn’t be too hard to track.

Wrinkle 2: When does this logic run? It needs to happen after the player moves, which means after a Unity physics pass… but there’s no callback for that point in time. I ended up running it at the beginning of FixedUpdate and the beginning of Update — since I definitely want to do it before rendering happens! That means it’ll sometimes happen twice between physics updates. (I could carefully juggle a flag to skip the second run, but I… didn’t do that. Yet?)

Wrinkle 3: I can’t move the player with MovePosition! Remember, MovePosition schedules a movement, it doesn’t actually perform one; that means if it’s called twice before the physics pass, the first call is effectively ignored. I can’t easily combine the drop with the player’s regular movement, for various fiddly reasons. I ended up doing it “by hand” using transform.Translate, which I think was the “old way” to do manual movement before MovePosition existed. I’m not totally sure if it activates triggers? For that matter, I’m not sure it even notices collisions — but since I did a full-body Cast, there shouldn’t be any anyway.

Wrinkle 4: What, exactly, is “some distance”? I’ve yet to find a satisfying answer for this. It seems like it ought to be based on the player’s current speed and the slope of the ground they’re moving along, but every time I’ve done that math, I’ve gotten totally ludicrous answers that sometimes exceed the size of a tile. But maybe that’s not wrong? Play around, I guess, and think about when the effect should “break” and the player should go flying off the top of a hill.

Wrinkle 5: It’s possible that the player will launch off a slope, hit something, and then be adhered to the ground where they wouldn’t have hit it. I don’t much like this edge case, but I don’t see a way around it either.
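Condensed into engine-agnostic pseudologic, the whole check looks something like the sketch below. Here cast_down is a hypothetical stand-in for the full-hitbox Cast, and the snap distance is purely illustrative, per wrinkle 4:

```python
# Engine-agnostic sketch of the ground-adherence check; cast_down is a
# hypothetical stand-in for a full-hitbox cast along gravity.
SNAP_DISTANCE = 0.25   # "some distance" from wrinkle 4 -- tune by feel

def ground_snap_distance(on_ground, was_on_ground, moving_up, cast_down):
    # Wrinkle 1: only snap if we just left the ground and aren't moving up.
    if on_ground or not was_on_ground or moving_up:
        return None
    # cast_down returns the distance to a ground-like hit, or None.
    return cast_down(SNAP_DISTANCE)

# Ran off a slope onto flat ground 0.1 units below: snap down by 0.1.
print(ground_snap_distance(False, True, False, lambda limit: 0.1))   # 0.1
# Deliberately jumping: no snap, even with ground nearby.
print(ground_snap_distance(False, True, True, lambda limit: 0.05))   # None
```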

This problem is surprisingly awkward for how simple it sounds, and the solution isn’t entirely satisfying. Oh, well; the results are much nicer than the solution. As an added bonus, this also fixes occasional problems with running down a hill and becoming detached from the ground due to precision issues or whathaveyou.

Problem 4: One-way platforms

Ah, what a nightmare.

It took me ages just to figure out how to define one-way platforms. Only block when the player is moving downwards? Nope. Only block when the player is above the platform? Nuh-uh.

Well, okay, yes, those approaches might work for convex players and flat platforms. But what about… sloped, one-way platforms? There’s no reason you shouldn’t be able to have those. If Super Mario World can do it, surely Unity can do it almost 30 years later.

The trick is, again, to look at the collision normal. If it faces away from gravity, the player is hitting a ground-like surface, so the platform should block them. Otherwise (or if the player overlaps the platform), it shouldn’t.

Here’s the catch: Unity doesn’t have conditional collision. I can’t decide, on the fly, whether a collision should block or not. In fact, I think that by the time I get a callback like OnCollisionEnter2D, the physics pass is already over.

I could go the other way and use triggers (which are non-blocking), but then I have the opposite problem: I can’t stop the player on the fly. I could move them back to where they hit the trigger, but I envision all kinds of problems as a result. What if they were moving fast enough to activate something on the other side of the platform? What if something else moved to where I’m trying to shove them back to in the meantime? How does this interact with ground detection and listing contacts, which would rightly ignore a trigger as non-blocking?

I beat my head against this for a while, but the inability to respond to collision conditionally was a huge roadblock. It’s all the more infuriating a problem, because Unity ships with a one-way platform modifier thing. Unfortunately, it seems to have been implemented by someone who has never played a platformer. It’s literally one-way — the player is only allowed to move straight upwards through it, not in from the sides. It also tries to block the player if they’re moving downwards while inside the platform, which invokes clumsy rejection behavior. And this all seems to be built into the physics engine itself somehow, so I can’t simply copy whatever they did.

Eventually, I settled on the following. After calculating attempted movement (including sliding), just at the end of FixedUpdate, I do a Cast along the movement vector. I’m not thrilled about having to duplicate the physics engine’s own work, but I do filter to only things on a “one-way platform” physics layer, which should at least help. For each object the cast hits, I use Physics2D.IgnoreCollision to either ignore or un-ignore the collision between the player and the platform, depending on whether the collision was ground-like or not.

(A lot of people suggested turning off collision between layers, but that can’t possibly work — the player might be standing on one platform while inside another, and anyway, this should work for all actors!)

Again, wrinkles! But fewer this time. Actually, maybe just one: handling the case where the player already overlaps the platform. I can’t just check for that with e.g. OverlapCollider, because that doesn’t distinguish between overlapping and merely touching.

I came up with a fairly simple fix: if I was going to un-ignore the collision (i.e. make the platform block), and the cast distance is reported as zero (either already touching or overlapping), I simply do nothing instead. If I’m standing on the platform, I must have already set it blocking when I was approaching it from the top anyway; if I’m overlapping it, I must have already set it non-blocking to get here in the first place.
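Put together, the per-contact decision reduces to a few lines. A plain-Python sketch of the rule described above, where a negative normal·gravity dot product means a ground-like hit:

```python
def platform_should_block(normal_dot_gravity, cast_distance, currently_blocking):
    ground_like = normal_dot_gravity < 0      # hit faces away from gravity
    if ground_like and cast_distance == 0.0:
        return currently_blocking             # touching/overlapping: do nothing
    return ground_like

print(platform_should_block(-1.0, 0.5, False))  # approaching from above: block
print(platform_should_block(+1.0, 0.5, True))   # coming from below: pass through
print(platform_should_block(-1.0, 0.0, False))  # overlapping: stay passable
```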

I can imagine a few cases where this might go wrong. Moving platforms, especially, are going to cause some interesting issues. But this is the best I can do with what I know, and it seems to work well enough so far.

Oh, and our player can deliberately drop down through platforms, which was easy enough to implement; I just decide the platform is always passable while some button is held down.

Problem 5: Pushers and carriers

I haven’t gotten to this yet! Oh boy, can’t wait. I implemented it in LÖVE, but my way was hilariously invasive; I’m hoping that having a physics engine that supports a handwaved “this pushes that” will help. Of course, you also have to worry about sticking to platforms, for which the recommended solution is apparently to parent the cargo to the platform, which sounds goofy to me? I guess I’ll find out when I throw myself at it later.

Overall result

I ended up with a fairly pleasant-feeling system that supports slopes and one-way platforms and whatnot, with all the same pieces as I came up with for LÖVE. The code somehow ended up as less of a mess, too, but it probably helps that I’ve been down this rabbit hole once before and kinda knew what I was aiming for this time.

Animation of a character running smoothly along the top of an irregular dinosaur skeleton

Sorry that I don’t have a big block of code for you to copy-paste into your project. I don’t think there are nearly enough narrative discussions of these fundamentals, though, so hopefully this is useful to someone. If not, well, look forward to ✨ my book, that I am writing ✨!

The Pi Hut’s 3D Xmas Tree pre-order

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/pi-hut-3d-xmas-tree/

We appreciate it’s only October, but hear us out. The Pi Hut’s 3D Xmas Tree is only available for pre-order until the 15th, and we’d hate for you to find out about it too late. So please share in a few minutes of premature Christmas cheer as we introduce you to this gorgeous kit.

The Pi Hut's 3D Xmas Tree for Raspberry Pi

Oooo…aaaaahhhh…

Super early Christmas prep

Designed by Pi Towers alumna Rachel Rayns, the 3D Xmas Tree kit is a 25-LED add-on board for the Raspberry Pi, on sale as a pre-soldered and as a ‘solder yourself’ version. You can control each LED independently via the GPIO pins, allowing you to create some wonderful, twinkly displays this coming holiday season.
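As a taste of what the control code might look like, here is a hedged gpiozero sketch. The pin numbers are made up, so check The Pi Hut’s instructions for the kit’s real GPIO mapping:

```python
# Random twinkle sketch; the pins below are hypothetical placeholders,
# not the kit's actual wiring.
from random import choice
from time import sleep

from gpiozero import LED

pins = [2, 3, 4, 17, 27, 22]        # placeholder subset of the 25 LEDs
leds = [LED(pin) for pin in pins]

while True:
    choice(leds).toggle()           # flip one random LED on or off
    sleep(0.1)
```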

The Pi Hut's 3D Xmas Tree for Raspberry Pi

The tree works with any 40-pin Raspberry Pi, including the Zero and Zero W.

You may remember the kit from last Christmas, when The Pi Hut teasingly hinted at its existence. We’ve been itching to get our hands on one for months now, and last week we finally received our own to build and play with.

3D Xmas Tree

So I took the time to record my entire build process for you…only to discover that I had managed to do most of the soldering out of frame. I blame Ben Nuttall for this, as we all rightly should, and offer instead this short GIF of me proudly showing off my finished piece.

The Pi Hut’s website has complete soldering instructions for the tree, as well as example code to get you started. Thus, even the most novice of Raspberry Pi enthusiasts and digital makers should be able to put this kit together and get it twinkling for Christmas.

If you don’t own helping hands for soldering, you’re missing out on, well, a helping hand when soldering.

If you need any help with soldering, check out our video resource. And once you’ve mastered this skill, how about upgrading your tree to twinkle in time with your favourite Christmas song? Or getting two or three, and having them flash in a beautiful synchronised multi-tree display?

Get your own 3D Xmas Tree

As mentioned above, you can pre-order the kit until Sunday 15 October. Once this deadline passes, that’s it — the boat will have sailed and you’ll be left stranded at the dock, waving goodbye to the missed opportunity.

The Pi Hut's 3D Xmas Tree for Raspberry Pi

Don’t be this kid.

With 2730 trees already ordered, you know this kit is going to be in the Christmas stocking of many a maker on 25 December.

And another thing

Shhh…while you’re there, The Pi Hut still has a few Google AIY Projects voice kits available for pre-order…but you didn’t hear that from me. Quick!

The post The Pi Hut’s 3D Xmas Tree pre-order appeared first on Raspberry Pi.

PureVPN Logs Helped FBI Net Alleged Cyberstalker

Post Syndicated from Andy original https://torrentfreak.com/purevpn-logs-helped-fbi-net-alleged-cyberstalker-171009/

Last Thursday, Ryan S. Lin, 24, of Newton, Massachusetts, was arrested on suspicion of conducting “an extensive cyberstalking campaign” against his former roommate, a 24-year-old Massachusetts woman, as well as her family members and friends.

According to the Department of Justice, Lin’s “multi-faceted campaign of computer hacking and cyberstalking” began in April 2016 when he began hacking into the victim’s online accounts, obtaining personal photographs, sensitive information about her medical and sexual histories, and other private details.

It’s alleged that after obtaining the above material, Lin distributed it to hundreds of others. It’s claimed he created fake online profiles showing the victim’s home address while soliciting sexual activity. This caused men to show up at her home.

“Mr. Lin allegedly carried out a relentless cyber stalking campaign against a young woman in a chilling effort to violate her privacy and threaten those around her,” said Acting United States Attorney William D. Weinreb.

“While using anonymizing services and other online tools to avoid attribution, Mr. Lin harassed the victim, her family, friends, co-workers and roommates, and then targeted local schools and institutions in her community. Mr. Lin will now face the consequences of his crimes.”

While Lin awaits his ultimate fate (he appeared in U.S. District Court in Boston Friday), the fact that he allegedly used anonymization tools to hide himself online yet still managed to get caught raises a number of questions. An affidavit submitted by Special Agent Jeffrey Williams in support of the criminal complaint against Lin provides most of the answers.

Describing Lin’s actions against the victim as “doxing”, Williams begins by noting that while Lin was the initial aggressor, the fact he made the information so widely available raises the possibility that other people got involved with malicious acts later on. Nevertheless, Lin remains the investigation’s prime suspect.

According to the affidavit, Lin is computer savvy, having majored in computer science. He allegedly utilized a number of methods to hide his identity and IP address, including Tor, Virtual Private Network (VPN) services, and email providers that “do not maintain logs or other records.”

But if that genuinely is the case, how was Lin caught?

First up, it’s worth noting that plenty of Lin’s aggressive and stalking behaviors towards the victim were demonstrated in a physical sense, offline. In that respect, it appears the authorities already had him as the prime suspect and worked back from there.

In one instance, the FBI examined a computer that had been used by Lin at a former workplace. Although Windows had been reinstalled, the FBI managed to find Google Chrome data which indicated Lin had viewed articles about bomb threats he allegedly made. They were also able to determine he’d accessed the victim’s Gmail account and additional data suggested that he’d used a VPN service.

“Artifacts indicated that PureVPN, a VPN service that was used repeatedly in the cyberstalking scheme, was installed on the computer,” the affidavit reads.

From here the Special Agent’s report reveals that the FBI received cooperation from Hong Kong-based PureVPN.

“Significantly, PureVPN was able to determine that their service was accessed by the same customer from two originating IP addresses: the RCN IP address from the home Lin was living in at the time, and the software company where Lin was employed at the time,” the agent’s affidavit reads.

Needless to say, while this information will prove useful to the FBI’s prosecution of Lin, it’s also likely to turn into a huge headache for the VPN provider. The company claims zero-logging, which clearly isn’t the case.

“PureVPN operates a self-managed VPN network that currently stands at 750+ Servers in 141 Countries. But is this enough to ensure complete security?” the company’s marketing statement reads.

“That’s why PureVPN has launched advanced features to add proactive, preventive and complete security. There are no third-parties involved and NO logs of your activities.”

PureVPN privacy graphic

However, if one drills down into the PureVPN privacy policy proper, one sees the following:

Our servers automatically record the time at which you connect to any of our servers. From here on forward, we do not keep any records of anything that could associate any specific activity to a specific user. The time when a successful connection is made with our servers is counted as a ‘connection’ and the total bandwidth used during this connection is called ‘bandwidth’. Connection and bandwidth are kept in record to maintain the quality of our service. This helps us understand the flow of traffic to specific servers so we could optimize them better.

This seems to match what the FBI says – almost. While it says it doesn’t log, PureVPN admits to keeping records of when a user connects to the service and for how long. The FBI clearly states that the service captures the user’s IP address as well. In fact, it appears that PureVPN also logged the IP address belonging to another VPN service (WANSecurity) that Lin allegedly used to connect to PureVPN.

That record also helped to complete another circle of evidence. IP addresses used by Kansas-based WANSecurity and Secure Internet LLC (servers operated by PureVPN) were allegedly used to access Gmail accounts known to be under Lin’s control.

Somewhat ironically, this summer Lin took to Twitter to criticize VPN provider IPVanish (which is not involved in the case) over its no-logging claims.

“There is no such thing as a VPN that doesn’t keep logs,” Lin said. “If they can limit your connections or track bandwidth usage, they keep logs.”

Or, in the case of PureVPN, if they log a connection time and a source IP address, that could be enough to raise the suspicions of the FBI and boost what already appears to be a pretty strong case.

If convicted, Lin faces up to five years in prison and three years of supervised release.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Spooktacular Halloween Haunted Portrait

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/spooktacular-halloween-haunted-portrait/

October has come at last, and with it, the joy of Halloween is now upon us. So while I spend the next 30 days quoting Hocus Pocus at every opportunity, here’s Adafruit’s latest spooky build … the spooktacular Haunted Portrait.

Adafruit Raspberry Pi Haunted Portrait

Haunted Portraits

If you’ve visited a haunted house such as Disney’s Haunted Mansion, or walked the halls of Hogwarts at Universal Studios, you will have seen a ‘moving portrait’. Whether it’s the classic ‘did that painting just blink?’ approach, or occupants moving in and out of frame, they’re an effective piece of spooky decoration – and now you can make your own!

Adafruit’s AdaBox

John Park, maker extraordinaire, recently posted a live make video where he used the contents of the Raspberry Pi-themed AdaBox 005 to create a blinking portrait.

AdaBox 005 Raspberry Pi Haunted Portrait

The AdaBox is Adafruit’s own maker subscription service, through which plucky makers receive a mystery parcel containing exciting tech and inspirational builds. The most recent delivery, AdaBox 005, contains a Raspberry Pi Zero, Adafruit’s own Joy Bonnet, a case, and peripherals, including Pimoroni’s no-solder Hammer Headers.

AdaBox 005 Raspberry Pi Haunted Portrait

While you can purchase the AdaBoxes as one-off buys, subscribers get extra goodies. With AdaBox 005, they received bonus content including Raspberry Pi swag in the form of stickers, and a copy of The MagPi Magazine.

AdaBox 005 Raspberry Pi Haunted Portrait

The contents of AdaBox 005 allow makers to build their own tiny Raspberry Pi Zero gaming machine. But the ever-working minds of the Adafruit team didn’t want to stop there, so they decided to create more tutorials based on the box’s contents, such as John Park’s Haunted Portrait.

Bringing a portrait to life

Alongside the AdaBox 005 content, all of which can be purchased from Adafruit directly, you’ll need a flat-screen monitor and a fancy frame. The former could be an old TV or computer screen while the latter, unless you happen to have an ornate frame that perfectly fits your monitor, can be made from cardboard, CNC-cut wood or gold-painted macaroni and tape … probably.

Adafruit Raspberry Pi Haunted Portrait

You’ll need to attach headers to your Raspberry Pi Zero. For those of you who fear the soldering iron, the Hammer Headers can be hammered into place without the need for melty hot metal. If you’d like to give soldering a go, you can follow Laura’s Getting Started With Soldering tutorial video.

Adafruit Raspberry Pi Haunted Portrait Hammer Header

In his tutorial, John goes on to explain how to set up the Joy Bonnet (if you wish to use it as an added controller), set your Raspberry Pi to display in portrait mode, and manipulate an image in Photoshop or GIMP to create the blinking effect.

Adafruit Raspberry Pi Haunted Portrait

Blinking eyes are just the start of the possibilities for this project. This is your moment to show off your image manipulation skills! Why not have the entire head flash to show the skull within? Or have an ethereal image appear in the background of an otherwise unexceptional painting of a bowl of fruit?

In the final stages of the tutorial, John explains how to set an image slideshow running on the Pi, and how to complete the look with the aforementioned ornate frame. He also goes into detail about the importance of using a matte effect screen or transparent gels to give a more realistic ‘painted’ feel.
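
If you want a taste of the core trick before diving into the full tutorial, here’s a bare-bones sketch of the blink effect in pygame. The two image files are stand-ins for frames you’d export from GIMP or Photoshop:

```python
import random
import time
import pygame

pygame.init()
# (0, 0) with FULLSCREEN uses the current screen resolution
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)

# Two frames exported from your image editor; filenames are invented here.
eyes_open = pygame.image.load("portrait_open.png").convert()
eyes_closed = pygame.image.load("portrait_closed.png").convert()

while True:
    pygame.event.pump()                 # keep the window responsive
    screen.blit(eyes_open, (0, 0))
    pygame.display.flip()
    time.sleep(random.uniform(4, 10))   # stare unnervingly for a while...
    screen.blit(eyes_closed, (0, 0))
    pygame.display.flip()
    time.sleep(0.15)                    # ...then blink, just once
```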

You’ll find everything you need to make your own haunted portrait here, including a link to John’s entire live stream.

Get spooky!

We’re going to make this for Pi Towers. In fact, I’m wondering whether I could create an entire gallery of portraits specifically for our reception area and see how long it takes people to notice …

… though I possibly shouldn’t have given my idea away on this rather public blog post.

If you make the Haunted Portrait, or any other Halloween-themed Pi build, make sure you share it with us via social media, or in the comments below.

The post Spooktacular Halloween Haunted Portrait appeared first on Raspberry Pi.

Adafruit’s read-only Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/adafruits-read-only/

For passive projects such as point-of-sale displays, video loopers, and your upcoming Halloween builds, Adafruit have come up with a read-only solution for powering down your Raspberry Pi without endangering your SD card.

Adafruit read-only raspberry pi

Pulling the plug

At home, at a coding club, or at a Jam, you rarely need to pull the plug on your Raspberry Pi without going through the correct shutdown procedure. To ensure a long life for your SD card and its contents, you should always turn off your Pi by selecting the shutdown option from the menu. This way, the Pi saves any temporary files to the card before relinquishing power.

Dramatic reconstruction

By pulling the plug while your OS is still running, you might corrupt these files, which could result in the Pi failing to boot up again. The only fix? Wipe the SD card clean and start over, waving goodbye to all files you didn’t back up.

Passive projects

But what if it’s not as easy as selecting shutdown, because your Raspberry Pi is embedded deep inside the belly of a project? Maybe you’ve hot-glued your Zero W into a pumpkin which is now screwed to the roof of your porch, or your store has a bank of Pi-powered monitors playing ads and the power is set to shut off every evening. Without the ability to shut down your Pi via the menu, you risk the SD card’s contents every time you power down your project.

Read-only

Just in time for the plethora of Halloween projects we’re looking forward to this month, the clever folk at Adafruit have designed a solution for this issue. They’ve shared a script which forces the Raspberry Pi to run in read-only mode, so that powering it down by pulling the plug will not corrupt the SD card.

But how?

The script makes the Pi save temporary files to RAM instead of to the SD card. Of course, this means that no files or new software can be written to the card. However, if that’s not necessary for your Pi project, you might be happy to make the trade-off. Note that you can only use Adafruit’s script on Raspbian Lite.
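
For the curious: the standard technique behind read-only setups like this is to mount the locations that see frequent writes, such as `/tmp` and `/var/log`, as RAM-backed tmpfs filesystems (an `/etc/fstab` entry along the lines of `tmpfs /var/log tmpfs defaults,noatime,size=32m 0 0`), while the root filesystem itself is mounted read-only. Adafruit’s script takes care of the details for you; the entry above is a generic illustration, not a line taken from their script.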

Find more about the read-only Raspberry Pi solution, including the script and optional GPIO-halt utility, on the Adafruit Learn page. And be aware that making your Pi read-only is irreversible, so be sure to back up the contents of your SD card before you implement the script.

Halloween!

It’s October, and we’re now allowed to get excited about Halloween and all of the wonderful projects you plan on making for the big night.

Adafruit read-only raspberry pi

Adafruit’s animated snake eyes

We’ll be covering some of our favourite spooky builds on social media throughout the month — make sure to share yours with us, either in the comments below or on Facebook, Twitter, Instagram, or G+.

The post Adafruit’s read-only Raspberry Pi appeared first on Raspberry Pi.

PlayerUnknown’s Battlegrounds on a Game Boy?!

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/playerunknowns-battlegrounds-game-boy/

My evenings spent watching the Polygon Awful Squad play PlayerUnknown’s Battlegrounds for hours on end have made me mildly obsessed with this record-breaking Steam game.

PlayerUnknown's Battlegrounds Raspberry Pi

So when Michael Darby’s latest PUBG-inspired Game Boy build appeared in my notifications last week, I squealed with excitement and quickly sent the link to my team…while drinking a cocktail by a pool in Turkey ☀️🍹

PUBG ON A GAMEBOY

https://314reactor.com/ https://www.hackster.io/314reactor https://twitter.com/the_mikey_d

PlayerUnknown’s Battlegrounds

For those unfamiliar with the game: PlayerUnknown’s Battlegrounds, or PUBG for short, is a Battle-Royale-style multiplayer online video game in which individuals or teams fight to the death on an island map. As players collect weapons, ammo, and transport, their ‘safe zone’ shrinks, forcing a final face-off until only one character remains.

The game has been an astounding success on Steam, the digital distribution platform which brings PUBG to the masses. It records daily player counts of over a million!

PlayerUnknown's Battlegrounds Raspberry Pi

Yeah, I’d say one or two people seem to enjoy it!

PUBG on a Game Boy?!

As it’s a fairly complex game, let’s get this out of the way right now: no, Michael is not running the entire game on a Nintendo Game Boy. That would be impossible. Instead, he’s streaming the game from his home PC to a Raspberry Pi Zero W fitted within the hacked handheld console.

Michael removed the excess plastic inside an old Game Boy Color shell to make space for a Zero W, LiPo battery, and TFT screen. He then soldered the necessary buttons to GPIO pins, and wrote a Python script to control them.
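
Michael’s full script is linked below, but to illustrate the buttons-to-software step, here’s a hedged gpiozero sketch. The pin numbers are placeholders; his tutorial has the real mapping:

```python
from signal import pause
from gpiozero import Button

# Placeholder BCM pins; Michael's tutorial has the real button mapping.
buttons = {
    "a": Button(5),
    "b": Button(6),
    "up": Button(13),
    "down": Button(19),
}

def press_handler(name):
    def handler():
        # A real build would inject a key event here so the streaming
        # client picks it up as input; printing stands in for that.
        print("{} pressed".format(name))
    return handler

for name, button in buttons.items():
    button.when_pressed = press_handler(name)

pause()   # wait for button events forever
```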

PlayerUnknown's Battlegrounds Raspberry Pi

The maker battleground

The full script can be found here, along with a more detailed tutorial for the build.

In order to stream PUBG to the Zero W, Michael uses Moonlight, an open-source client for NVIDIA’s game-streaming protocol. He set his PC’s screen resolution to 800×600 and its frame rate to 30, so that streaming the game to the TFT screen works perfectly, albeit with no sound.

PlayerUnknown's Battlegrounds Raspberry Pi

The end result is a rather impressive build that has confused YouTube commenters since he uploaded footage of it last week. The video has more than 60,000 views to date, so it appears we’re not the only ones impressed with Michael’s make.

314reactor

If you’re a regular reader of our blog, you may recognise Michael’s name from his recent Nerf blaster mod. And fans of Raspberry Pi may also have seen his Pi-powered Windows 98 wristwatch earlier in the year. He blogs at 314reactor, where you can read more about his digital making projects.

Windows 98 Wrist watch Raspberry Pi PlayerUnknown's Battlegrounds

Player Two has entered the game

Now it’s your turn. Have you used a Raspberry Pi to create a gaming system? I’m not just talking arcades and RetroPie here. We want to see everything, from Pi-powered board games to tech on the football field.

Share your builds in the comments below and while you’re at it, what game would you like to stream to a handheld device?

The post PlayerUnknown’s Battlegrounds on a Game Boy?! appeared first on Raspberry Pi.

Weekly roundup: Apocalypse

Post Syndicated from Eevee original https://eev.ee/dev/2017/10/02/weekly-roundup-apocalypse/

Uh, hey. What’s up. Been a while. My computer died? Linux abruptly put the primary hard drive in read-only mode, which seemed Really Bad, but then it refused to boot up entirely. I suspect the motherboard was on its last legs (though the drive itself was getting pretty worn out too), so long story short, I lost a week to ordering/building an entirely new machine and rearranging/zeroing hard drives. The old one was six years old, so it was about time anyway.

I also had some… internet stuff… to deal with, so overall I’ve had a rollercoaster of a week. Oh, and now my keyboard is finally starting to break.

  • fox flux: I’m at the point where the protagonists are almost all done and I’ve started touching up particular poses (times ten). So that’s cool. If I hadn’t lost the last week I might’ve been done with it by now!

  • devops: Well, there was that whole computer thing. Also I suddenly have support for colored fonts (read: emoji) in all GTK apps (except Chromium), and that led me to spend at least half a day trying to find a way to get Twemoji into a font using Google’s font extensions. Alas, no dice, so I’m currently stuck with a fairly outdated copy of the Android emoji, which I don’t want to upgrade because Google makes them worse with every revision.

  • blog: I started on a post. I didn’t get very far. I still owe two for September. Oops.

  • book: Did some editing, worked on some illustrations. I figured out how to get math sections to (mostly) use the same font as body text, so inline math doesn’t look quite so comically out of place any more.

  • cc: Fixed some stuff I broke, as usual, and worked some more on a Unity GUI for defining and editing sprite animations.

I’m now way behind and have completely lost all my trains of thought, though I guess having my computer break is a pretty good excuse. Trying to get back up to speed as quickly as possible.

Oh, and happy October. 🎃

MagPi 62: become a LEGO master builder

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/magpi-62-lego-raspberry-pi/

Hi folks, Rob here from The MagPi. I’m excited to introduce to you all issue 62 of The MagPi, in which we go block crazy with LEGO! This month’s magazine is brimming with 14 pages of magnificent Raspberry Pi projects using these ubiquitous building blocks.

LEGO of everything and get one from the shops right now!

LEGO + Raspberry Pi

In our cover feature you’ll find fun tutorials from our friends at Dexter Industries, such as a Rubik’s cube-solving robot and a special automaton that balances on two wheels. We also show you how to build a retro console case for your Pi out of LEGO, and we have eight other projects to inspire you to make your own incredible brick creations.

Weekend fun

Back at school and looking for a weekend distraction? Check out our weekend projects feature, and build yourself a smart fridge or a door trigger that plays your theme song as you enter the room! Mine is You’re Welcome from Moana. What’s yours?

We have a ton of other wonderful projects, tutorials, and reviews in this issue as well, including a GIF camera, a hydroponic garden, and a Halloween game!

MagPi 62 Halloween game article

You can’t escape our annual spooktacular puns. That would be impossi-ghoul.

Get The MagPi 62

Grab the latest issue of The MagPi from WH Smith, Tesco, Sainsbury’s, and Asda. If you live in the US, check out your local Barnes & Noble or Micro Center over the next few days. You can also get the new issue online from our store, or digitally via our Android or iOS app. And don’t forget, there’s always the free PDF as well.

Subscribe for free goodies

Some of you have asked me about the goodies that we give out to subscribers. This is how it works: if you take out a twelve-month print subscription to The MagPi, you’ll get a Pi Zero W, Pi Zero case, and adapter cables absolutely free! This offer does not currently have an end date.

Pre-order AIY Projects kits

We have news about the AIY Projects voice kit! Micro Center has opened pre-orders for the kits in the US, and Pi Hut will soon be accepting pre-orders in the UK. Pimoroni has set up a notification service in case you want to know when you can pre-order more stock from them.

Now go enjoy building some fun LEGO Pi projects, and we’ll see you next month!

The post MagPi 62: become a LEGO master builder appeared first on Raspberry Pi.

The possibilities of the Sense HAT

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/sense-hat-projects/

Did you realise the Sense HAT has been available for over two years now? Used by astronauts on the International Space Station, the exact same hardware is available to you on Earth. With a new Astro Pi challenge just launched, it’s time for a retrospective/roundup/inspiration post about this marvellous bit of kit.

Sense HAT attached to Pi and power cord

The Sense HAT on a Pi in full glory

The Sense HAT explained

We developed our scientific add-on board to be part of the Astro Pi computers we sent to the International Space Station with ESA astronaut Tim Peake. For a play-by-play of Astro Pi’s history, head to the blog archive.

Astro Pi logo with starry background

Just to remind you, this is all the cool stuff our engineers have managed to fit onto the HAT:

  • A gyroscope (sensing pitch, roll, and yaw)
  • An accelerometer
  • A magnetometer
  • Sensors for temperature, humidity, and barometric pressure
  • A joystick
  • An 8×8 LED matrix

You can find a roundup of the technical specs here on the blog.

How to Sense HAT

It’s easy to begin exploring this device: take a look at our free Getting started with the Sense HAT resource, or use one of our Code Club Sense HAT projects. You can also try out the emulator, available offline on Raspbian and online on Trinket.

Sense HAT emulator on Trinket

The Sense HAT emulator on trinket.io

Fun and games with the Sense HAT

Use the LED matrix and joystick to recreate games such as Pong or Flappy Bird. Of course, you could also add sensor input to your game: code an egg drop game or a Magic 8 Ball that reacts to how the device moves.
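
To give you a head start, here’s a tiny sketch of the joystick-plus-matrix loop at the heart of games like these, steering a single pixel around the 8×8 grid:

```python
from sense_hat import SenseHat

sense = SenseHat()
x, y = 4, 4
sense.set_pixel(x, y, 255, 255, 255)

while True:
    for event in sense.stick.get_events():
        if event.action != "pressed":
            continue
        sense.set_pixel(x, y, 0, 0, 0)        # erase the old position
        if event.direction == "up":
            y = max(0, y - 1)
        elif event.direction == "down":
            y = min(7, y + 1)
        elif event.direction == "left":
            x = max(0, x - 1)
        elif event.direction == "right":
            x = min(7, x + 1)
        sense.set_pixel(x, y, 255, 255, 255)  # draw the new position
```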

Sense HAT Random Sparkles

Create random sparkles on the Sense HAT

Once December rolls around, you could brighten up your home with a voice-controlled Christmas tree or an advent calendar on your Sense HAT.

If you like the great outdoors, you could also use your Sense HAT to recreate this Hiking Companion by Marcus Johnson. Take it with you on your next hike!

Art with the Sense HAT

The LED matrix is perfect for getting creative. To draw something basic without having to squint at a Python list, use this app by our very own Richard Hayler. Feeling more ambitious? The MagPi will teach you how to create magnificent pixel art. Ben Nuttall has created this neat little Python script for displaying a photo taken by the Raspberry Pi Camera Module on the Sense HAT.

Brett Haines Mathematica on the Sense HAT

It’s also possible to incorporate Sense HAT data into your digital art! The Python Turtle module and the Processing language are both useful tools for creating beautiful animations based on real-world information.

A Sense HAT project that also uses this principle is Giorgio Sancristoforo’s Tableau, a ‘generative music album’. This device creates music according to the sensor data:

Tableau Generative Album

“There is no doubt that, as music is removed by the phonograph record from the realm of live production and from the imperative of artistic activity and becomes petrified, it absorbs into itself, in this process of petrification, the very life that would otherwise vanish.”

Science with the Sense HAT

This free Essentials book from The MagPi team covers all the Sense HAT science basics. You can, for example, learn how to measure gravity.

Cropped cover of Experiment with the Sense HAT book

Our online resource shows you how to record the information your HAT picks up. Next you can analyse and graph your data using Mathematica, which is included for free on Raspbian. This resource walks you through how this software works.
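
To give a flavour of the record-then-analyse workflow, here’s a minimal sketch that logs a few of the HAT’s sensors to a CSV file, ready to pull into Mathematica or any other graphing tool:

```python
import csv
import time
from sense_hat import SenseHat

sense = SenseHat()

with open("sensehat_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "temperature", "humidity", "pressure"])
    for _ in range(600):                 # ten minutes of data at 1 Hz
        writer.writerow([
            time.time(),
            sense.get_temperature(),
            sense.get_humidity(),
            sense.get_pressure(),
        ])
        time.sleep(1)
```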

If you’re seeking inspiration for experiments you can do on our Astro Pis Izzy and Ed on the ISS, check out the winning entries of previous rounds of the Astro Pi challenge.

Thomas Pesquet with Ed and Izzy

Thomas Pesquet with Ed and Izzy

But you can also stick to terrestrial scientific investigations. For example, why not build a weather station and share its data on your own web server or via Weather Underground?

Your code in space!

If you’re a student or an educator in one of the 22 ESA member states, you can get a team together to enter our 2017-18 Astro Pi challenge. There are two missions to choose from, including Mission Zero: follow a few guidelines, and your code is guaranteed to run in space!

The post The possibilities of the Sense HAT appeared first on Raspberry Pi.

Announcing the 2017-18 European Astro Pi challenge!

Post Syndicated from David Honess original https://www.raspberrypi.org/blog/announcing-2017-18-astro-pi/

Astro Pi is back! Today we’re excited to announce the 2017-18 European Astro Pi challenge in partnership with the European Space Agency (ESA). We are searching for the next generation of space scientists.

Astro Pi is an annual science and coding competition where student-written code is run on the International Space Station under the oversight of an ESA astronaut. The challenge is open to students from all 22 ESA member countries, including — for the first time — associate members Canada and Slovenia.

The format of the competition is changing slightly this year, and we also have a brand-new non-competitive mission in which participants are guaranteed to have their code run on the ISS for 30 seconds!

Mission Zero

Until now, students have worked on Astro Pi projects in an extra-curricular context and over multiple sessions. For teachers and students who don’t have much spare capacity, we wanted to provide an accessible activity that teams can complete in just one session.

So we came up with Mission Zero for young people no older than 14. To complete it, form a team of two to four people and use our step-by-step guide to help you write a simple Python program that shows your personal message and the ambient temperature on the Astro Pi. If you adhere to a few rules, your code is guaranteed to run in space for 30 seconds, and you’ll receive a certificate showing the exact time period during which your code has run in space. No special hardware is needed for this mission, since everything is done in a web browser.
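
To give you an idea of the scale of the task, a program in the spirit of Mission Zero only needs a few lines. Treat this as an illustration rather than a guaranteed-compliant entry; the official guide has the real rules:

```python
from sense_hat import SenseHat

sense = SenseHat()

# A personal message plus the ambient temperature, Mission Zero style.
temperature = round(sense.get_temperature(), 1)
sense.show_message("Hello, Space! {} C".format(temperature))
```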

Mission Zero is open until 26 November 2017! Find out more.

Mission Space Lab

Students aged up to 19 can take part in Mission Space Lab. Form a team of two to six people, and work like real space scientists to design your own experiment. Receive free kit to work with, and write the Python code to carry out your experiment.

There are two themes for Mission Space Lab teams to choose from for their projects:

  • Life in space
    You will make use of Astro Pi Vis (“Ed”) in the European Columbus module. You can use all of its sensors, but you cannot record images or videos.
  • Life on Earth
    You will make use of Astro Pi IR (“Izzy”), which will be aimed towards the Earth through a window. You can use all of its sensors and its camera.

The Astro Pi kit, delivered to Space Lab teams by ESA

If you achieve flight status, your code will be uploaded to the ISS and run for three hours (two orbits). All the data that your code records in space will be downloaded and returned to you for analysis. Then submit a short report on your findings to be in with a chance to win exclusive, money-can’t-buy prizes! You can also submit your project for a Bronze CREST Award.

Mission Space Lab registration is open until 29 October 2017, and accepted teams will continue working on their experiments into spring 2018. Find out more.

How do I get started?

There are loads of materials available that will help you begin your Astro Pi journey — check out the Getting started with the Sense HAT resource and this video explaining how to build the flight case.

Questions?

If you have any questions, please post them in the comments below. We’re standing by to answer them!

The post Announcing the 2017-18 European Astro Pi challenge! appeared first on Raspberry Pi.

FRED-209 Nerf gun tank

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/nerf-gun-tank-fred-209/

David Pride, known to many of you as an active member of our maker community, has done it again! His FRED-209 build combines a Nerf gun, 3D printing, a Raspberry Pi Zero, and robotics to make one neat remotely controlled Nerf tank.

FRED-209 – 3D printed Raspberry Pi Nerf Tank

Uploaded by David Pride on 2017-09-17.

A Nerf gun for FRED-209

David says he worked on FRED-209 over the summer in order to have some fun with Nerf guns, which weren’t around when he was a kid. He purchased an Elite Stryfe model at a car boot sale, and took it apart to see what made it tick. Then he set about figuring out how to power it with motors and a servo.

Nerf Elite Stryfe components for the FRED-209 Nerf tank of David Pride

To control the motors, David used a ZeroBorg add-on board for the Pi Zero, and he set up a PlayStation 3 controller to pilot his tank. These components were also part of a robot that David entered into the Pi Wars competition, so he had already written code for them.
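
As an aside, the motor-mixing logic in a build like this is pleasingly small. The sketch below assumes PiBorg’s ZeroBorg Python library, with method names written from memory of their examples, so double-check them against the real library before trusting it:

```python
import ZeroBorg  # PiBorg's library; API names here are from memory

ZB = ZeroBorg.ZeroBorg()
ZB.Init()

def drive(forward, turn):
    """Differential ('tank') mix: both arguments range from -1.0 to 1.0."""
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    ZB.SetMotor1(left)
    ZB.SetMotor2(right)

drive(0.5, 0.0)    # half speed ahead
drive(0.0, 0.5)    # spin on the spot
drive(0.0, 0.0)    # stop
```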

3D printing for FRED-209

During prototyping for his Nerf tank, which David named after ED-209 from RoboCop, he used lots of eBay loot and several 3D-printed parts. He used the free OpenSCAD software package to design the parts he wanted to print. If you’re a novice at 3D printing, you might find the printing advice he shares in the write-up on his blog very useful.

3D-printed lid of FRED-209 nerf gun tank by David Pride

David found the 3D printing of the 24cm-long lid of FRED-209 tricky

On eBay, David found some cool-looking chunky wheels, but these turned out to be too heavy for the motors. In the end, he decided to use a Rover 5 chassis, which changed the look of FRED-209 from ‘monster truck’ to ‘tank’.

FRED-209 Nerf tank by David Pride

Next step: teach it to use stairs

The final result looks awesome, and David’s video demonstrates that it shoots very accurately as well. A make like this might be a great defensive project for our new apocalypse-themed Pioneers challenge!

Taking FRED-209 further

David will be uploading code and STL files for FRED-209 soon, so keep an eye on his blog or Twitter for updates. He’s also bringing the Nerf tank to the Cotswold Raspberry Jam this weekend. If you’re attending the event, make sure you catch him and try FRED-209 out yourself.

Never one to rest on his laurels, David is already working on taking his build to the next level. He wants to include a web interface controller and a camera, and is working on implementing OpenCV to give the Nerf tank the ability to autonomously detect targets.

Pi Wars 2018

I have a feeling we might get to see an advanced version of David’s project at next year’s Pi Wars!

The 2018 Pi Wars have just been announced. They will take place on 21-22 April at the Cambridge Computer Laboratory, and you have until 3 October to apply to enter the competition. What are you waiting for? Get making! And as always, do share your robot builds with us via social media.

The post FRED-209 Nerf gun tank appeared first on Raspberry Pi.

Laser Cookies: a YouTube collaboration

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/laser-cookies/

Lasers! Cookies! Raspberry Pi! We’re buzzing with excitement about sharing our latest YouTube video with you, which comes directly from the kitchen of maker Estefannie Explains It All!

Laser-guarded cookies feat. Estefannie Explains It All

Uploaded by Raspberry Pi on 2017-09-18.

Estefannie Explains It All + Raspberry Pi

When Estefannie visited Pi Towers earlier this year, we introduced her to the Raspberry Pi Digital Curriculum and the free resources on our website. We’d already chatted to her via email about the idea of creating a collab video for the Raspberry Pi channel. Once she’d met members of the Raspberry Pi Foundation team and listened to them wax lyrical about the work we do here, she was even more keen to collaborate with us.

Estefannie on Twitter

Ahhhh!!! I still can’t believe I got to hang out and make stuff at the @Raspberry_Pi towers!! Thank you thank you!!

Estefannie returned to the US filled with inspiration for a video for our channel, and we’re so pleased with how awesome her final result is. The video is a super addition to our Raspberry Pi YouTube channel, it shows what our resources can help you achieve, and it’s great fun. You might also have noticed that the project fits in perfectly with this season’s Pioneers challenge. A win all around!

So yeah, we’re really chuffed about this video, and we hope you all like it too!

Estefannie’s Laser Cookies guide

For those of you wanting to try your hand at building your own Cookie Jar Laser Surveillance Security System, Estefannie has provided a complete guide to talk you through it. Here she goes:

First off, you’ll need:

  • 10 lasers
  • 10 photoresistors
  • 10 capacitors
  • 1 Raspberry Pi Zero W
  • 1 buzzer
  • 1 Raspberry Pi Camera Module
  • 12 ft PVC pipes + 4 corners
  • 1 acrylic panel
  • 1 battery pack
  • 8 zip ties
  • tons of cookies

I used the Raspberry Pi Foundation’s Laser trip wire and the Tweeting Babbage resources to get one laser working and to set up the camera and Twitter API. This took me less than an hour, and it was easy, breezy, beautiful, Raspberry Pi.


I soldered ten lasers in parallel and connected ten photoresistors to their own GPIO pins. I didn’t wire them up in series, both for sensitivity reasons and to make debugging easier.
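
Incidentally, a photoresistor-and-capacitor pair on a GPIO pin is exactly what gpiozero’s LightSensor class models, so each beam only needs a few lines of code to watch. The pin numbers here are placeholders rather than Estefannie’s actual wiring:

```python
from signal import pause
from gpiozero import LightSensor, Buzzer

# Placeholder BCM pins; one LightSensor per laser/photoresistor pair.
beams = [LightSensor(pin) for pin in (4, 17, 27, 22)]
buzzer = Buzzer(18)

def beam_broken():
    buzzer.beep(on_time=0.1, off_time=0.1, n=3)  # cookie thief detected!

for beam in beams:
    beam.when_dark = beam_broken   # a broken beam reads as sudden darkness

pause()
```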

Building the frame took a few tries: I actually started with a wood frame, then tried a clear case, and finally realized the best and cleanest solution would be pipes. All the wires go inside the pipes and come out through a small window at the top to wire up to the Zero W.



Using pipes also made the build cheaper, since they were about $3 for 12 ft. Wiring inside the pipes was tricky, and to finish the circuit, I soldered some of the wires after they were already in the pipes.

I tried gluing the lasers to the frame, but the lasers melted the glue and became decalibrated. Next I tried tape, and then I found picture mounting putty. The putty worked perfectly — it was easy to mold a putty base for the lasers and to calibrate and re-calibrate them if needed. Moreover, the lasers stayed in place no matter how hot they got.

Estefannie Explains It All Raspberry Pi Cookie Jar

Although the lasers were not very strong, I still strained my eyes after long hours of calibrating — hence the sunglasses! Working indoors with lasers, sunglasses, and code was weird. But now I can say I’ve done that…in my kitchen.

Using all the knowledge I have shared, this project should take a couple of hours. The code you need lives on my GitHub!

Estefannie Explains It All Raspberry Pi Cookie Jar

“The cookie recipe is my grandma’s, and I am not allowed to share it.”

Estefannie on YouTube

Estefannie made this video for us as a gift, and we’re so grateful for the time and effort she put into it! If you enjoyed it and would like to also show your gratitude, subscribe to her channel on YouTube and follow her on Instagram and Twitter. And if you make something similar, or build anything with our free resources, make sure to share it with us in the comments below or via our social media channels.

The post Laser Cookies: a YouTube collaboration appeared first on Raspberry Pi.