Tag Archives: physics

Colour sensing with a Raspberry Pi

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/colour-sensing-raspberry-pi/

In their latest video and tutorial, Electronics Hub shows you how to detect colour using a Raspberry Pi and a TCS3200 colour sensor.

Raspberry Pi Color Sensor (TCS3200) Interface | Color Detector

A simple Raspberry Pi based project using TCS3200 Color Sensor. The project demonstrates how to interface a Color Sensor (like TCS3200) with Raspberry Pi and implement a simple Color Detector using Raspberry Pi.

What is a TCS3200 colour sensor?

Colour sensors sense reflected light from nearby objects. The bright light of the TCS3200’s on-board white LEDs hits an object’s surface and is reflected back. The sensor has an 8×8 array of photodiodes, which are covered by either a red, blue, green, or clear filter. The type of filter determines what colour a diode can detect. Then the overall colour of an object is determined by how much light of each colour it reflects. (For example, a red object reflects mostly red light.)

Colour sensing with the TCS3200 Color Sensor and a Raspberry Pi

As Electronics Hub explains:

TCS3200 is one of the easily available colour sensors that students and hobbyists can work on. It is basically a light-to-frequency converter, i.e. based on colour and intensity of the light falling on it, the frequency of its output signal varies.

I’ll save you a physics lesson here, but you can find a detailed explanation of colour sensing and the TCS3200 on the Electronics Hub blog.

Raspberry Pi colour sensor

The TCS3200 colour sensor is connected to several of the onboard General Purpose Input Output (GPIO) pins on the Raspberry Pi.

Colour sensing with the TCS3200 Color Sensor and a Raspberry Pi

These connections allow the Raspberry Pi 3 to run one of two Python scripts that Electronics Hub has written for the project. The first displays the RAW RGB values read by the sensor. The second detects the primary colours red, green, and blue, and it can be expanded for more colours with the help of the first script.
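Neither script is reproduced here, but the core of both, selecting a colour filter with the S2/S3 pins and counting output pulses, fits in a few lines of Python. The sketch below is illustrative rather than Electronics Hub's actual code: the BCM pin numbers are made up, and it assumes the sensor's S0/S1 pins are strapped for the 2% output scaling so the pulse rate stays low enough for Python callbacks.

import time
import RPi.GPIO as GPIO

S2, S3, OUT = 23, 24, 25  # hypothetical BCM pin assignments

GPIO.setmode(GPIO.BCM)
GPIO.setup([S2, S3], GPIO.OUT)
GPIO.setup(OUT, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def read_frequency(s2, s3, sample_time=0.1):
    # Select a photodiode filter, then count falling edges on OUT for
    # sample_time seconds; more pulses means more of that colour.
    GPIO.output(S2, s2)
    GPIO.output(S3, s3)
    count = 0
    def on_pulse(channel):
        nonlocal count
        count += 1
    GPIO.add_event_detect(OUT, GPIO.FALLING, callback=on_pulse)
    time.sleep(sample_time)
    GPIO.remove_event_detect(OUT)
    return count / sample_time

# Filter-select values from the TCS3200 datasheet:
# LOW/LOW = red, LOW/HIGH = blue, HIGH/HIGH = green
red = read_frequency(GPIO.LOW, GPIO.LOW)
blue = read_frequency(GPIO.LOW, GPIO.HIGH)
green = read_frequency(GPIO.HIGH, GPIO.HIGH)
print("R:", red, "G:", green, "B:", blue)
GPIO.cleanup()

Comparing the three readings against values sampled from known colour swatches is essentially what the second script's colour detection boils down to.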

Colour sensing with the TCS3200 Color Sensor and a Raspberry Pi

Electronics Hub’s complete build uses a breadboard for simple prototyping

Use it in your projects

This colour sensing setup is a simple means of adding a new dimension to your builds. Why not build a candy-sorting robot that organises your favourite sweets by colour? Or add colour sensing to your line-following buggy to allow for multiple path options!

If your Raspberry Pi project uses colour sensing, we’d love to see it, so be sure to share it in the comments!

The post Colour sensing with a Raspberry Pi appeared first on Raspberry Pi.

Weekly roundup: Fortnite

Post Syndicated from Eevee original https://eev.ee/dev/2018/04/02/weekly-roundup-fortnite/

I skipped a week again because, surprise, I’ve been mostly working on the same game…

  • art: Actually been doing a bit of it! I painted a thing on a whim, and some misc sketches, a few of which I even felt like posting.

  • alice: Finally kind of hit my stride here and wrote, um, a pretty good chunk of stuff. Also played with extending the syntax a bit, and came up with a choice menu that hangs around while the dialogue continues. Kinda cool, though I’m not totally sure what we’ll use it for yet.

    Even with my figuring out how to accelerate, it’s looking like we’ll have to rush if we want to hit our promised date of June 9. So we might delay that a little… maybe even Kickstart some stretch goals? I dunno, I’m leaving that all up to glip and just writing stuff.

  • writing: While I’m at it, I actually picked up and worked on a Twine from ages ago. Cool.

  • idchoppers: Holy moly, it actually works. The basics actually work, at least. I can’t believe how much effort this hecking took.

    I also tried to start putting together an actual map API, with mixed results. And tried to figure out the maximum distance you can jump in Doom, which is surprisingly tricky? Doom physics are super goofy.

  • blog: I actually published a post, which is even tangentially about that idchoppers stuff! Wow! Maybe I’ll do it again, even!

Huh, that almost makes it sound like I’ve been busy.

Simulate sand with Adafruit’s newest project

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/simulate-sand-with-adafruits-newest-project/

The Ruiz brothers at Adafruit have used Phillip Burgess’s PixieDust code to turn a 64×64 LED Matrix and a Raspberry Pi Zero into an awesome sand toy that refuses to defy the laws of gravity. Here’s how to make your own.

BIG LED Sand Toy – Raspberry Pi RGB LED Matrix

Simulated LED Sand Physics! These LEDs interact with motion and look like they’re affected by gravity. An Adafruit LED matrix displays the LEDs as little grains of sand which are driven by sampling an accelerometer with a Raspberry Pi Zero!

Obey gravity

As the latest addition to their online learning system, Adafruit have produced the BIG LED Sand Toy, or as I like to call it, Have you seen this awesome thing Adafruit have made?

Adafruit Sand Toy Raspberry Pi

The build uses a Raspberry Pi Zero, a 64×64 LED matrix, the Adafruit RGB Matrix Bonnet, 3D-printed parts, and a few smaller peripherals. Find the entire tutorial, including downloadable STL files, on their website.

How does it work?

Alongside the aforementioned ingredients, the project utilises the Adafruit LIS3DH Triple-Axis Accelerometer. This sensor is packed with features, and it allows the Raspberry Pi to control the virtual sand depending on how the toy is moved.

Adafruit Sand Toy Raspberry Pi

The Ruiz brothers inserted an SD card loaded with Raspbian Lite into the Raspberry Pi Zero, installed the LED Matrix driver, cloned the Adafruit_PixieDust library, and then just executed the code. They created some preset modes, but once you’re comfortable with the project code, you’ll be able to add your own take on the project.
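If you want a feel for the accelerometer side without the full PixieDust build, sampling the LIS3DH from Python is short. This is a minimal sketch assuming the Adafruit CircuitPython LIS3DH library is installed; it isn't an excerpt from the tutorial.

import time
import board
import busio
import adafruit_lis3dh

i2c = busio.I2C(board.SCL, board.SDA)
accelerometer = adafruit_lis3dh.LIS3DH_I2C(i2c)

while True:
    x, y, z = accelerometer.acceleration  # metres per second squared
    # The sand simulation feeds readings like these into the grains'
    # velocities; here we just print them.
    print("x=%.2f y=%.2f z=%.2f" % (x, y, z))
    time.sleep(0.1)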

Accelerometers and Raspberry Pi

This isn’t the first time a Raspberry Pi has met an accelerometer: the two Raspberry Pis aboard the International Space Station for the Astro Pi mission both have accelerometers thanks to their Sense HATs.

Comprised of a bundle of sensors, an LED matrix, and a five-point joystick, the Sense HAT is a great tool for exploring your surroundings with the Raspberry Pi, as well as for using your surroundings to control the Pi. You can find a whole variety of Sense HAT–based projects and tutorials on our website.

Raspberry Pi Sense HAT Slug free resource

And if you’d like to try out the Sense HAT, including its onboard accelerometer, without purchasing one, head over to our online emulator, or use the emulator preinstalled on Raspbian.
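Reading the Sense HAT's accelerometer takes only a few lines with the sense_hat Python library, and the same code runs against the emulator if you swap the import for sense_emu:

from sense_hat import SenseHat  # or: from sense_emu import SenseHat

sense = SenseHat()
accel = sense.get_accelerometer_raw()  # x, y, z in multiples of g
print(accel['x'], accel['y'], accel['z'])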

The post Simulate sand with Adafruit’s newest project appeared first on Raspberry Pi.

Eevee gained 2791 experience points

Post Syndicated from Eevee original https://eev.ee/blog/2018/01/15/eevee-gained-2791-experience-points/

Eevee grew to level 31!

A year strongly defined by mixed success! Also, a lot of video games.

I ran three game jams, resulting in a total of 157 games existing that may not have otherwise, which is totally mindblowing?!

For GAMES MADE QUICK???, glip and I made NEON PHASE, a short little exploratory platformer. Honestly, I should give myself more credit for this and the rest of the LÖVE games I’ve based on the same codebase — I wove a physics engine (and everything else!) from scratch and it has held up remarkably well for a variety of different uses.

I successfully finished an HD version of Isaac’s Descent using my LÖVE engine, though it doesn’t have anything new over the original and I’ve only released it as a tech demo on Patreon.

For Strawberry Jam (NSFW!) we made fox flux (slightly NSFW!), which felt like a huge milestone: the first game where I made all the art! I mean, not counting Isaac’s Descent, which was for a very limited platform. It’s a pretty arbitrary milestone, yes, but it feels significant. I’ve been working on expanding the game into a longer and slightly less buggy experience, but the art is taking the longest by far. I must’ve spent weeks on player sprites alone.

We then set about working on Bolthaven, a sequel of sorts to NEON PHASE, and got decently far, and then abandoned it. Oops.

We then started a cute little PICO-8 game, and forgot about it. Oops.

I was recruited to help with Chaos Composer, a more ambitious game glip started with someone else in Unity. I had to get used to Unity, and we squabbled a bit, but the game is finally about at the point where it’s “playable” and “maps” can be designed? It’s slightly on hold at the moment while we all finish up some other stuff, though.

We made a birthday game for two of our friends whose birthdays were very close together! Only they got to see it.

For Ludum Dare 38, we made Lunar Depot 38, a little “wave shooter” or whatever you call those? The AI is pretty rough, seeing as this was the first time I’d really made enemies and I had 72 hours to figure out how to do it, but I still think it’s pretty fun to play and I love the circular world.

I made Roguelike Simulator as an experiment with making something small and quick with a simple tool, and I had a lot of fun! I definitely want to do more stuff like this in the future.

And now we’re working on a game about Star Anise, my cat’s self-insert, which is looking to have more polish and depth than anything we’ve done so far! We’ve definitely come a long way in a year.

Somewhere along the line, I put out a call for a “potluck” project, where everyone would give me sprites of a given size without knowing what anyone else had contributed, and I would then make a game using only those sprites. Unfortunately, that stalled a few times: I tried using the Phaser JS library, but we didn’t get along; I tried LÖVE, but didn’t know where to go with the game; and then I decided to use this as an experiment with procedural generation, and didn’t get around to it. I still feel bad that everyone did work for me and I didn’t follow through, but I don’t know whether this will ever become a game.

veekun, alas, consumed months of my life. I finally got Sun and Moon loaded, but it took weeks of work since I was basically reinventing all the tooling we’d ever had from scratch, without even having most of that tooling available as a reference. It was worth it in the end, at least: Ultra Sun and Ultra Moon only took a few days to get loaded. But veekun itself is still missing some obvious Sun/Moon features, and the whole site needs an overhaul, and I just don’t know if I want to dedicate that much time to it when I have so much other stuff going on that’s much more interesting to me right now.

I finally turned my blog into more of a website, giving it a neat front page that lists a bunch of stuff I’ve done. I made a release category at last, though I’m still not quite in the habit of using it.

I wrote some blog posts, of course! I think the most interesting were JavaScript got better while I wasn’t looking and Object models. I was also asked to write a couple pieces for money for a column that then promptly shut down.

On a whim, I made a set of Eevee mugshots for Doom, which I think is a decent indication of my (pixel) art progress over the year?

I started idchoppers, a Doom parsing and manipulation library written in Rust, though it didn’t get very far and I’ve spent most of the time fighting with Rust because it won’t let me implement all my extremely bad ideas. It can do a couple things, at least, like flip maps very quickly and render maps to SVG.

I did toy around with music a little, but not a lot.

I wrote two short twines for Flora. They’re okay. I’m working on another; I think it’ll be better.

I didn’t do a lot of art overall, at least compared to the two previous years; most of my art effort over the year has gone into fox flux, which requires me to learn a whole lot of things. I did dip my toes into 3D modelling, most notably producing my current Twitter banner as well as this cool Star Anise animation. I wouldn’t mind doing more of that; maybe I’ll even try to make a low-poly pixel-textured 3D game sometime.

I restarted my book with a much better concept, though so far I’ve only written about half a chapter. Argh. I see that the vast majority of the work was done within the span of a single week, which is bad since that means I only worked on it for a week, but good since that means I can actually do a pretty good amount of work in only a week. I also did a lot of squabbling with tooling, which is hopefully mostly out of the way now.

My computer broke? That was an exciting week.

A lot of stuff, but the year as a whole still feels hit or miss. All the time I spent on veekun feels like a black void in the middle of the year, which seems like a good sign that I maybe don’t want to pour even more weeks into it in the near future.

Mostly, I want to do: more games, more art, more writing, more music.

I want to try out some tiny game making tools and make some tiny games with them — partly to get exposure to different things, partly to get more little ideas out into the world regularly, and partly to get more practice at letting myself have ideas. I have a couple tools in mind and I guess I’ll aim at a microgame every two months or so? I’d also like to finish the expanded fox flux by the end of the year, of course, though at the moment I can’t even gauge how long it might take.

I seriously lapsed on drawing last year, largely because fox flux pixel art took me so much time. So I want to draw more, and I want to get much faster at pixel art. It would probably help if I had a more concrete goal for drawing, so I might try to draw some short comics and write a little visual novel or something, which would also force me to aim for consistency.

I want to work on my book more, of course, but I also want to try my hand at a bit more fiction. I’ve had a blast writing dialogue for our games! I just shy away from longer-form writing for some reason — which seems ridiculous when a large part of my audience found me through my blog. I do think I’ve had some sort of breakthrough in the last month or two; I suddenly feel a good bit more confident about writing in general and figuring out what I want to say? One recent post I know I wrote in a single afternoon, which virtually never happens because I keep rewriting and rearranging stuff. Again, a visual novel would be a good excuse to practice writing fiction without getting too bogged down in details.

And, ah, music. I shy heavily away from music, since I have no idea what I’m doing, and also I seem to spend a lot of time fighting with tools. (Surprise.) I tried out SunVox for the first time just a few days ago and have been enjoying it quite a bit for making sound effects, so I might try it for music as well. And once again, visual novel background music is a pretty low-pressure thing to compose for. Hell, visual novels are small games, too, so that checks all the boxes. I guess I’ll go make a visual novel.

Here’s to twenty gayteen!

Weekly roundup: AOOOWR

Post Syndicated from Eevee original https://eev.ee/dev/2018/01/09/weekly-roundup-aooowr/

  • anise!!: Work continues! glip is busy with a big Flora update, so I’m left to just do code things in the meantime. I did some refactoring I’d been wanting to do for months (splitting apart the “map” and the “world” and the scene that draws the world), drew some final-ish menu art (it looks so good), switched to a vastly more accurate way to integrate position, added a bunch of transitions that make the game feel way more polished, and drew some pretty slick dialogue boxes. Nice!

    I’ll be continuing to work on this game during GAMES MADE QUICK??? 2.0, my jam for making games while watching AGDQ all week! Maybe join if you’re watching AGDQ all week!

  • art: I tried drawing a picture and this time I liked it. I also drew the header art for the aforementioned game jam, though I didn’t have time to finish it, but I think I pulled off a deliberate-looking scratchy sketchy style that’s appropriate for a game jam? Sure we’ll go with that.

  • blog: I finished a post about picking random numbers and a post about how game physics cheat. Which, ah, catches me up for December! Heck! I think I’ve found a slightly more casual style that feels easier to get down, though?

  • writing: I finally wrangled a sensible outline for a Twine I’ve been dragging my feet on, so now I don’t have any excuses! Oh no!

Playing tic-tac-toe against a Raspberry Pi at Maker Faire

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/tic-tac-toe-maker-faire/

At Maker Faire New York, we met up with student Toby Goebeler of Dover High School, Pennsylvania, to learn more about his Tic-Tac-Toe Robot.

Play Tic-Tac-Toe against a Raspberry Pi #MFNYC

Uploaded by Raspberry Pi on 2017-12-18.

Tic-tac-toe with Dover Robotics

We came to see Toby and Brian Bahn, physics teacher for Dover High School and leader of the Dover Robotics club, so they could tell us about the inner workings of the Tic-Tac-Toe Robot project, and how the Raspberry Pi fit within it. Check out our video for Toby’s explanation of the build and the software controlling it.

Wooden robotic arm — Toby Goebeler Tic-Tac-Toe arm Raspberry Pi

Toby’s original robotic arm prototype used a weight to direct the pen on and off the paper. He later replaced this with a servo motor.

Toby documented the prototyping process for the robot on the Dover Robotics blog. Head over there to hear more about the highs and lows of building a robotic arm from scratch, and about how Toby learned to integrate a Raspberry Pi for both software and hardware control.

Wooden robotic arm playing tic-tac-toe — Toby Goebeler Tic-Tac-Toe arm Raspberry Pi

The finished build is a tic-tac-toe beast, besting everyone who dares to challenge it to a game.

And in case you’re wondering: no, none of the Raspberry Pi team were able to beat the Tic-Tac-Toe Robot when we played against it.

Your turn

We always love seeing Raspberry Pis being used in schools to teach coding and digital making, whether in the classroom or during after-school activities such as the Dover Robotics club and our own Code Clubs and CoderDojos. If you are part of a coding or robotics club, we’d love to hear your story! So make sure to share your experiences and projects in the comments below, or via our social media accounts.

The post Playing tic-tac-toe against a Raspberry Pi at Maker Faire appeared first on Raspberry Pi.

Physics cheats

Post Syndicated from Eevee original https://eev.ee/blog/2018/01/06/physics-cheats/

Anonymous asks:

something about how we tweak physics to “work” better in games?

Ho ho! Work. Get it? Like in physics…?


Hitboxes

“Hitbox” is perhaps not the most accurate term, since the shape used for colliding with the environment and the shape used for detecting damage might be totally different. They’re usually the same in simple platformers, though, and that’s what most of my games have been.

The hitbox is the biggest physics fudge by far, and it exists because of a single massive approximation that (most) games make: you’re controlling a single entity in the abstract, not a physical body in great detail.

That is: when you walk with your real-world meat shell, you perform a complex dance of putting one foot in front of the other, a motion you spent years perfecting. When you walk in a video game, you press a single “walk” button. Your avatar may play an animation that moves its legs back and forth, but since you’re not actually controlling the legs independently (and since simulating them is way harder), the game just treats you like a simple shape. Fairly often, this is a box, or something very box-like.

An Eevee sprite standing on faux ground; the size of the underlying image and the hitbox are outlined

Since the player has no direct control over the exact placement of their limbs, it would be slightly frustrating to have them collide with the world. This is especially true in cases like the above, where the tail and left ear protrude significantly out from the main body. If that Eevee wanted to stand against a real-world wall, she would simply tilt her ear or tail out of the way, so there’s no reason for the ear to block her from standing against a game wall. To compensate for this, the ear and tail are left out of the collision box entirely and will simply jut into a wall if necessary — a goofy affordance that’s so common it doesn’t even register as unusual. As a bonus (assuming this same box is used for combat), she won’t take damage from projectiles that merely graze past an ear.

(One extra consideration for sprite games in particular: the hitbox ought to be horizontally symmetric around the sprite’s pivot — i.e. the point where the entity is truly considered to be standing — so that the hitbox doesn’t abruptly move when the entity turns around!)
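As a rough illustration of both points (a hitbox smaller than the sprite, centered horizontally on the pivot), here's a sketch in Python; the names are mine, not from any particular engine.

class Entity:
    def __init__(self, x, y, hitbox_width, hitbox_height):
        # (x, y) is the pivot: the point where the entity stands.
        self.x = x
        self.y = y
        self.hitbox_width = hitbox_width    # narrower than the sprite,
        self.hitbox_height = hitbox_height  # so ears and tails overlap walls

    def hitbox(self):
        # Centered on the pivot horizontally, so flipping the sprite
        # around never shifts the collision box.
        return (self.x - self.hitbox_width / 2,
                self.y - self.hitbox_height,
                self.hitbox_width,
                self.hitbox_height)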


Corners

Treating the player (and indeed most objects) as a box has one annoying side effect: boxes have corners. Corners can catch on other corners, even by a single pixel. Real-world bodies tend to be a bit rounder and squishier, and can thus tolerate grazing a corner; even real-world boxes will simply rotate a bit.

Ah, but in our faux physics world, we generally don’t want conscious actors (such as the player) to rotate, even with a realistic physics simulator! Real-world bodies are made of parts that will generally try to keep you upright, after all; you don’t tilt back and forth much.

One way to handle corners is to simply remove them from conscious actors. A hitbox doesn’t have to be a literal box, after all. A popular alternative — especially in Unity where it’s a standard asset — is the pill-shaped capsule, which has semicircles/hemispheres on the top and bottom and a cylindrical body in 3D. No corners, no problem.

Of course, that introduces a new problem: now the player can’t balance precariously on edges without their rounded bottom sliding them off. Alas.

If you’re stuck with corners, then, you may want to use a corner bump, a term I just made up. If the player would collide with a corner, but the collision is only by a few pixels, just nudge them to the side a bit and carry on.

An Eevee sprite trying to move sideways into a shallow ledge; the game bumps her upwards slightly, so she steps onto it instead

When the corner is horizontal, this creates stairs! This is, more or less kinda, how steps work in Doom: when the player tries to cross from one sector into another, if the height difference is 24 units or less, the game simply bumps them upwards to the height of the new floor and lets them continue on.

Implementing this in a game without Doom’s notion of sectors is a little trickier. In fact, I still haven’t done it. Collision detection based on rejection gets it for free, kinda, but it’s not very deterministic and it breaks other things. But that’s a whole other post.
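For what it's worth, here's roughly what a corner bump looks like with plain axis-aligned boxes. This is a sketch of the idea rather than working engine code: solid_at stands in for whatever collision query your engine provides, and BUMP_LIMIT is an arbitrary tuning value.

BUMP_LIMIT = 4  # maximum pixels to nudge; purely a tuning choice

def try_move_x(player, dx, solid_at):
    # Move horizontally; if blocked only by a shallow corner, nudge
    # the player up and over it instead of stopping dead.
    if not solid_at(player.x + dx, player.y):
        player.x += dx
        return
    for bump in range(1, BUMP_LIMIT + 1):
        if not solid_at(player.x + dx, player.y - bump):
            player.x += dx
            player.y -= bump  # stepped up: instant stairs
            return
    # Genuinely blocked; don't move.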


Gravity

Gravity is pretty easy. Everything accelerates downwards all the time. What’s interesting are the exceptions.


Jumping

Jumping is a giant hack.

Think about how actual jumping works: you tense your legs, which generally involves bending your knees first, and then spring upwards. In a platformer, you can just leap whenever you feel like it, which is nonsense. Also you go like twenty feet into the air?

Worse, most platformers allow variable-height jumping, where your jump is lower if you let go of the jump button while you’re in the air. Normally, one would expect to have to decide how much force to put into the jump beforehand.

But of course this is about convenience of controls: when jumping is your primary action, you want to be able to do it immediately, without any windup for how high you want to jump.

(And then there’s double jumping? Come on.)
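One common way to implement the variable-height part is to launch at full speed and then cut the upward velocity when the button is released early. A sketch, not any specific game's code:

JUMP_SPEED = 400.0  # made-up units per second
JUMP_CUT = 0.5      # fraction of upward speed kept on early release

def on_jump_pressed(player):
    if player.on_ground:
        player.vy = -JUMP_SPEED  # y axis points downwards here

def on_jump_released(player):
    if player.vy < 0:            # still rising
        player.vy *= JUMP_CUT    # shortens the jump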

Air control is a similar phenomenon: usually you’d jump in a particular direction by controlling how you push off the ground with your feet, but in a video game, you don’t have feet! You only have the box. The compromise is to let you control your horizontal movement to a limited degree in midair, even though that doesn’t make any sense. (It’s way more fun, though, and overall gives you more movement options, which are good to have in an interactive medium.)

Air control also exposes an obvious place that game physics collide with the realistic model of serious physics engines. I’ve mentioned this before, but: if you use Real Physics™ and air control yourself into a wall, you might find that you’ll simply stick to the wall until you let go of the movement buttons. Why? Remember, player movement acts as though an external force were pushing you around (and from the perspective of a Real™ physics engine, this is exactly how you’d implement it) — so air-controlling into a wall is equivalent to pushing a book against a wall with your hand, and the friction with the wall holds you in place. Oops.

Ground sticking

Another place game physics conflict with physics engines is with running to the top of a slope. On a real hill, of course, you land on top of the slope and are probably glad of it; slopes are hard to climb!

An Eevee moves to the top of a slope, and rather than step onto the flat top, she goes flying off into the air

In a video game, you go flying. Because you’re a box. With momentum. So you hit the peak and keep going in the same direction. Which is diagonally upwards.


Projectiles

To make them more predictable, projectiles generally aren’t subject to gravity, at least as far as I’ve seen. The real world does not have such an exemption. The real world imposes gravity even on sniper rifles, which in a video game are often implemented as an instant trace, unaffected by anything in the world, because the bullet never actually exists as an object.


Slopes

Ah. Welcome to hell.


Water

Water is an interesting case, and offhand I don’t know the gritty details of how games implement it. In the real world, water applies a resistant drag force to movement — and that force is proportional to the square of velocity, which I’d completely forgotten until right now. I am almost positive that no game handles that correctly. But then, in real-world water, you can push against the water itself for movement, and games don’t simulate that either. What’s the rough equivalent?

The Sonic Physics Guide suggests that Sonic handles it by basically halving everything: acceleration, max speed, friction, etc. When Sonic enters water, his speed is cut; when Sonic exits water, his speed is increased.

That last bit feels validating — I could swear Metroid Prime did the same thing, and built my own solution around it, but couldn’t remember for sure. It makes no sense, of course, for a jump to become faster just because you happened to break the surface of the water, but it feels fantastic.

The thing I did was similar, except that I didn’t want to add a multiplier in a dozen places when you happen to be underwater (and remember which ones need it to be squared, etc.). So instead, I calculate everything completely as normal, so velocity is exactly the same as it would be on dry land — but the distance you would move gets halved. The effect seems to be pretty similar to most platformers with water, at least as far as I can tell. It hasn’t shown up in a published game and I only added this fairly recently, so I might be overlooking some reason this is a bad idea.

(One reason that comes to mind is that velocity is now a little white lie while underwater, so anything relying on velocity for interesting effects might be thrown off. Or maybe that’s correct, because velocity thresholds should be halved underwater too? Hm!)
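In code, the difference from the usual approach is just where the halving happens. Something like this sketch, where integrate and move_and_collide stand in for whatever your engine actually does:

WATER_FACTOR = 0.5

def update(player, dt):
    integrate(player, dt)     # velocity computed exactly as on dry land
    dx = player.vx * dt
    dy = player.vy * dt
    if player.underwater:
        dx *= WATER_FACTOR    # distance is halved; velocity is untouched
        dy *= WATER_FACTOR
    move_and_collide(player, dx, dy)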

Notably, air is also a fluid, so it should behave the same way (just with different constants). I definitely don’t think any games apply air drag that’s proportional to the square of velocity.


Friction

Friction is, in my experience, a little handwaved. Probably because real-world friction is so darn complicated.

Consider that in the real world, we want very high friction on the surfaces we walk on — shoes and tires are explicitly designed to increase it, even. We move by bracing a back foot against the ground and using that to push ourselves forward, so we want the ground to resist our push as much as possible.

In a game world, we are a box. We move by being pushed by some invisible outside force, so if the friction between ourselves and the ground is too high, we won’t be able to move at all! That’s complete nonsense physically, but it turns out to be handy in some cases — for example, highish friction can simulate walking through deep mud, which should be difficult due to fluid drag and low friction.

But the best-known example of the fakeness of game friction is video game ice. Walking on real-world ice is difficult because the low friction means low grip; your feet are likely to slip out from under you, and you’ll simply fall down and have trouble moving at all. In a video game, you can’t fall down, so you have the opposite experience: you spend most of your time sliding around uncontrollably. Yet ice is so common in video games (and perhaps so uncommon in places I’ve lived) that I, at least, had never really thought about this disparity until an hour or so ago.

Game friction vs real-world friction

Real-world friction is a force. It’s the normal force (the force with which the surface pushes back perpendicular to itself) times some constant that depends on how the two materials interact.

Force is mass times acceleration, and platformers often ignore mass, so friction ought to be an acceleration — applied against the object’s movement, but never enough to push it backwards.

I haven’t made any games where variable friction plays a significant role, but my gut instinct is that low friction should mean the player accelerates more slowly but has a higher max speed, and high friction should mean the opposite. I see from my own source code that I didn’t even do what I just said, so let’s defer to some better-made and well-documented games: Sonic and Doom.

In Sonic, friction is a fixed value subtracted from the player’s velocity (regardless of direction) each tic. Sonic has a fixed framerate, so the units are really pixels per tic squared (i.e. acceleration), multiplied by an implicit 1 tic per tic. So far, so good.

But Sonic’s friction only applies if the player isn’t pressing left or right. Hang on, that isn’t friction at all; that’s just deceleration! That’s equivalent to jogging to a stop. If friction were lower, Sonic would take longer to stop, but otherwise this is only tangentially related to friction.

(In fairness, this approach would decently emulate friction for non-conscious sliding objects, which are never going to be pressing movement buttons. Also, we don’t have the Sonic source code, and the name “friction” is a fan invention; the Sonic Physics Guide already uses “deceleration” to describe the player’s acceleration when turning around.)
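In sketch form, Sonic's scheme looks something like this; the constant is the ground friction value the Sonic Physics Guide lists, and the rest is my own reconstruction:

FRICTION = 0.046875  # pixels per tic², per the Sonic Physics Guide

def apply_friction(vx, pressing_left, pressing_right):
    if pressing_left or pressing_right:
        return vx  # no "friction" while the player is steering
    if vx > 0:
        return max(0.0, vx - FRICTION)  # decelerate, but never reverse
    else:
        return min(0.0, vx + FRICTION)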

Okay, let’s try Doom. In Doom, the default friction is 90.625%.

Hang on, what?

Yes, in Doom, friction is a multiplier applied every tic. Doom runs at 35 tics per second, so this is a multiplier of 0.032 per second. Yikes!

This isn’t anything remotely like real friction, but it’s much easier to implement. With friction as acceleration, the game has to know both the direction of movement (so it can apply friction in the opposite direction) and the magnitude (so it doesn’t overshoot and launch the object in the other direction). That means taking a semi-costly square root and also writing extra code to cap the amount of friction. With a multiplier, neither is necessary; just multiply the whole velocity vector and you’re done.
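Here's the difference in sketch form: the first version needs the square root and the cap, the second is a single multiply per axis. The structure is illustrative, not lifted from Doom's source.

import math

def friction_as_acceleration(vx, vy, friction, dt):
    speed = math.hypot(vx, vy)        # the semi-costly square root
    if speed == 0:
        return vx, vy
    drop = min(speed, friction * dt)  # cap so we never overshoot zero
    scale = (speed - drop) / speed
    return vx * scale, vy * scale

def friction_as_multiplier(vx, vy, friction=0.90625):
    # Doom-style: one multiply per axis per tic, no sqrt, no cap.
    return vx * friction, vy * friction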

There are some downsides. One is that objects will never actually stop, since multiplying by 3% repeatedly will never produce a result of zero — though eventually the speed will become small enough to either slip below a “minimum speed” threshold or simply no longer fit in a float representation. Another is that the units are fairly meaningless: with Doom’s default friction of 90.625%, about how long does it take for the player to stop? I have no idea, partly because “stop” is ambiguous here! If friction were an acceleration, I could divide it into the player’s max speed to get a time.

All that aside, what are the actual effects of changing Doom’s friction? What an excellent question that’s surprisingly tricky to answer. (Note that friction can’t be changed in original Doom, only in the Boom port and its derivatives.) Here’s what I’ve pieced together.

Doom’s “friction” is really two values. “Friction” itself is a multiplier applied to moving objects on every tic, but there’s also a move factor which defaults to \(\frac{1}{32} = 0.03125\) and is derived from friction for custom values.

Every tic, the player’s velocity is multiplied by friction, and then increased by their speed times the move factor.

\[ v(n) = v(n-1) \times \text{friction} + \text{speed} \times \text{move factor} \]

Eventually, the reduction from friction will balance out the speed boost. That happens when \(v(n) = v(n-1)\), so we can rearrange it to find the player’s effective max speed:

\[
\begin{aligned}
v &= v \times \text{friction} + \text{speed} \times \text{move factor} \\
v - v \times \text{friction} &= \text{speed} \times \text{move factor} \\
v &= \text{speed} \times \frac{\text{move factor}}{1 - \text{friction}}
\end{aligned}
\]

For vanilla Doom’s move factor of 0.03125 and friction of 0.90625, that becomes:

\[ v = \text{speed} \times \frac{\frac{1}{32}}{1 - \frac{29}{32}} = \text{speed} \times \frac{\frac{1}{32}}{\frac{3}{32}} = \frac{1}{3} \times \text{speed} \]

Curiously, “speed” is three times the maximum speed an actor can actually move. Doomguy’s run speed is 50, so in practice he moves a third of that, or 16⅔ units per tic. (Of course, this isn’t counting SR50, a bug that lets Doomguy run ~40% faster than intended diagonally.)

So now, what if you change friction? Even more curiously, the move factor is calculated completely differently depending on whether friction is higher or lower than the default Doom amount:

\[
\text{move factor} = \begin{cases}
\frac{133 - 128 \times \text{friction}}{544} \approx 0.244 - 0.235 \times \text{friction} & \text{if friction} \ge \frac{29}{32} \\
\frac{81920 \times \text{friction} - 70145}{1048576} \approx 0.078 \times \text{friction} - 0.067 & \text{otherwise}
\end{cases}
\]

That’s pretty weird? Complicating things further is that low friction (which means muddy terrain, remember) has an extra multiplier on its move factor, depending on how fast you’re already going — the idea is apparently that you have a hard time getting going, but it gets easier as you find your footing. The extra multiplier maxes out at 8, which makes the two halves of that function meet at the vanilla Doom value.

A graph of the relationship between friction and move factor

That very top point corresponds to the move factor from the original game. So no matter what you do to friction, the move factor becomes lower. At 0.85 and change, you can no longer move at all; below that, you move backwards.

From the formula above, it’s easy to see what changes to friction and move factor will do to Doomguy’s stable velocity. Move factor is in the numerator, so increasing it will increase stable velocity — but it can’t increase, so stable velocity can only ever decrease. Friction is in the denominator, but it’s subtracted from 1, so increasing friction will make the denominator a smaller value less than 1, i.e. increase stable velocity. Combined, we get this relationship between friction and stable velocity.

A graph showing stable velocity shooting up dramatically as friction increases

As friction approaches 1, stable velocity grows without bound. This makes sense, given the definition of \(v(n)\) — if friction is 1, the velocity from the previous tic isn’t reduced at all, so we just keep accelerating freely.

All of this is why I’m wary of using multipliers.

Anyway, this leaves me with one last question about the effects of Doom’s friction: how long does it take to reach stable velocity? Barring precision errors, we’ll never truly reach stable velocity, but let’s say within 5%. First we need a closed formula for the velocity after some number of tics. This is a simple recurrence relation, and you can write a few terms out yourself if you want to be sure this is right.

\[ v(n) = v_0 \times \text{friction}^n + \text{speed} \times \text{move factor} \times \frac{\text{friction}^n - 1}{\text{friction} - 1} \]

Our initial velocity is zero, so the first term disappears. Set this equal to the stable formula and solve for n:

\[
\begin{aligned}
\text{speed} \times \text{move factor} \times \frac{\text{friction}^n - 1}{\text{friction} - 1} &= (1 - 5\%) \times \text{speed} \times \frac{\text{move factor}}{1 - \text{friction}} \\
\text{friction}^n - 1 &= -(1 - 5\%) \\
n &= \frac{\ln 5\%}{\ln \text{friction}}
\end{aligned}
\]

“Speed” and move factor disappear entirely, which makes sense, and this is purely a function of friction (and how close we want to get). For vanilla Doom, that comes out to 30.4, which is a little less than a second. For other values of friction:

A graph of time to stability which leaps upwards dramatically towards the right

As friction increases (which in Doom terms means the surface is more slippery), it takes longer and longer to reach stable speed, which is in turn greater and greater. For lesser friction (i.e. mud), stable speed is lower, but reached fairly quickly. (Of course, the extra “getting going” multiplier while in mud adds some extra time here, but including that in the graph is a bit more complicated.)

I think this matches with my instincts above. How fascinating!
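If you'd rather let Python check the arithmetic, the vanilla numbers drop straight out of the formulas above:

import math

friction = 29 / 32       # 0.90625, vanilla Doom
move_factor = 1 / 32     # 0.03125
speed = 50               # Doomguy's run "speed"

stable = speed * move_factor / (1 - friction)
print(stable)            # 16.666..., i.e. a third of speed

tics = math.log(0.05) / math.log(friction)
print(tics)              # ≈ 30.4 tics to reach 95% of stable speed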

What’s that? This is way too much math and you hate it? Then don’t use multipliers in game physics.


That was a hell of a diversion!

I guess the goofiest stuff in basic game physics is really just about mapping player controls to in-game actions like jumping and deceleration; the rest consists of hacks to compensate for representing everything as a box.

Weekly roundup: Anise’s very own video game

Post Syndicated from Eevee original https://eev.ee/dev/2018/01/01/weekly-roundup-anises-very-own-video-game/

Happy new year! 🎆

In an unprecedented move, I did one thing for an entire calendar week. I say “unprecedented” but I guess the same thing happened with fox flux. And NEON PHASE. Hmm. Sensing a pattern. See if you can guess what the one thing was!

  • anise!!: Wow! It’s Anise! The game has come so far that I can’t even believe that any of this was a recent change. I made monster AI vastly more sensible, added a boatload of mechanics, fleshed out more than half the map (and sketched out the rest), and drew and implemented most of a menu with a number of excellent goodies. Also, FINALLY (after a full year of daydreaming about it), eliminated the terrible “clock” structure I invented for collision detection, as well as cut down on a huge source of completely pointless allocations, which sped physics up in general by at least 10% and cut GC churn significantly. Hooray! And I’ve done even more just in the last day and a half. Still a good bit of work left, but this game is gonna be fantastic.

  • art: Oh right I tried drawing a picture but I didn’t like it so I stopped.

I have some writing to catch up on — I have several things 80% written, but had to stop because I was just starting to get a cold and couldn’t even tell if my own writing was sensible any more. And then I had to work on a video game about my cat. Sorry. Actually, not sorry, video games about my cat are always top priority. You knew what you were signing up for.

Pioneers winners: only you can save us

Post Syndicated from Erin Brindley original https://www.raspberrypi.org/blog/pioneers-winners-only-you-can-save-us/

She asked for help, and you came to her aid. Pioneers, the winners of the Only you can save us challenge have been picked!

Can you see me? Only YOU can save us!

I need your help. This is a call out for those between 11 and 16 years old in the UK and Republic of Ireland. Something has gone very, very wrong and only you can save us. I’ve collected together as much information for you as I can. You’ll find it at http://www.raspberrypi.org/pioneers.

The challenge

In August we intercepted an emergency communication from a lonesome survivor. She seemed to be in quite a bit of trouble, and asked all you young people aged 11 to 16 to come up with something to help tackle the oncoming crisis, using whatever technology you had to hand. You had ten weeks to work in teams of two to five with an adult mentor to fulfil your mission.

The judges

We received your world-saving ideas, and our savvy survivor pulled together a ragtag bunch of apocalyptic experts to help us judge which ones would be the winning entries.

Dr Shini Somara

Dr Shini Somara is an advocate for STEM education and a mechanical engineer. She was host of The Health Show and has appeared in documentaries for the BBC, PBS Digital, and Sky. You can check out her work hosting Crash Course Physics on YouTube.

Prof Lewis Dartnell is an astrobiologist and author of the book The Knowledge: How to Rebuild Our World From Scratch.

Emma Stephenson has a background in aeronautical engineering and currently works in the Shell Foundation’s Access to Energy and Sustainable Mobility portfolio.

Currently sifting through the entries with the other judges of #makeyourideas with @raspberrypifoundation @_raspberrypi_


The winners

Our survivor is currently putting your entries to good use repairing, rebuilding, and defending her base. Our judges chose the following projects as outstanding examples of world-saving digital making.

Theme winner: Computatron

Raspberry Pioneers 2017 – Nerfus Dislikus Killer Robot

This is our entry to the pioneers ‘Only you can save us’ competition. Our team name is Computatrum. Hope you enjoy!

Are you facing an unknown enemy whose only weakness is Nerf bullets? Then this is the robot for you! We loved the especially apocalyptic feel of the Computatron’s cleverly hacked and repurposed elements. The team even used an old floppy disc mechanism to help fire their bullets!

Technically brilliant: Robot Apocalypse Committee

Pioneers Apocalypse 2017 – RationalPi

Thousands of lines of code… Many sheets of acrylic… A camera, touchscreen and fingerprint scanner… This is our entry into the Raspberry Pi Pioneers2017 ‘Only YOU can Save Us’ theme. When zombies or other survivors break into your base, you want a secure way of storing your crackers.

The Robot Apocalypse Committee is back, and this time they’ve brought cheese! The crew designed a cheese- and cracker-dispensing machine complete with face and fingerprint recognition to ensure those rations last until the next supply drop.

Best explanation: Pi Chasers

Tala – Raspberry Pi Pioneers Project

Hi! We are PiChasers and we entered the Raspberry Pi Pioneers challenge last time when the theme was “Make it Outdoors!” but now we’ve been faced with another theme “Apocalypse”. We spent a while thinking of an original thing that would help in an apocalypse and decided upon a ‘text-only phone’ which uses local radio communication rather than cellular.

This text-based communication device encased in a tupperware container could be a lifesaver in a crisis! And luckily, the Pi Chasers produced an excellent video and amazing GitHub repo, ensuring that any and all survivors will be able to build their own in the safety of their base.

Most inspiring journey: Three Musketeers

Pioneers Entry – The Apocalypse

Pioneers Entry Team Name: The Three Musketeers Team Participants: James, Zach and Tom

We all know that zombies are terrible at geometry, and the Three Musketeers used this fact to their advantage when building their zombie security system. We were impressed to see the team working together to overcome the roadblocks they faced along the way.

We appreciate what you’re trying to do: Zombie Trolls

Zombie In The Middle

Uploaded by CDA Bodgers on 2017-12-01.

Playing piggy in the middle with zombies sure is a unique way of saving humankind from total extinction! We loved this project idea, and although the Zombie Trolls had a little trouble with their motors, we’re sure with a little more tinkering this zombie-fooling contraption could save us all.

Most awesome

Our judges also wanted to give a special commendation to the following teams for their equally awesome apocalypse-averting ideas:

  • PiRates, for their multifaceted zombie-proofing defence system and the high production value of their video
  • Byte them Pis, for their beautiful zombie-detecting doormat
  • Unatecxon, for their impressive bunker security system
  • Team Crompton, for their pressure-activated door system
  • Team Ernest, for their adventures in LEGO

The prizes

All our winning teams have secured exclusive digital maker boxes. These are jam-packed with tantalising tech to satisfy all tinkering needs.

Our theme winners have also secured themselves a place at Coolest Projects 2018 in Dublin, Ireland!

Thank you to everyone who got involved in this round of Pioneers. Look out for your awesome submission swag arriving in the mail!

The post Pioneers winners: only you can save us appeared first on Raspberry Pi.

Weekly roundup: VK Ultra

Post Syndicated from Eevee original https://eev.ee/dev/2017/11/27/weekly-roundup-vk-ultra/

  • fox flux: Cleaned up and committed the “heart get” overlay and worked on some more art for it. Diagnosed a very obscure physics problem, but didn’t come up with a good solution yet; physics is hard! Drew a very good tree trunk to use as a spawn point; also worked on some background foliage, though less successfully. Played with colors a bit. Tried to work out a tileset for underground areas.

  • music: I wrote like half of a little chiptune song that I actually like so far! I’m now seriously toying with the idea of doing my own music for fox flux. Played a bit with more sound effects, too.

  • blog: I wrote up the Eevee mugshot set for Doom I made, as an inaugural post for the release category.

  • veekun: Finished up Ultra Sun and Ultra Moon! Pokémon sprites, box sprites, item sprites, and the same data as Sun/Moon. I say “finished” but of course plenty of stuff is still missing, alas.

  • cc: I’m trying to make glip some building blocks so that they can actually start building the game, so I made some breakable blocks. Also wrote a little shader for implementing their parallax background, which involves a bunch of layer modes.

  • misc: I got a new keyboard. Also I installed umatrix because noscript’s web extension version is half-broken and driving me up the wall. Sorry, noscript.

Huh, that’s not a bad haul, despite a few nights of incredibly bad sleep. Cool.

Presenting Amazon Sumerian: An easy way to create VR, AR, and 3D experiences

Post Syndicated from Tara Walker original https://aws.amazon.com/blogs/aws/launch-presenting-amazon-sumerian/

If you have had an opportunity to read any of my blog posts or attended any session I’ve conducted at various conferences, you are probably aware that I am definitively a geek girl. I am absolutely enamored with all of the latest advancements that have been made in technology areas like cloud, artificial intelligence, the internet of things, and the maker space, as well as with virtual reality and augmented reality. In my opinion, it is a wonderful time to be a geek. All the things that we dreamed about building while we sweated through our algorithms and discrete mathematics classes or the technology we marveled at when watching Star Wars and Star Trek are now coming to fruition. So hopefully this means it will only be a matter of time before I can hyperdrive to other galaxies in space, but until then I can at least build the 3D virtual reality and augmented reality characters and images like those featured in some of my favorite shows.

Amazon Sumerian provides tools and resources that allow anyone to create and run augmented reality (AR), virtual reality (VR), and 3D applications with ease. With Sumerian, you can build multi-platform experiences that run on hardware like the Oculus, HTC Vive, and iOS devices using WebVR compatible browsers, with support for ARCore on Android devices coming soon.

This exciting new service, currently in preview, delivers features to allow you to design highly immersive and interactive 3D experiences from your browser. Some of these features are:

  • Editor: A web-based editor for constructing 3D scenes, importing assets, and scripting interactions and special effects, with cross-platform publishing.
  • Object Library: A library of pre-built objects and templates.
  • Asset Import: Upload 3D assets to use in your scene. Sumerian supports importing FBX and OBJ, with support for Unity projects coming soon.
  • Scripting Library: A JavaScript scripting library, provided via the 3D engine, for advanced scripting capabilities.
  • Hosts: Animated, lifelike 3D characters that can be customized for gender, voice, and language.
  • AWS Services Integration: Baked-in integration with Amazon Polly and Amazon Lex to add speech and natural language into Sumerian hosts. Additionally, the scripting library can be used with AWS Lambda, allowing use of the full range of AWS services.

Since Amazon Sumerian doesn’t require you to have 3D graphics or programming experience to build rich, interactive VR and AR scenes, let’s take a quick run to the Sumerian Dashboard and check it out.

From the Sumerian Dashboard, I can easily create a new scene with a push of a button.

A default view of the new scene opens and is displayed in the Sumerian Editor. With the Tara Blog Scene opened in the editor, I can easily import assets into my scene.

I’ll click the Import Asset button and pick an asset, View Room, to import into the scene. With the desired asset selected, I’ll click the Add button to import it.

Excellent, my asset was successfully imported into the Sumerian Editor and is shown in the Asset panel.  Now, I have the option to add the View Room object into my scene by selecting it in the Asset panel and then dragging it onto the editor’s canvas.

I’ll repeat the import asset process and this time I will add the Mannequin asset to the scene.

Additionally, with Sumerian, I can add scripting to Entity assets to make my scene even more exciting by adding a ScriptComponent to an entity and creating a script.  I can use the provided built-in scripts or create my own custom scripts. If I create a new custom script, I will get a blank script with some base JavaScript code that looks similar to the code below.

'use strict';
/* global sumerian */
// This is Me -- trying out the custom scripts - Tara

var setup = function (args, ctx) {
  // Called when play mode starts.
};

var fixedUpdate = function (args, ctx) {
  // Called on every physics update, after setup().
};

var update = function (args, ctx) {
  // Called on every render frame, after setup().
};

var lateUpdate = function (args, ctx) {
  // Called after all script "update" methods in the scene have been called.
};

var cleanup = function (args, ctx) {
  // Called when play mode stops.
};

var parameters = [];

Very cool, I just created a 3D scene using Amazon Sumerian in a matter of minutes and I have only scratched the surface.


The Amazon Sumerian service enables you to create, build, and run virtual reality (VR), augmented reality (AR), and 3D applications with ease.  You don’t need any 3D graphics or specialized programming knowledge to get started building scenes and immersive experiences.  You can import FBX, OBJ, and Unity projects in Sumerian, as well as upload your own 3D assets for use in your scene. In addition, you can create digital characters to narrate your scene and with these digital assets, you have choices for the character’s appearance, speech and behavior.

You can learn more about Amazon Sumerian and sign up for the preview to get started with the new service on the product page.  I can’t wait to see what rich experiences you all will build.



Community Profile: Matthew Timmons-Brown

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/community-profile-matthew-timmons-brown/

This column is from The MagPi issue 57. You can download a PDF of the full issue for free, or subscribe to receive the print edition in your mailbox or the digital edition on your tablet. All proceeds from the print and digital editions help the Raspberry Pi Foundation achieve its charitable goals.

“I first set up my YouTube channel because I noticed a massive lack of video tutorials for the Raspberry Pi,” explains Matthew Timmons-Brown, known to many as The Raspberry Pi Guy. At 18 years old, the Cambridge-based student has more than 60 000 subscribers to his channel, making his account the most successful Raspberry Pi–specific YouTube account to date.

Matthew Timmons-Brown

Matt gives a talk at the Raspberry Pi 5th Birthday weekend event

The Raspberry Pi Guy

If you’ve attended a Raspberry Pi event, there’s a good chance you’ve already met Matt. And if not, you’ll have no doubt come across one or more of his tutorials and builds online. On more than one occasion, his work has featured on the Raspberry Pi blog, with his yearly Raspberry Pi roundup videos being a staple of the birthday celebrations.

Matthew Timmons-Brown

With his website, Matt aimed to collect together “the many strands of The Raspberry Pi Guy” into one, neat, cohesive resource — and it works. From newcomers to the credit card-sized computer to hardened Pi veterans, The Raspberry Pi Guy offers aid and inspiration for many. Looking for a review of the Raspberry Pi Zero W? He’s filmed one. Looking for a step-by-step guide to building a Pi-powered Amazon Alexa? No problem, there’s one of those too.

Make your Raspberry Pi artificially intelligent! – Amazon Alexa Personal Assistant Tutorial

Artificial Intelligence. A hefty topic that has dominated the field since computers were first conceived. What if I told you that you could put an artificial intelligence service on your own $30 computer?! That’s right! In this tutorial I will show you how to create your own artificially intelligent personal assistant, using Amazon’s Alexa voice recognition and information service!

Raspberry Pi electric skateboard

Last summer, Matt introduced the world to his Raspberry Pi-controlled electric skateboard, soon finding himself plastered over local press as well as the BBC and tech sites like Adafruit and geek.com. And there’s no question as to why the build was so popular. With YouTubers such as Casey Neistat increasing the demand for electric skateboards on a near-daily basis, the call for a cheaper, home-brew version has quickly grown.


Over the summer, I made my own electric skateboard using a £4 Raspberry Pi Zero. Controlled with a Nintendo Wiimote, capable of going 30km/h, and with a range of over 10km, this project has been pretty darn fun. In this video, you see me racing around Cambridge and I explain the ins and outs of this project.

Using a Raspberry Pi Zero, a Nintendo Wii Remote, and a little help from members of the Cambridge Makespace community, Matt built a board capable of reaching 30km/h, with a battery range of 10km per charge. Alongside Neistat, Matt attributes the project inspiration to Australian student Tim Maier, whose build we previously covered in The MagPi.

Matthew Timmons-Brown and Eben Upton standing in a car park looking at a smartphone


Despite the success and the fun of the electric skateboard (including convincing Raspberry Pi Trading CEO Eben Upton to have a go for local television news coverage), the project Matt is most proud of is his wireless LiDAR system for theoretical use on the Mars rovers.

Matthew Timmons-Brown's LiDAR project for scanning terrains with lasers

Using a tablet app to define the angles, Matt’s A Level coursework LiDAR build scans the surrounding area, returning the results to the touchscreen, where they can be manipulated by the user. With his passion for the cosmos and the International Space Station, it’s no wonder that this is Matt’s proudest build.

Built for his A Level Computer Science coursework, the build demonstrates Matt’s passion for space and physics. Used as a means of surveying terrain, LiDAR uses laser light to measure distance, allowing users to create 3D-scanned, high-resolution maps of a specific area. It is a perfect technology for exploring unknown worlds.

Matthew Timmons-Brown and two other young people at a reception in the Houses of Parliament

Matt was invited to St James’s Palace and the Houses of Parliament as part of the Raspberry Pi community celebrations in 2016

Joining the community

In a recent interview at Hills Road Sixth Form College, where he is studying mathematics, further mathematics, physics, and computer science, Matt revealed where his love of electronics and computer science started. “I originally became interested in computer science in 2012, when I read a tiny magazine article about a computer that I would be able to buy with pocket money. This was a pretty exciting thing for a 12-year-old! Your own computer… for less than £30?!” He went on to explain how it became his mission to learn all he could on the subject and how, months later, his YouTube channel came to life, cementing him firmly into the Raspberry Pi community.

The post Community Profile: Matthew Timmons-Brown appeared first on Raspberry Pi.

Coaxing 2D platforming out of Unity

Post Syndicated from Eevee original https://eev.ee/blog/2017/10/13/coaxing-2d-platforming-out-of-unity/

An anonymous donor asked a question that I can’t even begin to figure out how to answer, but they also said anything else is fine, so here’s anything else.

I’ve been avoiding writing about game physics, since I want to save it for ✨ the book I’m writing ✨, but that book will almost certainly not touch on Unity. Here, then, is a brief run through some of the brick walls I ran into while trying to convince Unity to do 2D platforming.

This is fairly high-level — a few illustrative sketches rather than drop-in code or helpful diagrams. I’m just getting this out of my head because it’s interesting. If you want more gritty details, I guess you’ll have to wait for ✨ the book ✨.

The setup

I hadn’t used Unity before. I hadn’t even used a “real” physics engine before. My games so far have mostly used LÖVE, a Lua-based engine. LÖVE includes Box2D bindings, but for various reasons (not all of them good), I opted to avoid them and instead write my own physics completely from scratch. (How, you ask? ✨ Book ✨!)

I was invited to work on a Unity project, Chaos Composer, that someone else had already started. It had basic movement already implemented; I taught myself Unity’s physics system by hacking on it. It’s entirely possible that none of this is actually the best way to do anything, since I was really trying to reproduce my own homegrown stuff in Unity, but it’s the best I’ve managed to come up with.

Two recurring snags were that you can’t ask Unity to do multiple physics updates in a row, and sometimes getting the information I wanted was difficult. Working with my own code spoiled me a little, since I could invoke it at any time and ask it anything I wanted; Unity, on the other hand, is someone else’s black box with a rigid interface on top.

Also, wow, Googling for a lot of this was not quite as helpful as expected. A lot of what’s out there is just the first thing that works, and often that’s pretty hacky and imposes severe limits on the game design (e.g., “this won’t work with slopes”). Basic movement and collision are the first thing you do, which seems to me like the worst time to be locking yourself out of a lot of design options. I tried very (very, very, very) hard to minimize those kinds of constraints.

Problem 1: Movement

When I showed up, movement was already working. Problem solved!

Like any good programmer, I immediately set out to un-solve it. Given a “real” physics engine like the one Unity prominently features, you have two options: ⓐ treat the player as a physics object, or ⓑ don’t. The existing code went with option ⓑ, like I’d done myself with LÖVE, and like I’d seen countless people advise: using a physics sim makes for bad platforming.

But… why? I believed it, but I couldn’t concretely defend it. I had to know for myself. So I started a blank project, drew some physics boxes, and wrote a dozen-line player controller.
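
For the record, a minimal reconstruction of what such a controller looks like; the class and field names here are mine, invented for illustration, not the actual code:

```csharp
using UnityEngine;

// Naive approach: the player is an ordinary dynamic body, shoved
// around with forces like any other physics object.
public class NaivePlayerController : MonoBehaviour
{
    public float pushForce = 50f;
    Rigidbody2D body;

    void Start()
    {
        body = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate()
    {
        float input = Input.GetAxisRaw("Horizontal");
        // The giant invisible hand: a horizontal nudge every tick.
        body.AddForce(new Vector2(input * pushForce, 0f));
    }
}
```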

Ah! Immediate enlightenment.

If the player was sliding down a wall, and I tried to move them into the wall, they would simply freeze in midair until I let go of the movement key. The trouble is that the physics sim works in terms of forces — moving the player involves giving them a nudge in some direction, like a giant invisible hand pushing them around the level. Surprise! If you press a real object against a real wall with your real hand, you’ll see the same effect — friction will cancel out gravity, and the object will stay in midair.

Platformer movement, as it turns out, doesn’t make any goddamn physical sense. What is air control? What are you pushing against? Nothing, really; we just have it because it’s nice to play with, because not having it is a nightmare.

I looked to see if there were any common solutions to this, and I only really found one: make all your walls frictionless.

Game development is full of hacks like this, and I… don’t like them. I can accept that minor hacks are necessary sometimes, but this one makes an early and widespread change to a fundamental system to “fix” something that was wrong in the first place. It also imposes an “invisible” requirement, something I try to avoid at all costs — if you forget to make a particular wall frictionless, you’ll never know unless you happen to try sliding down it.

And so, I swiftly returned to the existing code. It wasn’t too different from what I’d come up with for LÖVE: it applied gravity by hand, tracked the player’s velocity, computed the intended movement each frame, and moved by that amount. The interesting thing was that it used MovePosition, which schedules a movement for the next physics update and stops the movement if the player hits something solid.

It’s kind of a nice hybrid approach, actually; all the “physics” for conscious actors is done by hand, but the physics engine is still used for collision detection. It’s also used for collision rejection — if the player manages to wedge themselves several pixels into a solid object, for example, the physics engine will try to gently nudge them back out of it with no extra effort required on my part. I still haven’t figured out how to get that to work with my homegrown stuff, which is built to prevent overlap rather than to jiggle things out of it.
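
In outline, that hybrid approach looks something like this; it’s a sketch of the idea rather than the project’s actual code:

```csharp
using UnityEngine;

// Hybrid approach: gravity and velocity are tracked by hand, but
// MovePosition lets the physics engine handle collision for us.
public class HybridPlayerController : MonoBehaviour
{
    public float walkSpeed = 6f;
    Rigidbody2D body;   // dynamic, rotation locked, gravity scale 0
    Vector2 velocity;

    void Start()
    {
        body = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate()
    {
        // Apply gravity by hand, since the body's own gravity is off.
        velocity += Physics2D.gravity * Time.fixedDeltaTime;
        velocity.x = Input.GetAxisRaw("Horizontal") * walkSpeed;

        // Schedule the movement for the upcoming physics pass; if the
        // player hits something solid, the engine stops them there.
        body.MovePosition(body.position + velocity * Time.fixedDeltaTime);
    }
}
```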

But wait, what about…

Our player is a dynamic body with rotation lock and no gravity. Why not just use a kinematic body?

I must be missing something, because I do not understand the point of kinematic bodies. I ran into this with Godot, too, which documented them the same way: as intended for use as players and other manually-moved objects. But by default, they don’t even collide with other kinematic bodies or static geometry. What? There’s a checkbox to turn this on, which I enabled, but then I found out that MovePosition doesn’t stop kinematic bodies when they hit something, so I would’ve had to cast along the intended path of movement to figure out when to stop, thus duplicating the same work the physics engine was about to do.

But that’s impossible anyway! Static geometry generally wants to be made of edge colliders, right? They don’t care about concave/convex. Imagine the player is standing on the ground near a wall and tries to move towards the wall. Both the ground and the wall are different edges from the same edge collider.

If you try to cast the player’s hitbox horizontally, parallel to the ground, you’ll only get one collision: the existing collision with the ground. Casting doesn’t distinguish between touching and hitting. And because Unity only reports one collision per collider, and because the ground will always show up first, you will never find out about the impending wall collision.

So you’re forced to either use raycasts for collision detection or decomposed polygons for world geometry, both of which are slightly worse tools for no real gain.

I ended up sticking with a dynamic body.

Oh, one other thing that doesn’t really fit anywhere else: keep track of units! If you’re adding something called “velocity” directly to something called “position”, something has gone very wrong. Acceleration is distance per time squared; velocity is distance per time; position is distance. You must multiply or divide by time to convert between them.

I never even, say, add a constant directly to position every frame; I always phrase it as velocity and multiply by Δt. It keeps the units consistent: time is always in seconds, not in ticks.
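
As a concrete illustration (a hypothetical helper inside a MonoBehaviour, not code from the project), here is one tick of by-hand integration with the units kept straight:

```csharp
// One tick of by-hand integration.  Every multiplication by dt
// converts one unit to the next: units/s² → units/s → units.
void Integrate(ref Vector2 position, ref Vector2 velocity, Vector2 acceleration)
{
    float dt = Time.fixedDeltaTime;  // seconds, never "one frame"
    velocity += acceleration * dt;   // units/s² · s → units/s
    position += velocity * dt;       // units/s  · s → units
}
```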

Problem 2: Slopes

Ah, now we start to get off in the weeds.

A sort of pre-problem here was detecting whether we’re on a slope, which means detecting the ground. The codebase originally used a manual physics query of the area around the player’s feet to check for the ground, which seems to be somewhat common, but that can’t tell me the angle of the detected ground. (It’s also kind of error-prone, since “around the player’s feet” has to be specified by hand and may not stay correct through animations or changes in the hitbox.)

I replaced that with what I’d eventually settled on in LÖVE: detect the ground by detecting collisions, and looking at the normal of the collision. A normal is a vector that points straight out from a surface, so if you’re standing on the ground, the normal points straight up; if you’re on a 10° incline, the normal points 10° away from straight up.

Not all collisions are with the ground, of course, so I assumed something is ground if the normal pointed away from gravity. (I like this definition more than “points upwards”, because it avoids assuming anything about the direction of gravity, which leaves some interesting doors open for later on.) That’s easily detected by taking the dot product of the normal with the gravity vector — if it’s negative, the collision was with the ground, and I now have the normal of the ground.
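
The check itself is tiny. A minimal sketch, with a helper name of my own invention:

```csharp
// A surface counts as "ground" if its normal points away from gravity,
// i.e. the dot product of the two is negative.
bool IsGroundNormal(Vector2 normal)
{
    return Vector2.Dot(normal, Physics2D.gravity) < 0f;
}
```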

Actually doing this in practice was slightly tricky. With my LÖVE engine, I could cram this right into the middle of collision resolution. With Unity, not quite so much. I went through a couple iterations before I really grasped Unity’s execution order, which I guess I will have to briefly recap for this to make sense.

Unity essentially has two update cycles. It performs physics updates at fixed intervals for consistency, and updates everything else just before rendering. Within a single frame, Unity does as many fixed physics updates as it has spare time for (which might be zero, one, or more), then does a regular update, then renders. User code can implement either or both of Update, which runs during a regular update, and FixedUpdate, which runs just before Unity does a physics pass.

So my solution was:

  • At the very end of FixedUpdate, clear the actor’s “on ground” flag and ground normal.

  • During OnCollisionEnter2D and OnCollisionStay2D (which are called from within a physics pass), if there’s a collision that looks like it’s with the ground, set the “on ground” flag and ground normal. (If there are multiple ground collisions, well, good luck figuring out the best way to resolve that! At the moment I’m just taking the first and hoping for the best.)

That means there’s a brief window between the end of FixedUpdate and Unity’s physics pass during which a grounded actor might mistakenly believe it’s not on the ground, which is a bit of a shame, but there are very few good reasons for anything to be happening in that window.
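
Put together, the bookkeeping might look roughly like this, reusing the IsGroundNormal helper from above (the field names are mine):

```csharp
bool onGround;
Vector2 groundNormal = Vector2.up;

void FixedUpdate()
{
    // ... movement code goes here ...

    // Clear at the very end; Unity's physics pass runs next, and the
    // callbacks below re-set these if we're still touching ground.
    onGround = false;
    groundNormal = Vector2.up;
}

void OnCollisionEnter2D(Collision2D collision) { CheckForGround(collision); }
void OnCollisionStay2D(Collision2D collision)  { CheckForGround(collision); }

void CheckForGround(Collision2D collision)
{
    foreach (ContactPoint2D contact in collision.contacts)
    {
        if (IsGroundNormal(contact.normal))
        {
            // Multiple ground contacts?  Take the first and hope.
            onGround = true;
            groundNormal = contact.normal;
            return;
        }
    }
}
```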

Okay! Now we can do slopes.

Just kidding! First we have to do sliding.

When I first looked at this code, it didn’t apply gravity while the player was on the ground. I think I may have had some problems with detecting the ground as a result, since the player was no longer pushing down against it? Either way, it seemed like a silly special case, so I made gravity always apply.

Lo! I was a fool. The player could no longer move.

Why? Because MovePosition does exactly what it promises. If the player collides with something, they’ll stop moving. Applying gravity means that the player is trying to move diagonally downwards into the ground, and so MovePosition stops them immediately.

Hence, sliding. I don’t want the player to actually try to move into the ground. I want them to move along the unblocked part of that movement. For flat ground, that means the horizontal part, which is pretty much the same as discarding gravity. For sloped ground, it’s a bit more complicated!

Okay but actually it’s less complicated than you’d think. It can be done with some cross products fairly easily, but Unity makes it even easier with a couple casts. There’s a Vector3.ProjectOnPlane function that projects an arbitrary vector on a plane given by its normal — exactly the thing I want! So I apply that to the attempted movement before passing it along to MovePosition. I do the same thing with the current velocity, to prevent the player from accelerating infinitely downwards while standing on flat ground.

One other thing: I don’t actually use the detected ground normal for this. The player might be touching two ground surfaces at the same time, and I’d want to project on both of them. Instead, I use the player body’s GetContacts method, which returns contact points (and normals!) for everything the player is currently touching. I believe those contact points are tracked by the physics engine anyway, so asking for them doesn’t require any actual physics work.
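
A sketch of the slide, then, assuming the body field and helper from the earlier sketches (the buffer size is arbitrary; a real project would size it generously):

```csharp
ContactPoint2D[] contactBuffer = new ContactPoint2D[16];

// Project an attempted movement (or the current velocity) along every
// ground-like surface we're touching, so we never move into the ground.
Vector2 SlideAlongGround(Vector2 attempted)
{
    int count = body.GetContacts(contactBuffer);
    for (int i = 0; i < count; i++)
    {
        Vector2 normal = contactBuffer[i].normal;
        if (IsGroundNormal(normal))
        {
            // ProjectOnPlane is a Vector3 function; Unity's implicit
            // casts handle the 2D/3D conversion in both directions.
            attempted = Vector3.ProjectOnPlane(attempted, normal);
        }
    }
    return attempted;
}
```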

(Looking at the code I have, I notice that I still only perform the slide for surfaces facing upwards — but I’d want to slide against sloped ceilings, too. Why did I do this? Maybe I should remove that.)

(Also, I’m pretty sure projecting a vector onto two planes in sequence is non-commutative, which raises the question of which order the projections should happen in and what difference it makes. I don’t have a good answer.)

(I note that my LÖVE setup does something slightly different: it just tries whatever the movement ought to be, and if there’s a collision, then it projects — and tries again with the remaining movement. But I can’t ask Unity to do multiple moves in one physics update, alas.)

Okay! Now, slopes. But actually, with the above work done, slopes are most of the way there already.

One obvious problem is that the player tries to move horizontally even when on a slope, and the easy fix is to change their movement from speed * Vector2.right to speed * new Vector2(ground.y, -ground.x) while on the ground. That’s the ground normal rotated a quarter-turn clockwise, so for flat ground it still points to the right, and in general it points rightwards along the ground. (Note that it assumes the ground normal is a unit vector, but as far as I’m aware, that’s true for all the normals Unity gives you.)

Another issue is that if the player stands motionless on a slope, gravity will cause them to slowly slide down it — because the movement from gravity will be projected onto the slope, and unlike flat ground, the result is no longer zero. For conscious actors only, I counter this by adding the opposite factor to the player’s velocity as part of adding in their walking speed. This matches how the real world works, to some extent: when you’re standing on a hill, you’re exerting some small amount of effort just to stay in place.

(Note that slope resistance is not the same as friction. Okay, yes, in the real world, virtually all resistance to movement happens as a result of friction, but bracing yourself against the ground isn’t the same as being passively resisted.)
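
Roughly, the two tweaks combine like this; the names carry over from the earlier sketches, and the resistance term is my reading of the “opposite factor” described above:

```csharp
void ApplyWalking(float input)   // input in [-1, 1]
{
    if (!onGround)
    {
        velocity.x = input * walkSpeed;  // plain old air control
        return;
    }

    // The ground normal rotated a quarter-turn clockwise: still a unit
    // vector, now pointing rightwards along the ground.
    Vector2 along = new Vector2(groundNormal.y, -groundNormal.x);
    velocity = input * walkSpeed * along;

    // Cancel the downhill slide that projected gravity would otherwise
    // cause, so a motionless player stays put on a slope.
    velocity -= (Vector2)Vector3.ProjectOnPlane(
        Physics2D.gravity * Time.fixedDeltaTime, groundNormal);
}
```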

From here there are a lot of things you can do, depending on how you think slopes should be handled. You could make the player unable to walk up slopes that are too steep. You could make walking down a slope faster than walking up it. You could make jumping go along the ground normal, rather than straight up. You could raise the player’s max allowed speed while running downhill. Whatever you want, really. Armed with a normal and awareness of dot products, you can do whatever you want.

But first you might want to fix a few aggravating side effects.

Problem 3: Ground adherence

I don’t know if there’s a better name for this. I rarely even see anyone talk about it, which surprises me; it seems like it should be a very common problem.

The problem is: if the player runs up a slope which then abruptly changes to flat ground, their momentum will carry them into the air. For very fast players going off the top of very steep slopes, this makes sense, but it becomes visible even for relatively gentle slopes. It was a mild nightmare in the original release of our game Lunar Depot 38, which has very “rough” ground made up of lots of shallow slopes — so the player was very frequently slightly off the ground, which meant they couldn’t jump, for seemingly no reason. (I even had code to fix this, but I disabled it because of a silly visual side effect that I never got around to fixing.)

Anyway! The reason this is a problem is that game protagonists are generally not boxes sliding around — they have legs. We don’t go flying off the top of real-world hilltops because we put our foot down until it touches the ground.

Simulating this footfall is surprisingly fiddly to get right, especially with someone else’s physics engine. It’s made somewhat easier by Cast, which casts the entire hitbox — no matter what shape it is — in a particular direction, as if it had moved, and tells you all the hypothetical collisions in order.

So I cast the player in the direction of gravity by some distance. If the cast hits something solid with a ground-like collision normal, then the player must be close to the ground, and I move them down to touch it (and set that ground as the new ground normal).
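
Sketched out, skipping ahead to the Translate trick from wrinkle 3 below and leaving out wrinkle 1’s extra conditions:

```csharp
RaycastHit2D[] castHits = new RaycastHit2D[8];

// If ground is within reach below us, snap down to touch it.
void AdhereToGround(float maxSnapDistance)
{
    Vector2 down = Physics2D.gravity.normalized;
    int count = body.Cast(down, castHits, maxSnapDistance);
    for (int i = 0; i < count; i++)
    {
        if (IsGroundNormal(castHits[i].normal))
        {
            // MovePosition only *schedules* a move, so a second call
            // before the physics pass would be lost; Translate it is.
            transform.Translate(down * castHits[i].distance);
            onGround = true;
            groundNormal = castHits[i].normal;
            return;
        }
    }
}
```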

There are some wrinkles.

Wrinkle 1: I only want to do this if the player is off the ground now, but was on the ground last frame, and is not deliberately moving upwards. That latter condition means I want to skip this logic if the player jumps, for example, but also if the player is thrust upwards by a spring or abducted by a UFO or whatever. As long as external code goes through some interface and doesn’t mess with the player’s velocity directly, that shouldn’t be too hard to track.

Wrinkle 2: When does this logic run? It needs to happen after the player moves, which means after a Unity physics pass… but there’s no callback for that point in time. I ended up running it at the beginning of FixedUpdate and the beginning of Update — since I definitely want to do it before rendering happens! That means it’ll sometimes happen twice between physics updates. (I could carefully juggle a flag to skip the second run, but I… didn’t do that. Yet?)

Wrinkle 3: I can’t move the player with MovePosition! Remember, MovePosition schedules a movement, it doesn’t actually perform one; that means if it’s called twice before the physics pass, the first call is effectively ignored. I can’t easily combine the drop with the player’s regular movement, for various fiddly reasons. I ended up doing it “by hand” using transform.Translate, which I think was the “old way” to do manual movement before MovePosition existed. I’m not totally sure if it activates triggers? For that matter, I’m not sure it even notices collisions — but since I did a full-body Cast, there shouldn’t be any anyway.

Wrinkle 4: What, exactly, is “some distance”? I’ve yet to find a satisfying answer for this. It seems like it ought to be based on the player’s current speed and the slope of the ground they’re moving along, but every time I’ve done that math, I’ve gotten totally ludicrous answers that sometimes exceed the size of a tile. But maybe that’s not wrong? Play around, I guess, and think about when the effect should “break” and the player should go flying off the top of a hill.

Wrinkle 5: It’s possible that the player will launch off a slope, hit something, and then be adhered to the ground where they wouldn’t have hit it. I don’t much like this edge case, but I don’t see a way around it either.

This problem is surprisingly awkward for how simple it sounds, and the solution isn’t entirely satisfying. Oh, well; the results are much nicer than the solution. As an added bonus, this also fixes occasional problems with running down a hill and becoming detached from the ground due to precision issues or whathaveyou.

Problem 4: One-way platforms

Ah, what a nightmare.

It took me ages just to figure out how to define one-way platforms. Only block when the player is moving downwards? Nope. Only block when the player is above the platform? Nuh-uh.

Well, okay, yes, those approaches might work for convex players and flat platforms. But what about… sloped, one-way platforms? There’s no reason you shouldn’t be able to have those. If Super Mario World can do it, surely Unity can do it almost 30 years later.

The trick is, again, to look at the collision normal. If it faces away from gravity, the player is hitting a ground-like surface, so the platform should block them. Otherwise (or if the player overlaps the platform), it shouldn’t.

Here’s the catch: Unity doesn’t have conditional collision. I can’t decide, on the fly, whether a collision should block or not. In fact, I think that by the time I get a callback like OnCollisionEnter2D, the physics pass is already over.

I could go the other way and use triggers (which are non-blocking), but then I have the opposite problem: I can’t stop the player on the fly. I could move them back to where they hit the trigger, but I envision all kinds of problems as a result. What if they were moving fast enough to activate something on the other side of the platform? What if something else moved to where I’m trying to shove them back to in the meantime? How does this interact with ground detection and listing contacts, which would rightly ignore a trigger as non-blocking?

I beat my head against this for a while, but the inability to respond to collision conditionally was a huge roadblock. It’s all the more infuriating a problem, because Unity ships with a one-way platform modifier thing. Unfortunately, it seems to have been implemented by someone who has never played a platformer. It’s literally one-way — the player is only allowed to move straight upwards through it, not in from the sides. It also tries to block the player if they’re moving downwards while inside the platform, which invokes clumsy rejection behavior. And this all seems to be built into the physics engine itself somehow, so I can’t simply copy whatever they did.

Eventually, I settled on the following. After calculating attempted movement (including sliding), just at the end of FixedUpdate, I do a Cast along the movement vector. I’m not thrilled about having to duplicate the physics engine’s own work, but I do filter to only things on a “one-way platform” physics layer, which should at least help. For each object the cast hits, I use Physics2D.IgnoreCollision to either ignore or un-ignore the collision between the player and the platform, depending on whether the collision was ground-like or not.

(A lot of people suggested turning off collision between layers, but that can’t possibly work — the player might be standing on one platform while inside another, and anyway, this should work for all actors!)

Again, wrinkles! But fewer this time. Actually, maybe just one: handling the case where the player already overlaps the platform. I can’t just check for that with e.g. OverlapCollider, because that doesn’t distinguish between overlapping and merely touching.

I came up with a fairly simple fix: if I was going to un-ignore the collision (i.e. make the platform block), and the cast distance is reported as zero (either already touching or overlapping), I simply do nothing instead. If I’m standing on the platform, I must have already set it blocking when I was approaching it from the top anyway; if I’m overlapping it, I must have already set it non-blocking to get here in the first place.
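
All together, a speculative sketch of the one-way pass; the layer mask and collider fields are mine, and this would run at the end of FixedUpdate after the slide:

```csharp
public LayerMask oneWayLayers;   // the "one-way platform" physics layer
Collider2D playerCollider;       // assigned in Start(), e.g. GetComponent<Collider2D>()
RaycastHit2D[] platformHits = new RaycastHit2D[8];

void UpdateOneWayPlatforms(Vector2 attemptedMovement)
{
    ContactFilter2D filter = new ContactFilter2D();
    filter.SetLayerMask(oneWayLayers);

    int count = body.Cast(attemptedMovement.normalized, filter,
                          platformHits, attemptedMovement.magnitude);
    for (int i = 0; i < count; i++)
    {
        RaycastHit2D hit = platformHits[i];
        bool shouldBlock = IsGroundNormal(hit.normal);

        // Zero distance means we're touching or overlapping already;
        // whatever state we set on approach is still the right one.
        if (shouldBlock && hit.distance == 0f)
            continue;

        Physics2D.IgnoreCollision(playerCollider, hit.collider, !shouldBlock);
    }
}
```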

I can imagine a few cases where this might go wrong. Moving platforms, especially, are going to cause some interesting issues. But this is the best I can do with what I know, and it seems to work well enough so far.

Oh, and our player can deliberately drop down through platforms, which was easy enough to implement; I just decide the platform is always passable while some button is held down.

Problem 5: Pushers and carriers

I haven’t gotten to this yet! Oh boy, can’t wait. I implemented it in LÖVE, but my way was hilariously invasive; I’m hoping that having a physics engine that supports a handwaved “this pushes that” will help. Of course, you also have to worry about sticking to platforms, for which the recommended solution is apparently to parent the cargo to the platform, which sounds goofy to me? I guess I’ll find out when I throw myself at it later.

Overall result

I ended up with a fairly pleasant-feeling system that supports slopes and one-way platforms and whatnot, with all the same pieces as I came up with for LÖVE. The code somehow ended up as less of a mess, too, but it probably helps that I’ve been down this rabbit hole once before and kinda knew what I was aiming for this time.

Animation of a character running smoothly along the top of an irregular dinosaur skeleton

Sorry that I don’t have a big block of code for you to copy-paste into your project. I don’t think there are nearly enough narrative discussions of these fundamentals, though, so hopefully this is useful to someone. If not, well, look forward to ✨ my book, that I am writing ✨!

Make your own game with CoderDojo’s new book

Post Syndicated from Nuala McHale original https://www.raspberrypi.org/blog/coderdojo-nano/

The first official CoderDojo book, CoderDojo Nano: Build Your Own Website, was a resounding success: thousands of copies have been bought by aspiring CoderDojo Ninjas, and it’s available in ten languages, including Bulgarian, Czech, Dutch, Lithuanian, Latvian, Portuguese, Spanish, and Slovakian. Now we are delighted to announce the release of the second book in our Create with Code trilogy, titled CoderDojo Nano: Make Your Own Game.

Cover of CoderDojo Nano Make your own game

The paperback book will be available in English from Thursday 7 September (with English flexibound and Dutch versions scheduled to follow in the coming months), enabling young people and adults to learn creative and fun coding skills!

What will you learn?

The new book explains the fundamentals of the JavaScript language in a clear, logical way while helping you create your very own computer game.

Pixel image of laptop displaying a jump-and-run game

You will learn how to animate characters, create a world for your game, and use the physics of movement within it. The book is full of clear step-by-step instructions and illustrated screenshots to make reviewing your code easy. Additionally, challenges and open-ended prompts at the end of each section will encourage you to get creative while making your game.

This book is the perfect first step towards understanding game development, particularly for those of you who do not (yet) have a local Dojo. Regardless of where you live, using our books you too can learn to ‘Create with Code’!

Tried and tested

As always, CoderDojo Ninjas from all around the world tested our book, and their reactions have been hugely positive. Here is a selection of their thoughts:

“The book is brilliant. The [game] is simple yet innovative. I personally love it, and want to get stuck in making it right away!”

“What I really like is that, unlike most books on coding, this one properly explains what’s happening, and what each piece of code does and where it comes from.”

“I found the book most enjoyable. The layout is great, with lots of colour, and I found the information very easy to follow. The Ninja Tips are a great help in case you get a bit stuck. I liked that the book represents a mix of boy and girl Ninjas — it really makes coding fun for all.”

“The book is a great guide for both beginners and people who want to do something creative with their knowledge of code. Even people who cannot go to a CoderDojo can learn code using this book!”

Writer Jurie Horneman

Jurie Horneman, author of CoderDojo Nano: Make Your Own Game, has been working in the game development industry for more than 15 years.

stuffed toy rabbit wearing glasses

Jurie would get on well with Babbage, I think.

He shares how he got into coding, and what he has learnt while creating this awesome book:

“I’ve been designing and programming games since 1991, starting with ancient home computers, and now I’m working with PCs and consoles. As a game designer, it’s my job to teach players the rules of the game in a fun and playful manner — that gave me some useful experience for writing the book.

I believe that, if you want to understand something properly, you have to teach it to others. Therefore, writing this book was very educational for me, as I hope reading it will be for learners.”

Asked what his favourite thing about the book is, Jurie said he loves the incredible pixel art design: “The artist (Gary J Lucken, Army of Trolls) did a great job to help explain some of the abstract concepts in the book.”

Pixel image of a landscape with an East Asian temple on a lonely mountain

Gary’s art is also just gorgeous.

How can you get your copy?

You can pre-order CoderDojo Nano: Make Your Own Game here. Its initial pricing is £9.99 (around €11), and discounted copies with free international delivery are available here.

The post Make your own game with CoderDojo’s new book appeared first on Raspberry Pi.

Weekly roundup: Let’s get physical

Post Syndicated from Eevee original https://eev.ee/dev/2017/08/29/weekly-roundup-lets-get-physical/

  • cc: I did a lot of physics and a lot of tearing my hair out. I changed ground detection to be based on collision, which opened the door to making slopes work, and clumsily hacked sliding into working. Then I wasted two days banging my head against a wall and getting nowhere.

    (But spoilers for next week: I did get slopes working perfectly on Sunday.)

  • fox flux: Focusing just on player sprites since the game is fundamentally not playable without them. I can finally see the light at the end of the tunnel; almost all of the walking sequences are done, and about half of the forms are more or less done. So, like, 60% of the way there maybe?

  • veekun: Loaded and fixed a bunch of little things that were missing, and now the site is basically functional. Had to fix some more form ordering problems, little obscure connections we forget about with every game, and some issues with evolutions. But I do have a website, and that’s nice. Ideally I’ll have something worth publishing by the end of the month.

And, hm, that’s all. Looks like most of my week went to CC physics, which has me in a mood to work on my book again… horf, so much to do.

Pi-powered hands-on statistical model at the Royal Society

Post Syndicated from Janina Ander original https://www.raspberrypi.org/blog/royal-society-galton-board/

Physics! Particles! Statistical modelling! Quantum theory! How can non-scientists understand any of it? Well, students from Durham University are here to help you wrap your head around it all – and to our delight, they’re using the power of the Raspberry Pi to do it!

At the Royal Society’s Summer Science Exhibition, taking place in London from 4-9 July, the students are presenting a Pi-based experiment demonstrating the importance of statistics in their field of research.

Modelling the invisible – Summer Science Exhibition 2017

The Royal Society Summer Science Exhibition 2017 features 22 exhibits of cutting-edge, hands-on UK science, along with special events and talks. You can meet the scientists behind the research. Find out more about the exhibition at our website: https://royalsociety.org/science-events-and-lectures/2017/summer-science-exhibition/

Ramona, Matthew, and their colleagues are particle physicists keen to bring their science to those of us whose heads start to hurt as soon as we hear the word ‘subatomic’. In their work, they create computer models of subatomic particles to make predictions about real-world particles. Their models help scientists to design better experiments and to improve sensor calibrations. If this doesn’t sound straightforward to you, never fear – this group of scientists has set out to show exactly how statistical models are useful.

The Galton board model

They’ve built a Pi-powered Galton board, also called a bean machine (much less intimidating, I think). This is an upright board, shaped like an upside-down funnel, with nails hammered into it. Drop a ball in at the top, and it will randomly bounce off the nails on its way down. How the nails are spread out determines where a ball is most likely to land at the bottom of the board.

If you’re having trouble picturing this, you can try out an online Galton board. Go ahead, I’ll wait.

You’re back? All clear? Great!

Now, if you drop 100 balls down the board and collect them at the bottom, the result might look something like this:

Galton board

By Antoine Taveneaux CC BY-SA 3.0

The distribution of the balls is determined by the locations of the nails in the board. This means that, if you don’t know where the nails are, you can look at the distribution of balls to figure out where the nails are most likely to be located. And you’ll be able to do all this using … statistics!!!
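
For the mathematically curious: on an idealised board with n rows of nails, where each bounce sends the ball left or right with equal probability, the number of rightward bounces k (and hence the bin the ball lands in) follows a binomial distribution:

$$P(k) = \binom{n}{k}\left(\frac{1}{2}\right)^{n}$$

For a large number of balls, that binomial shape is why the piles approximate the familiar bell curve.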

Statistical models

Similarly, how particles behave is determined by the laws of physics – think of the particles as balls, and laws of physics as nails. Physicists can observe the behaviour of particles to learn about laws of physics, and create statistical models simulating the laws of physics to predict the behaviour of particles.

I can hear you say, “Alright, thanks for the info, but how does the Raspberry Pi come into this?” Don’t worry – I’m getting to that.

Modelling the invisible – the interactive exhibit

As I said, Ramona and the other physicists have not created a regular old Galton board. Instead, this one records where the balls land using a Raspberry Pi, and other portable Pis around the exhibition space can access the records of the experimental results. These Pis in turn run Galton board simulators, and visitors can use them to recreate a virtual Galton board that produces the same results as the physical one. Then, they can check whether their model board does, in fact, look like the one the physicists built. In this way, people directly experience the relationship between statistical models and experimental results.

Hurrah for science!

The other exhibit the Durham students will be showing is a demo dark matter detector! So if you decide to visit the Summer Science Exhibition, you will also have the chance to learn about the very boundaries of human understanding of the cosmos.

The Pi in museums

At the Raspberry Pi Foundation, education is our mission, and of course we love museums. It is always a pleasure to see our computers incorporated into exhibits: the Pi-powered visual theremin teaches visitors about music; the Museum in a Box uses Pis to engage people in hands-on encounters with exhibits; and this Pi is itself a museum piece! If you want to learn more about Raspberry Pis and museums, you can listen to this interview with Pi Towers’ social media maestro Alex Bate.

It’s amazing that our tech is used to educate people in areas beyond computer science. If you’ve created a Pi-powered educational project, please share it with us in the comments.

The post Pi-powered hands-on statistical model at the Royal Society appeared first on Raspberry Pi.

Weekly roundup: Flux capacity

Post Syndicated from Eevee original https://eev.ee/dev/2017/06/06/weekly-roundup-flux-capacity/

  • blog: I wrote about introspection, or more specifically, what it’s been like to gradually shift from “pure tech” to more creative work.

  • fox flux: I am trapped in animation hell, but making progress. I made a pretty cool new parallax background. Also took a short break from drawing to do some physics, and now I have critters that can stop at ledges.

  • idchoppers: I wrote a quick thing that counts texture usage in maps, by both count and area.

    Honestly, I’ve been avoiding idchoppers because getting the API right is going to be a good bit of tedious work, so if it’s going to go anywhere then I’ll need to devote some serious time to it. Not sure whether I will or not. Little cute tools like a texture count make for useful example cases for an API, though, so I guess this helps.

  • flora: I did some secret work for a not-yet-released thing. It is very good.

  • art: I drew a June avatar which I am super happy with!

Mostly I’ve been doing a lot of pixel art, and I should maybe take a break from it for a little while, because I have a lot of stuff to do this month. Stay tuned.


Introspection

Post Syndicated from Eevee original https://eev.ee/blog/2017/05/28/introspection/

This month, IndustrialRobot has generously donated in order to ask:

How do you go about learning about yourself? Has your view of yourself changed recently? How did you handle it?

Whoof. That’s incredibly abstract and open-ended — there’s a lot I could say, but most of it is hard to turn into words.

The first example to come to mind — and the most conspicuous, at least from where I’m sitting — has been the transition from technical to creative since quitting my tech job. I think I touched on this a year ago, but it’s become all the more pronounced since then.

I quit in part because I wanted more time to work on my own projects. Two years ago, those projects included such things as: giving the Python ecosystem a better imaging library, designing an alternative to regular expressions, building a Very Correct IRC bot framework, and a few more things along similar lines. The goals were all to solve problems — not hugely important ones, but mildly inconvenient ones that I thought I could bring something novel to. Problem-solving for its own sake.

Now that I had all the time in the world to work on these things, I… didn’t. It turned out they were almost as much of a slog as my job had been!

The problem, I think, was that there was no point.

This was really weird to realize and come to terms with. I do like solving problems for its own sake; it’s interesting and educational. And most of the programming folks I know and surround myself with have that same drive and use it to create interesting tools like Twisted. So besides taking for granted that this was the kind of stuff I wanted to do, it seemed like the kind of stuff I should want to do.

But even if I create a really interesting tool, what do I have? I don’t have a thing; I have a tool that can be used to build things. If I want a thing, I have to either now build it myself — starting from nearly zero despite all the work on the tool, because it can only do so much in isolation — or convince a bunch of other people to use my tool to build things. Then they’d be depending on my tool, which means I have to maintain and support it, which is even more time and effort poured into this non-thing.

Despite frequently being drawn to think about solving abstract tooling problems, it seems I truly want to make things. This is probably why I have a lot of abandoned projects boldly described as “let’s solve X problem forever!” — I go to scratch the itch, I do just enough work that it doesn’t itch any more, and then I lose interest.

I spent a few months quietly flailing over this minor existential crisis. I’d spent years daydreaming about making tools; what did I have if not that drive? I was having to force myself to work on what I thought were my passion projects.

Meanwhile, I’d vaguely intended to do some game development, but for some reason dragged my feet forever and then took my sweet time dipping my toes in the water. I did work on a text adventure, Runed Awakening, on and off… but it was a fractal of creative decisions and I had a hard time making all of them. It might’ve been too ambitious, despite feeling small, and that might’ve discouraged me from pursuing other kinds of games earlier.

A big part of it might have been the same reason I took so long to even give art a serious try. I thought of myself as a technical person, and art is a thing for creative people, so I’m simply disqualified, right? Maybe the same thing applies to games.

Lord knows I had enough trouble when I tried. I’d orbited the Doom community for years but never released a single finished level. I did finally give it a shot again, now that I had the time. Six months into my funemployment, I wrote a three-part guide on making Doom levels. Three months after that, I finally released one of my own.

I suppose that opened the floodgates; a couple weeks later, glip and I decided to try making something for the PICO-8, and then we did that (almost exactly a year ago!). Then kept doing it.

It’s been incredibly rewarding — far more so than any “pure” tooling problem I’ve ever approached. More so than even something like veekun, which is a useful thing. People have thoughts and opinions on games. Games give people feelings, which they then tell you about. Most of the commentary on a reference website is that something is missing or incorrect.

I like doing creative work. There was never a singular moment when this dawned on me; it was a slow process over the course of a year or more. I probably should’ve had an inkling when I started drawing, half a year before I quit; even my early (and very rough) daily comics made people laugh, and I liked that a lot. Even the most well-crafted software doesn’t tend to bring joy to people, but amateur art can.

I still like doing technical work, but I prefer when it’s a means to a creative end. And, just as important, I prefer when it has a clear and constrained scope. “Make a library/tool for X” is a nebulous problem that could go in a great many directions; “make a bot that tweets Perlin noise” has a pretty definitive finish line. It was interesting to write a little physics engine, but I would’ve hated doing it if it weren’t for a game I were making and didn’t have the clear scope of “do what I need for this game”.

It feels like creative work is something I’ve been wanting to do for a long time. If this were a made-for-TV movie, I would’ve discovered this impulse one day and immediately revealed myself as a natural-born artistic genius of immense unrealized talent.

That didn’t happen. Instead I’ve found that even something as mundane as having ideas is a skill, and while it’s one I enjoy, I’ve barely ever exercised it at all. I have plenty of ideas with technical work, but I run into brick walls all the time with creative stuff.

How do I theme this area? Well, I don’t know. How do I think of something? I don’t know that either. It’s a strange paradox to have an urge to create things but not quite know what those things are.

It’s such a new and completely different kind of problem. There’s no right answer, or even an answer I can check for “correctness”. I can do anything. With no landmarks to start from, it’s easy to feel completely lost and just draw blanks.

I’ve essentially recalibrated the texture of stuff I work on, and I have to find some completely new ways to approach problems. I haven’t found them yet. I don’t think they’re anything that can be told or taught. But I’m starting to get there, and part of it is just accepting that I can’t treat these like problems with clear best solutions and clear algorithms to find those solutions.

A particularly glaring irony is that I’ve had a really tough problem designing abstract spaces, even though that’s exactly the kind of architecture I praise in Doom. It’s much trickier than it looks — a good abstract design is reminiscent of something without quite being that something.

I suppose it’s similar to a struggle I’ve had with art. I’m drawn to a cartoony style, and cartooning is also a mild form of abstraction, of whittling away details to leave only what’s most important. I’m reminded in particular of the forest background in fox flux — I was completely lost on how to make something reminiscent of a tree line. I knew enough to know that drawing trees would’ve made the background far too busy, but trees are naturally busy, so how do you represent that?

The answer glip gave me was to make big chunky leaf shapes around the edges and where light levels change. Merely overlapping those shapes implies depth well enough to convey the overall shape of the tree. The result works very well and looks very simple — yet it took a lot of effort just to get to the idea.

It reminds me of mathematical research, in a way? You know the general outcome you want, and you know the tools at your disposal, and it’s up to you to make some creative leaps. I don’t think there’s a way to directly learn how to approach that kind of problem; all you can do is look at what others have done and let it fuel your imagination.

I think I’m getting a little distracted here, but this is stuff that’s been rattling around lately.

If there’s a more personal meaning to the tree story, it’s that this is a thing I can do. I can learn it, and it makes sense to me, despite being a huge nerd.

Two and a half years ago, I never would’ve thought I’d ever make an entire game from scratch and do all the art for it. It was completely unfathomable. Maybe we can do a lot of things we don’t expect we’re capable of, if only we give them a serious shot.

And ask for help, of course. I have a hell of a time doing that. I did a painting recently that factored in mountains of glip’s advice, and on some level I feel like I didn’t quite do it myself, even though every stroke was made by my hand. Hell, I don’t even look at references nearly as much as I should. It feels like cheating, somehow? I know that’s ridiculous, but my natural impulse is to put my head down and figure it out myself. Maybe I’ve been doing that for too long with programming. Trust me, it doesn’t work quite so well in a brand new field.

I’m getting distracted again!

To answer your actual questions: how do I go about learning about myself? I don’t! It happens completely by accident. I’ll consciously examine my surface-level thoughts or behaviors or whatever, sure, but the serious fundamental revelations have all caught me completely by surprise — sometimes slowly, sometimes suddenly.

Most of them also came from listening to the people who observe me from the outside: I only started drawing in the first place because of some ridiculous deal I made with glip. At the time I thought they just wanted everyone to draw because art is their thing, but now I’m starting to suspect they’d caught on after eight years of watching me lament that I couldn’t draw.

I don’t know how I handle such discoveries, either. What is handling? I imagine someone discovering something and trying to come to grips with it, but I don’t know that I have quite that experience — my grappling usually comes earlier, when I’m still trying to figure the thing out despite not knowing that there’s a thing to find out. Once I know it, it’s on the table; I can’t un-know it or reject it meaningfully. All I can do is figure out what to do with it, and I approach that the same way I approach every other problem: by flailing at it and hoping for the best.

This isn’t quite 2000 words. Sorry. I’ve run out of things to say about me. This paragraph is very conspicuous filler. Banana. Atmosphere. Vocation.