Tag Archives: game development

Improving and securing your game-binaries distribution at scale

Post Syndicated from Ignacio Riesgo original https://aws.amazon.com/blogs/compute/improving-and-securing-your-game-binaries-distribution-at-scale/

This post is contributed by Yahav Biran | Sr. Solutions Architect, AWS and Scott Selinger | Associate Solutions Architect, AWS 

Continuous integration and continuous deployment (CI/CD) processes enable game publishers to improve games throughout their lifecycle. One of the challenges publishers face when employing CI/CD is distributing updated game binaries in a scalable, secure, and cost-effective way.

Often, CI/CD jobs contain minor changes that cause the CI/CD processes to push a full set of game binaries over the internet. This is a suboptimal approach. It negatively affects the cost of development network resources, customer network resources (output and input bandwidth), and the time it takes for a game update to propagate.

This post proposes a method of optimizing game integration and deployments. Specifically, this method improves the distribution of updated game binaries to various targets, such as game-server farms. The proposed mechanism also strengthens the security model, which is designed in progressive layers starting from the Amazon EC2 instance that runs the game server. It also improves the security of the game binaries and game assets, as well as the monitoring of game-server deployments across several AWS Regions.

Why CI/CD in gaming is hard today

A game server is usually a native application whose binaries cover graphics, sound, networking, and physics, alongside scripts and media files. Game servers are usually developed with game engines like Unreal, Amazon Lumberyard, and Unity. Game binaries typically take up tens of gigabytes, yet game developer teams modify only a few tens of kilobytes every day, so frequent distribution of the full set of binaries is wasteful.

For a standard global game deployment, distributing game binaries requires compressing the entire binaries set and transferring the compressed version to destinations, then decompressing it upon arrival. You can optimize the process by decoupling the various layers, pushing and deploying them individually.

In both cases, the continuous deployment process might be slow due to the compression and transfer durations. Also, distributing the image binaries incurs unnecessary data transfer costs, since data is duplicated. Other game-binary distribution methods may require the game publisher’s DevOps teams to install and maintain custom caching mechanisms.

This post demonstrates an optimal method for distributing game server updates. The solution uses containerized images stored in Amazon ECR and deployed using Amazon ECS or Amazon EKS to shorten the distribution duration and reduce network usage.

How can containers help?

Dockerized game binaries enable standard Docker layer caching with no custom implementation work from the game publisher. They allow game publishers to stage their continuous build process in two steps:

  • Rebuild only the layer that was updated in a particular build, reusing the other cached layers.
  • Reassemble the cached and rebuilt layers into a deployable game server.

The use of ECR with either ECS or EKS takes care of the last mile deployment to the Docker container host.

Larger application binaries mean longer application loading times. To reduce the overall application initialization time, I decouple the deployment of the binaries and media files so the application can update faster. For example, updates to the application media files do not require replication of the engine binaries or engine media. This is achievable if the application binaries can be deployed in a separate directory structure. For example:

/opt/local/engine

/opt/local/engine-media

/opt/local/app

/opt/local/app-media

Containerized game-server deployment on EKS

The application server can be deployed as a single Kubernetes pod with multiple containers. The engine media (/opt/local/engine-media), the application (/opt/local/app), and the application media (/opt/local/app-media) spawn as Kubernetes initContainers and the engine binary (/opt/local/engine) runs as the main container.

apiVersion: v1
kind: Pod
metadata:
  name: my-game-app-pod
  labels:
    app: my-game-app
spec:
  volumes:
    - name: engine-media-volume
      emptyDir: {}
    - name: app-volume
      emptyDir: {}
    - name: app-media-volume
      emptyDir: {}
  # each init container copies its image's content into the shared volume
  initContainers:
    - name: engine-media
      image: the-engine-media-image
      imagePullPolicy: Always
      command:
        - "sh"
        - "-c"
        - "cp /* /opt/local/engine-media"
      volumeMounts:
        - name: engine-media-volume
          mountPath: /opt/local/engine-media
    - name: app
      image: the-app-image
      imagePullPolicy: Always
      command:
        - "sh"
        - "-c"
        - "cp /* /opt/local/app"
      volumeMounts:
        - name: app-volume
          mountPath: /opt/local/app
    - name: app-media
      image: the-app-media-image
      imagePullPolicy: Always
      command:
        - "sh"
        - "-c"
        - "cp /* /opt/local/app-media"
      volumeMounts:
        - name: app-media-volume
          mountPath: /opt/local/app-media
  # the engine runs as the main container, mounting all three volumes
  containers:
    - name: the-engine
      image: the-engine-image
      imagePullPolicy: Always
      volumeMounts:
        - name: engine-media-volume
          mountPath: /opt/local/engine-media
        - name: app-volume
          mountPath: /opt/local/app
        - name: app-media-volume
          mountPath: /opt/local/app-media
      command: ['sh', '-c', '/opt/local/engine/start.sh']

Applying multi-stage game binaries builds

In this post, I use Docker multi-stage builds for containerizing the game asset builds. I use AWS CodeBuild to manage the build and to deploy the updates of game engines like Amazon Lumberyard as ready-to-play dedicated game servers.

Using this method, frequent changes in the game binaries require less than 1% of the data transfer typically required by full image replication to the nodes that run the game-server instances. This results in significant improvements in build and integration time.

I provide a deployment example for the Amazon Lumberyard Multiplayer Sample deployed to an EKS cluster, but the same approach works with other container orchestration technologies and game engines. I also show that the image deployed as a game-server instance is always the latest image, which allows centralized control over the code that gets scheduled for distribution.

This example shows an update of only 50 MB of game assets against a full game-server binary of 3.1 GB. With only about 1.5% of the content being updated, the build process is roughly 90% faster than with non-containerized game binaries.

For security with EKS, apply the imagePullPolicy: Always option, following Kubernetes best practices for container image deployment. This option ensures that the latest image is pulled every time the pod is started, so images are always deployed from a single source, in this case ECR.

Example setup

  • Read through the multiplayer game sample to see how to build and structure multiplayer games that employ the various features of the GridMate networking library.
  • Create an AWS CodeCommit or GitHub repository (multiplayersample-lmbr) that includes the game engine binaries, the game assets (.pak, .cfg and more), AWS CodeBuild specs, and EKS deployment specs.
  • Create a CodeBuild project that points to the CodeCommit repo. The build uses aws/codebuild/docker:18.09.0, the built-in image maintained by CodeBuild, configured with 3 GB of memory and two vCPUs. The compute allocated for the build can be adjusted to trade cost against build time.
  • Create an EKS cluster designated as a staging or an integration environment for the game title. In this case, it’s multiplayersample.

The binaries build Git repository

The Git repository is composed of five core components ordered by their size:

  • The game engine binaries (for example, BinLinux64.Dedicated.tar.gz). This is the compressed version of the game engine artifacts that are not updated regularly, hence they are deployed as a compressed file. The maintenance of this file is usually done by a different team than the developers working on the game title.
  • The game binaries (for example, MultiplayerSample_pc_Paks_Dedicated). This directory is maintained by the game development team and managed as a standard multi-branch repository. The artifacts under this directory get updated on a daily or weekly basis, depending on the game development plan.
  • The build-related specifications (for example, buildspec.yml and Dockerfile). These files specify the build process. For simplicity, I only included the Docker build process to convey the speed of continuous integration. The process can easily be extended to include the game compilation and linking process as well.
  • The Docker artifacts for containerizing the game engine and the game binaries (for example, start.sh and start.py). These scripts usually are maintained by the game DevOps teams and updated outside of the regular game development plan. More details about these scripts can be found in a sample that describes how to deploy a game-server in Amazon EKS.
  • The deployment specifications (for example, eks-spec) specify the Kubernetes game-server deployment specs. This is for reference only, since the CD process usually runs in a separate set of resources like staging EKS clusters, which are owned and maintained by a different team.

The game build process

The build process starts with any Git push event on the Git repository. It includes three core phases, denoted pre_build, build, and post_build in multiplayersample-lmbr/buildspec.yml.

  1. The pre_build phase unzips the game-engine binaries and logs in to the container registry (Amazon ECR) in preparation for the image build and push.
  2. The build phase executes the docker build command that runs the multi-stage build. The Dockerfile describes the multi-stage image build process. It starts by adding the game-engine binaries to the Linux OS, ubuntu:18.04 in this example:

     FROM ubuntu:18.04
     ADD BinLinux64.Dedicated.tar /

     It continues by adding the necessary packages to the game server (for example, ec2-metadata, boto3, libc, and Python) and the necessary scripts for controlling the game-server runtime in EKS. These packages are only required for the CI/CD process, so they are only added there. This enables a clean decoupling between the packages needed for development, integration, and deployment, and simplifies the process for both teams.

     RUN apt-get install -y python python-pip
     RUN apt-get install -y net-tools vim
     RUN apt-get install -y libc++-dev
     RUN pip install mcstatus ec2-metadata boto3
     ADD start.sh /start.sh
     ADD start.py /start.py

     The second part copies the game engine from the previous stage (--from=0) into the next build stage. In this case, you copy the game-engine binaries with two COPY Docker directives:

     COPY --from=0 /BinLinux64.Dedicated/* /BinLinux64.Dedicated/
     COPY --from=0 /BinLinux64.Dedicated/qtlibs /BinLinux64.Dedicated/qtlibs/

     Finally, the game binaries are added as a separate layer on top of the game-engine layers, which concludes the build. Constant daily changes are expected in this layer, which is why it is packaged separately; if your game includes other abstractions, you can break this step into several discrete Docker image layers. (A consolidated Dockerfile sketch follows the post_build commands below.)

     ADD MultiplayerSample_pc_Paks_Dedicated /BinLinux64.Dedicated/
  3. The post_build phase pushes the game Docker image to the centralized container registry for further deployment to the various regional EKS clusters. In this phase, tag and push the new image to the designated container registry in ECR.

- docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG

- docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
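Putting the fragments above together, the complete multi-stage Dockerfile might look like the following sketch. The stage layout, packages, and paths follow the snippets quoted above; anything beyond them (the apt-get update step and the comments) is illustrative rather than a verbatim copy of the repository's file.

# Stage 0: unpack the game-engine binaries onto the base OS
FROM ubuntu:18.04
ADD BinLinux64.Dedicated.tar /

# Stage 1: the deployable game-server image
FROM ubuntu:18.04
RUN apt-get update && \
    apt-get install -y python python-pip net-tools vim libc++-dev
RUN pip install mcstatus ec2-metadata boto3
ADD start.sh /start.sh
ADD start.py /start.py

# Copy the unpacked engine from stage 0
COPY --from=0 /BinLinux64.Dedicated/* /BinLinux64.Dedicated/
COPY --from=0 /BinLinux64.Dedicated/qtlibs /BinLinux64.Dedicated/qtlibs/

# The frequently updated game binaries go in last, as their own layer
ADD MultiplayerSample_pc_Paks_Dedicated /BinLinux64.Dedicated/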

The game deployment process in EKS

At this point, you’ve pushed the updated image to the designated container registry in ECR ($AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG). This image is scheduled as a game server in an EKS cluster via a game-server Kubernetes deployment, as described in the sample.

In this example, I use imagePullPolicy: Always.


containers:
…
        image: $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/multiplayersample-build:$IMAGE_TAG
        imagePullPolicy: Always
        name: multiplayersample
…

By using imagePullPolicy: Always, you ensure that no one can circumvent Amazon ECR security, and you can securely make ECR the single source of truth with regard to scheduled binaries. However, this setting means that every pod start pulls the image from ECR to the worker nodes via kubelet, the node agent. Given the size of a whole image, combined with the frequency with which it is pulled, that would amount to a significant additional cost to your project.

Fortunately, Docker layers allow you to update only the layers that were modified, preventing a whole-image transfer while still enabling secure image distribution. In this example, only the layer MultiplayerSample_pc_Paks_Dedicated is updated.

Proposed CI/CD process

The following diagram shows an example end-to-end architecture of a full-scale game-server deployment using EKS as the orchestration system, ECR as the container registry, and CodeBuild as the build engine.

Game developers merge changes to the Git repository that include both the preconfigured game-engine binaries and the game artifacts. Upon merge events, CodeBuild builds a multistage game-server image that is pushed to a centralized container registry hosted by ECR. At this point, DevOps teams in different Regions continuously schedule the image as a game server, pulling only the updated layer in the game server image. This keeps the entire game-server fleet running the same game binaries set, making for a secure deployment.

 

Try it out

I published two examples to guide you through the process of building an Amazon EKS cluster and deploying a containerized game server with large binaries.

Conclusion

Adopting CI/CD in game development improves the software development lifecycle by continuously deploying updated, quality-assured game binaries. CI/CD in game development is usually hindered by the cost of distributing large binaries, particularly for cross-regional deployments.

Non-containerized paradigms require deployment of the full set of binaries, which is an expensive and time-consuming task. Containerizing game-server binaries with AWS build tools and Amazon EKS-based regional clusters of game servers enables secure and cost-effective distribution of large binary sets, increasing agility in today’s game development.

In this post, I demonstrated how an effective CI/CD system can reduce the network traffic required by a large-scale deployment of multiplayer game servers by more than 90%.

Coding an isometric game map | Wireframe issue 15

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/coding-an-isometric-game-map-wireframe-issue-15/

Isometric graphics give 2D games the illusion of depth. Mark Vanstone explains how to make an isometric game map of your own.

Published by Quicksilva in 1983, Ant Attack was one of the earliest games to use isometric graphics. And you threw grenades at giant ants. It was brilliant.

Isometric projection

Most early arcade games were 2D, but in 1982, a new dimension emerged: isometric projection. The first isometric game to hit arcades was Sega’s pseudo-3D shooter, Zaxxon. The eye-catching format soon caught on, and other isometric titles followed: Q*bert came out the same year, and in 1983 the first isometric game for home computers was published: Ant Attack, written by Sandy White.

Ant Attack

Ant Attack was first released on the ZX Spectrum, and the aim of the game was for the player to find and rescue a hostage in a city infested with giant ants. The isometric map has since been used by countless titles, including Ultimate Play The Game’s classics Knight Lore and Alien 8, and my own educational history series ArcVenture.

Let’s look at how an isometric display is created, and code a simple example of how this can be done in Pygame Zero, starting with the basics. The isometric view displays objects as if you’re looking down on them at 45 degrees, so the top of a cube looks like a diamond shape. The scene is made by drawing cubes on a diagonal grid so that the cubes overlap and create solid-looking structures. Additional layers can be drawn above them to create the illusion of height.

Blocks are drawn from the back forward, one line at a time and then one layer on top of another until the whole map is drawn.

The cubes are actually two-dimensional bitmaps, which we start printing at the top of the display and move along a diagonal line, drawing cubes as we go. The map is defined by a three-dimensional list (or array). The list is the width of the map by the height of the map, and has as many layers as we want to represent in the upward direction. In our example, we’ll represent the floor as the value 0 and a block as value 1. We’ll make a border around the map and create some arches and pyramids, but you could use any method you like — such as a map editor — to create the map data.

To make things a bit easier on the processor, we only need to draw cubes that are visible in the window, so we can do a check of the coordinates before we draw each cube. Once we’ve looped over the x, y, and z axes of the data list, we should have a 3D map displayed. The whole map doesn’t fit in the window, and in a full game, the map is likely to be many times the size of the screen. To see more of the map, we can add some keyboard controls.

Here’s Mark’s isometric map, coded in Python. To get it running on your system, you’ll first need to install Pygame Zero. And to download the full code, visit our GitHub repository here.

If we detect keyboard presses in the update() function, all we need to do to move the map is change the coordinates we start drawing the map from. If we start drawing further to the left, the right-hand side of the map emerges, and if we draw the map higher, the lower part of the map can be seen.
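Mark’s full program is available from the GitHub repository mentioned above. As a minimal sketch of the ideas described here (a three-dimensional map list, back-to-front drawing, an on-screen check before each blit, and scrolling by moving the draw origin), assuming a cube image named 'block' and placeholder map data, the core of a Pygame Zero version might look like this:

import pgzrun

WIDTH, HEIGHT = 800, 600
TILE_W, TILE_H = 64, 32      # on-screen footprint of one cube
RISE = 24                    # vertical offset per layer of height

# mapData[z][y][x]: 0 = floor, 1 = block
mapData = [
    [[1] * 10 for _ in range(10)],                                # layer 0: solid floor
    [[1 if x in (0, 9) or y in (0, 9) else 0 for x in range(10)]
     for y in range(10)],                                         # layer 1: border wall
]

originX, originY = WIDTH // 2, 80   # screen position of the map's back corner

def draw():
    screen.clear()
    for z, layer in enumerate(mapData):        # bottom layer first
        for y, row in enumerate(layer):        # back rows first
            for x, cell in enumerate(row):
                if cell:
                    sx = originX + (x - y) * TILE_W // 2
                    sy = originY + (x + y) * TILE_H // 2 - z * RISE
                    # only draw cubes that fall inside the window
                    if -TILE_W < sx < WIDTH and -TILE_H * 2 < sy < HEIGHT:
                        screen.blit('block', (sx, sy))

def update():
    # moving the draw origin scrolls the map; drawing further left
    # brings the right-hand side of the map into view
    global originX, originY
    if keyboard.left:
        originX += 4
    if keyboard.right:
        originX -= 4
    if keyboard.up:
        originY += 4
    if keyboard.down:
        originY -= 4

pgzrun.go()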

We now have a basic map made of cubes that we can move around the window. If we want to make this into a game, we can expand the way the data represents the display. We could add differently shaped blocks represented by different numbers in the data, and we could include a player block which gets drawn in the draw() function and can be moved around the map. We could also have some enemies moving around — and before we know it, we’ll have a game a bit like Ant Attack.

Tiled

When writing games with large isometric maps, an editor will come in handy. You can write your own, but there are several out there that you can use. One very good one is called Tiled and can be downloaded free from mapeditor.org. Tiled allows you to define your own tilesets and export the data in various formats, including JSON, which can be easily read into Python.
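As a quick illustration of that last point, here is how a Tiled JSON export can be read back into Python; the filename and the assumption that the first layer is a tile layer are placeholders for whatever your map contains.

import json

with open('map.json') as f:      # exported from Tiled as JSON
    tiled = json.load(f)

width, height = tiled['width'], tiled['height']
layer = tiled['layers'][0]       # assume the first layer holds the tiles

# Tiled stores tile IDs as one flat list, width * height long;
# slice it back into rows (0 means an empty cell)
grid = [layer['data'][row * width:(row + 1) * width] for row in range(height)]

for row in grid:
    print(row)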

Get your copy of Wireframe issue 15

You can read more features like this one in Wireframe issue 15, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 15 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Coding an isometric game map | Wireframe issue 15 appeared first on Raspberry Pi.

How musical game worlds are made | Wireframe #8

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/how-musical-game-worlds-are-made-wireframe-8/

88 Heroes composer Mike Clark explains how music and sound intertwine to create atmospheric game worlds in this excerpt from Wireframe issue 8, available now.

Music for video games is often underappreciated. When I first started writing music in my bedroom, it took me a while to realise how much I was influenced by the worlds that came from my tiny CRT TV. A couple of years ago, I was lucky enough to be approached by Bitmap Bureau, an indie startup who hired me to compose the music for their first game, 88 Heroes.

88 Heroes is a platformer styled like a Saturday morning cartoon. Interestingly, cartoon soundtracks have a lot in common with those for stage productions: short musical cues accompany the actions on screen, so if someone violently falls downstairs, you hear a piano rolling down the keys. This is called ‘mickey mousing’ in cartoons, but we hear similar things in film soundtracks.

Take Raiders of the Lost Ark, scored by John Williams: for every heroic rope swing, leap of faith, or close encounter with danger, the main theme can be heard powering through the dissonances and changing rhythms. It fills the audience with hope and becomes synonymous with the lead character – we want to see him succeed. Let’s not forget the title theme. Every time you see the Star Wars logo, does that grand title theme play in your head? It’s the same with video games. The challenge here, of course, is that players often leave the title screen after three seconds.

Three seconds is all you need though. Take Super Mario World’s soundtrack, composed by Koji Kondo. Many of its levels have the same leading melody, which changes subtly in tonality and rhythm to create the appropriate mood. The most repeated part of the melody is four bars long, but we hear it in so many forms that we only need the first two bars to know where it’s from. In classical music, this is called ‘variations on a theme’. In video games, we call it a ‘sonic identity’.

Action platformer 88 Heroes, featuring music by Mike Clark.

How a picture should ‘sound’

Sonic identity informed my approach to the 88 Heroes soundtrack. The title screen tells us that an unknown group is going to save the day. I first thought about unlikely heroes who end up on an adventure, and Back to the Future, scored by Alan Silvestri, sprang to mind. The second inspiration came from traditional superheroes, like Superman. I composed a melody which travels between the first and fifth notes in the scale (in this case C and G), with little flourishes of the notes in-between. It’s a triumphant, heroic melody.

This concept helps to connect these worlds beyond their visuals. It took a long time for games to evolve into the cohesive open-world sandboxes or MMOs we see today; the technology that masked loading screens to create a seamless experience was unheard of in the 1990s, so a melody that you hear in different ‘costumes’ gives these games a sense of cohesion.

Intelligent instruments

What if you have levels (or worlds) so big that some areas need to be loaded? That’s where non-linear composition comes in. Banjo-Kazooie, released for the N64 in 1998, was among the first 3D games to feature dynamic music. It used a technique called MIDI channel fading. MIDI stands for Musical Instrument Digital Interface; think of it as a universal language for music that is played back in real time by the hardware. As you walk into caves, fly in the sky, or move near certain characters, instruments fade in and out using the different MIDI channels to mimic the atmosphere, give the player an audio cue, and build and release tension.

Learning how to write music that changes as you play might seem impossible at first, but it becomes second nature once you understand the relationship between every instrument in your composition. Many digital audio workstations, like Logic and FL Studio, let you import MIDI data for a song (so you have all the notes in front of you) and set the instruments yourself. Try slowly fading out or muting certain tracks altogether, and listen to how the mood changes. What could this change represent in a video game? It’s like when you’re riding Yoshi in many of the Mario games; the fast bongos come in to represent the quick-footed dinosaur as he dashes at high speeds.
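As a rough modern-day illustration of channel fading (not how Banjo-Kazooie itself was implemented), the sketch below uses the Python mido library to ramp a channel’s volume controller; the channel numbers, timings, and instrument choices are arbitrary.

import time
import mido

out = mido.open_output()   # default system MIDI output port

def fade_channel(channel, start, end, seconds=2.0, steps=20):
    """Ramp a channel's volume controller (CC 7) from start to end."""
    for i in range(steps + 1):
        value = int(start + (end - start) * i / steps)
        out.send(mido.Message('control_change', channel=channel,
                              control=7, value=value))
        time.sleep(seconds / steps)

# e.g. walking into a cave: strings out on channel 2, echoing pad in on channel 3
fade_channel(2, 100, 0)
fade_channel(3, 0, 100)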

Undertale’s soundtrack blends analogue synth instruments with a plethora of real instruments to help create emotion.

Music is used to evoke emotions that wouldn’t be possible with visuals alone. Beep: A Documentary History of Game Sound shows a six-second video of a boat accompanied by two soundtracks; one is a light and happy guitar piece, the other a grating, scary, orchestral dissonance. Through these two extremes, the music creates the mood by itself. I remember playing Metroid Prime and finding the Chozo Ghost enemies rather scary, not because of their appearance, but because of the unnerving music that accompanies them. Music and sound design are one and the same. Think about what feelings you can create by taking music away entirely — it’s a great way to create tension before a boss battle or pivotal plot point, and it really works. In Undertale, scored by Toby Fox, there are times when the music stops so abruptly during NPC dialogue that you feel shivers down your spine.

So, what if you’re trying to come up with some game music, and you have writer’s block? Well, the next time you play a new game, turn the sound off. As you’re playing, focus on how the story, art, or characters make you feel, and focus on the emotions the game is trying to convey. Then, think of a time when a song made you feel happy, sad, joyful, anxious, or even frightened. Maybe you can use the music to create the mood you want for that game, as opposed to what the game makes you feel. By finding these emotions and understanding how they can change, you’ll be able to write a score that helps strengthen the immersion, escapism, and player investment in your game.

You can read the rest of the feature in Wireframe issue 8, available now in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from us – worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

Markets, moggies, and making in Wireframe issue 8

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusives, and for subscriptions, visit the Wireframe website to save 49% compared to newsstand pricing!

The post How musical game worlds are made | Wireframe #8 appeared first on Raspberry Pi.

Inside the Dreamcast homebrew scene | Wireframe issue 7

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/wireframe-7-inside-dreamcast-homebrew-scene/

Despite its apparent death 17 years ago, the Sega Dreamcast still has a hardcore group of developers behind it. We uncover their stories in this excerpt from Wireframe issue 7, available now.

In 1998, the release of the Dreamcast gave Sega an opportunity to turn around its fortunes in the home console market. The firm’s earlier system, the Saturn, though host to some beloved titles, was running a distant third in sales behind the Nintendo 64 and PlayStation. The Dreamcast, by contrast, saw a successful launch and quickly became the go-to system for arcade-quality ports of fighting games, among other groundbreaking titles like Seaman and Crazy Taxi.

Unfortunately for fans, it wasn’t to last. The Dreamcast struggled to compete against the PlayStation 2, which launched in 2000, and at the end of March 2001, in the face of the imminent launch of the Nintendo GameCube and Microsoft’s new Xbox, Dreamcast left the stage, and Sega abandoned the console market altogether.

None of this stopped a vibrant homebrew development scene springing up around the console in Sega’s place, and even years later, the Dreamcast remains a thriving venue for indie developers. Roel van Mastbergen codes for Senile Team, the developers of Intrepid Izzy, a puzzle platformer coming soon to the PC, PS4, and Dreamcast.

Of the port to Sega’s ageing console, van Mastbergen tells us, “I started this project with only the PC in mind. I’m more used to developing for older hardware, though, so I tend to write code with low CPU and RAM requirements by force of habit. At some point I decided to see if I could get it running on the Dreamcast, and I was happy to find that it ran almost perfectly on the first try.”

It runs at a lower resolution than on PC, but Intrepid Izzy still maintains a smooth 60fps on Dreamcast.

One of the pluses of the Dreamcast, van Mastbergen points out, is how easy it is to develop for. “There are free tools and sufficient documentation available, and you can run your own code on a standard Dreamcast without any hardware modifications or hacks.”

Games burned to CD will play in most models of unmodified Dreamcast, usually with no extra software required. While this doesn’t result in a huge market — the customer base for new Dreamcast games is difficult to measure but certainly small — it makes development for original hardware far more viable than on other systems, which often need expensive and difficult-to-install modchips.

Many of the games now being developed for the system are available as digital downloads, but the state of Dreamcast emulation lags behind that of its competitors, with no equivalent to the popular Dolphin and PCSX2 emulators for GameCube and PS2. All this makes boxed games on discs more viable than on other systems — and, in many cases, physical games can also become prized collectors’ items.

Intrepid Izzy is developed with a custom code library that works across multiple systems; it’s simple to downscale PC assets and export a Dreamcast binary.

Kickstarting dreams

By now, you might be asking yourself what the point of developing for these old systems is — especially when creating games for PC is a much easier and potentially more profitable route to take. When it comes to crowdfunding, though, catering to a niche but dedicated audience can pay dividends.

Belgian developer Alice Team, creators of Alice Dreams Tournament, asked for €8000 in funding to complete its Dreamcast exclusive, which began development in 2006. It eventually raised €28,000 — more than treble its goal.

Intrepid Izzy didn’t quite reach such dizzying heights, only just meeting its €35,000 target, but van Mastbergen is clear it wouldn’t have been funded at all without the dedicated Dreamcast base. “The project has been under-funded since the beginning, which is slightly problematic,” van Mastbergen tells us. “Even so, it is true that the Dreamcast community is responsible for the lion’s share of the funding, which is a testament to how well-loved this system still is.”

You can read the rest of the feature in Wireframe issue 7, available in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from us – worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

Face your fears in the indie horror, Someday You’ll Return.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusives, and for subscriptions, visit the Wireframe website to save 49% compared to newsstand pricing!

The post Inside the Dreamcast homebrew scene | Wireframe issue 7 appeared first on Raspberry Pi.

Building a text adventure | Wireframe #6

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/making-text-adventure-wireframe-6/

Game developer Andrew Gillett explains how to make a simple text adventure in Python — and what pitfalls to avoid while doing so — in the latest issue of Wireframe magazine, out now.

Writing games in BASIC

The first game I ever wrote was named Pooh. It had nothing to do with the bear. In September 1982, I was four years old, and the ZX Spectrum home computer had just been released. It was incredible enough that the Spectrum let you play games on the TV, but like most home computers of the time, it also came with a built-in language called BASIC, and a manual which explained how to program it. In my first game, Pooh (the title was a misspelling), the player controlled a baby, represented by a pound sign, and had to guide it to a potty, represented by the letter O. There were no obstacles, no enemies, and if you tried to walk off the screen, the program would stop with an error message. I didn’t have any idea how to create a graphical game more complex than Pooh. I didn’t even know how to display a sprite on the screen.

The Hobbit, released in 1982, was widely praised for its intuitive parser.

Text adventures

Instead, I focused on writing text adventures, where the game describes scenes to the player (“You are in a comfortable, tunnel-like hall. You can see a door,” from 1982’s The Hobbit) and the player enters commands such as “Go through door” or “Kill goblin with sword.” Although this type of game is comparatively easy to write, I implemented it in the worst way possible. The code was essentially a huge list of IF statements. Each room had its own set of code, which would print out a description of the room and then check to see what the player typed. This ‘hard-coding’ led to the code being much longer than necessary, and more difficult to maintain.

The correct way would have been to separate my code and data. Each room would have had several pieces of data associated with it, such as an ID number, the description of the room (“You are in a small cave”), an array of objects which can be found in the room, and an array of room numbers indicating where the player should end up if they try to move in a particular direction – for example, the first number could indicate which room to go to if the player enters ‘NORTH’. You’d then have the main game code which keeps track of the room the player is currently in, and looks up the data for that room. With that data, it can then take the appropriate action based on the command the player typed.

Getting it right

The code below shows how to implement the beginnings of a text adventure game in Python. Instead of numeric IDs and arrays, the code uses string IDs and dictionary data structures, where each piece of data is associated with an ID or ‘key’. This is a more convenient option which wasn’t available in Spectrum BASIC. We first create a list of directions in which the player can potentially move. We then create the class Location which specifies each location’s properties. We store a name, a description, and a dictionary data structure which stores the other locations that the current location is linked to. For example, if you go north from the woods, you’ll reach the lake. The class includes a method named addLink, which adds entries to the linked locations dictionary after checking that the specified direction and destination exist.

Following the class definition, we then create a dictionary named locations. This has two entries, with the keys being woods and lake, and the values being instances of the Location class. Next, we call the addLink method on each of the locations we’ve just created, so that the player will be able to walk between them. The final step of the setup phase is to create the variable currentLocation, specifying where the player will start the game.

We then reach the main game loop, which will repeat indefinitely. We first display the description of the current location, along with the available directions in which the player can move. Then we wait for the player to input a command. In this version of the code, the only valid commands are directions: for example, type ‘north’ at the starting location to go to the lake. When a direction is entered, we check to make sure it’s a valid direction from the current location, then update currentLocation to the new location. When the main loop restarts, the description of the new location is displayed.
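The original listing appeared in the magazine and isn’t reproduced in this excerpt, so here is a minimal sketch that follows the description above; the names Location, addLink, locations, and currentLocation come from the text, while the room details are placeholders.

DIRECTIONS = ["north", "south", "east", "west"]

class Location:
    def __init__(self, name, description):
        self.name = name
        self.description = description
        self.linked = {}                 # direction -> key of destination

    def addLink(self, direction, destination):
        # only add the link if the direction and destination both exist
        if direction in DIRECTIONS and destination in locations:
            self.linked[direction] = destination

locations = {
    "woods": Location("Woods", "You are in a dark, overgrown wood."),
    "lake": Location("Lake", "You stand at the edge of a still lake."),
}

locations["woods"].addLink("north", "lake")
locations["lake"].addLink("south", "woods")

currentLocation = "woods"            # where the player starts the game

while True:
    here = locations[currentLocation]
    print(here.description)
    print("You can go:", ", ".join(here.linked))
    command = input("> ").strip().lower()
    if command in here.linked:
        currentLocation = here.linked[command]
    else:
        print("You can't go that way.")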

I moved on from the ZX Spectrum eight years after my dad first unpacked it. Despite the poor design of my code, I’d learned the essentials of programming. Ten years later, I was a game developer.

Further reading

If you’re keen to learn more about making a text adventure in Python, you could check out Phillip Johnson’s guide to the subject, Make Your Own Python Text Adventure. The author has also written a condensed version of the same guide.

You may also be interested in our free online course Object-oriented Programming in Python: Create Your Own Adventure Game.

More from Wireframe

You can discover more tutorials, alongside great reviews, articles and advice, in Wireframe issue 6, out now and available in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from the Raspberry Pi Press store — worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download the PDF for free.

The post Building a text adventure | Wireframe #6 appeared first on Raspberry Pi.

From Wireframe issue 5: Breakthrough Brits in conversation

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/wireframe-issue-5/

BAFTA-recognised developers Adrienne Law and Harry Nesbitt share their thoughts on making games, work-life balance, and more in this excerpt from Wireframe issue 5, available from today.

It’s certainly ‘woollies and scarf’ weather now, but the low-hanging sun provides a beautiful backdrop as Adrienne and Harry make their daily short walk from home to the ustwo games office. In late 2018, Adrienne Law and Harry Nesbitt were both recognised by BAFTA as Breakthrough Brits: an award given by BAFTA to new and emerging talent across a variety of art and entertainment industries. But that’s not the only thing they have in common — Adrienne and Harry work in the same office and are even housemates.

Monument Valley 2 screenshot

Monument Valley 2

Adrienne is a producer at ustwo games, most recently on the acclaimed puzzler Monument Valley 2. Harry doesn’t work for ustwo, but he’s a regular fixture there, taking a spare desk to work as the lead developer and artist for Alto’s Adventure and its sequel, Alto’s Odyssey.

Alto’s Odyssey screenshot

Alto’s Odyssey

As two professionals early in their careers in an ever-evolving industry, Adrienne and Harry find themselves with much in common, but the routes that led them to working and living together were very different. The pair agreed to take an hour out of their work schedules to speak to Wireframe, and to each other, about their personal experiences of game development, how it feels to release a game, work-life balance, and the potential of games to affect and enrich lives.

Adrienne Law: My route into the games industry was semi-accidental. I played games a lot when I was a kid but didn’t know there was an industry as such to go and work in. I did an English degree thinking that might possibly set me up for going into some kind of creative, story-driven field, which was what interested me. After that, I spent a few years working different jobs — I was a teaching assistant, I worked in finance, retail, marketing, and was circling around trying to get into film and TV industries.

Eventually, I got to the point where I went onto job sites and searched for “production assistant” and that’s where I found a production assistant role going at ustwo games. I thought, “Oh! Production is a thing in games! I didn’t know that.” I decided to just go for it. I ended up having a few interviews with ustwo — I think they were worried because I was quite quiet, and they weren’t sure how much I would step into the role — but they let me through the door and gave me a chance. I’ve been here ever since. I never set out to be in the games industry, but I think I’d been gaining a lot of skills and had an awareness of the medium, so those things combined into making me a good candidate for the role.

I went to an all girls’ school that specialised in maths and science, so there was no reason that I would have thought I couldn’t work in tech, but the school didn’t push the idea of working in tech and coding. I think if I had been aware of it from a younger age, I would have been a programmer.

Harry Nesbitt

Harry Nesbitt: I’ve always thought about working in games. From a young age, I had an interest in how games were made from an artistic standpoint. I would always look up who was responsible for the concept art. Concept art as a job was something I was aware of from a very young age.

Around 2006, when I started at university, indie games weren’t in the mainstream, and making games in your own bedroom wasn’t as popular an idea. When I discovered Unity, I thought “Oh, I can download this for free, and I can learn all the basics online.” I saw examples of illustrators who were downloading it and making cool, interesting little projects — almost like little art pieces — bringing their illustrations to life. It made me realise I could have a play with that. My knowledge of the basics of JavaScript and web development helped me pick up the coding side of things a little bit more easily.

When it came to making Alto’s Adventure, I knew a little bit of Unity and had been playing with it for about 12 months, so I realised I could at least be playing around with it, seeing what’s possible and using it as a way to demonstrate certain ideas.

Within a very short space of time, less than a week maybe, I’d been able to put together a basic prototype of the core systems, such as the terrain generation, basic player physics, even some effects such as particles and Alto’s scarf. It took another year and a half from there to get it finished, but online resources gave me what I needed to eventually get the game made. It’s not necessarily an experience I’d want to repeat though!

You can read the rest of this fantastic feature in Wireframe issue 5, out today, 17 January, in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from us — worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

The cutest Wireframe cover to date!

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusives, and for subscriptions, visit the Wireframe website to save 49% compared to newsstand pricing!

The post From Wireframe issue 5: Breakthrough Brits in conversation appeared first on Raspberry Pi.

From Wireframe issue 4: Recovering Destiny’s long-lost soundtrack

Post Syndicated from Ian Dransfield original https://www.raspberrypi.org/blog/wireframe-issue-4-destinys-long-lost-soundtrack/

Missing for five years, Destiny’s soundtrack album, Music of the Spheres, resurfaced in 2017. Composer Marty O’Donnell reflects on what happened, in this excerpt from Wireframe issue 4.

When Bungie unveiled its space-opera shooter Destiny in February 2013, it marked the end of two years of near silence from the creators of the Halo franchise. Fans celebrated at the prospect of an entirely new game from such well known talent. Behind closed doors, however, Destiny was in trouble.

Though the game was almost complete by mid-2013, plans to launch that September were put on hold when concerns over Destiny’s story forced its narrative structure to be rebuilt from scratch. It would be more than 18 months before Destiny was released: a fun but strange shooter that bore difficult-to-pin-down traces of its troubled gestation. But one element of Destiny – that had been a huge part of its development – was nowhere to be seen. It was an ambitious original soundtrack written and recorded with an impressive but unexpected collaborator: Paul McCartney.

Spherical music

Audio director and composer Marty O’Donnell had been with Bungie since the late 1990s, and for him, Destiny represented an opportunity to develop something new: a musical prequel to the video game. This would become Music of the Spheres – an eight-part musical suite that took nearly two years to complete. This was no mere soundtrack, however. Born out of discussions between O’Donnell and Bungie COO Pete Parsons early in the game’s production, it was to play an integral role in Destiny’s marketing campaign.

“I wasn’t writing this just to be marketing fodder,” O’Donnell laughs. “I was writing it as a standalone listening experience that would then eventually become marketing fodder – but I didn’t want the other to happen first.”

Between 2011 and 2012, Bungie and O’Donnell devised plans for the album.

“Every few weeks or so, I would be called to a meeting in one of their big conference rooms and there would be a whole bunch of new faces there, pitching some cool idea or other,” says O’Donnell. “[At one point] it was going to be a visualisation with your mobile device.”

Difference of opinion

But there were fundamental differences between what Bungie had planned and what Activision – Destiny’s publisher, and keeper of the purse strings – wanted.

“I think Activision was confused [about] why you would ever use music as marketing… And the other thing is, I honestly don’t think they understood why we were working with Paul McCartney. I think they didn’t think that that was the right person for the demographic.”

News of a collaboration with McCartney had raised eyebrows when he revealed his involvement on Twitter in July 2012. His interest had been piqued during his attendance at E3 2009 following the announcement of The Beatles: Rock Band, which was preceded by Bungie’s unveiling of Halo 3: ODST.

Loop symphony

“I had a contact in Los Angeles who worked out deals with actors we used on Halo,” O’Donnell recalls. “He was able to make contact with Paul’s people and set up a meeting between the two of us in spring of 2011. My impression was that Paul saw a new crop of fans come from Beatles Rock Band and was interested in seeing what was involved with creating music for video games. He seemed convinced that Bungie was working on a project that he could get behind.”

Within a few weeks, O’Donnell and McCartney were exchanging ideas for Destiny.

“The first thing he sent me was what he called his ‘loop symphony’,” says O’Donnell. “He used the same looping tape recorder that he used on Sgt. Pepper’s and Revolver… He hauled this tape recorder out of his attic.”

Working with regular collaborator Michael Salvatori, O’Donnell and McCartney set about developing Music of the Spheres into a fully fledged album, comprising eight movements.

Priorities

“I have all of these wonderful things, which included interesting things he did on his guitar that sort of loop and sound otherworldly… I think there are a couple of times in The Path, which is the first piece, and then I think The Prison, which is the seventh piece, where we use a recording of Paul doing this loop with his voice. This little funny thing. That’s Paul’s voice, which is cool.”

The album was completed in December 2012 following recording sessions at Capitol Studios in California, Avatar Studios in New York, and Abbey Road in London. Musical elements from Music of the Spheres accompanied Bungie’s big reveal of Destiny at a PlayStation 4 event in New York in February 2013. But after that, things started to go south.

“After that PlayStation 4 announcement, I said, ‘Let’s figure out how to release this. I don’t care if we have Harmonix do an iPad version with a visualiser for it. I mean, if we can’t pull the trigger on something big and interesting like that, that’s fine with me. Let’s just release it online.’ It had nothing to do with making money… It was always fan service, in my mind at least.”

Activision, on the other hand, had other priorities. “Activision had a lot of say on the marketing. I think that’s where things started to go wrong, for me… things started being handled badly, or postponed, and then all of a sudden I was seeing bits of Music of the Spheres being cut up and presented in ways that I wasn’t happy with.”

You can read the rest of this fantastic feature in Wireframe issue four, out 20 December in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from us — worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

The post From Wireframe issue 4: Recovering Destiny’s long-lost soundtrack appeared first on Raspberry Pi.

Wireframe 3: Phoenix Point, modders going pro, and more

Post Syndicated from Ian Dransfield original https://www.raspberrypi.org/blog/wireframe-issue-3/

We said we’d be back with more, so here we are back with more: issue 3 of Wireframe, the magazine that lifts the lid on video games.

From the ashes

Our third issue sees the now-established mix of great features, guides, reviews, and plenty more beyond that. Headlining it all is our sit-down chat with Julian Gollop about his upcoming strategy title Phoenix Point, with the X-Com creator waxing lyrical about Rebelstar, Chaos, and the secret of great AI.

We also take a look at the careers of amateurs-turned-pros, checking out the modders who went legit and getting input from those who’ve made the jump from doing it for fun, to doing it for fun and money.

And it doesn’t stop there

We’re investigating Thrunt XL, the indie game made without typing a single line of code; Terry Cavanagh tells us about his unconventional new rogue-like Dicey Dungeons; and veteran game developer Howard Scott Warshaw looks back on the making of his Atari 2600 classic, Yars’ Revenge.

Plus:

  • Make your own first-person shooter in Unity with our step-by-step guide
  • The fur flies in the forthcoming multiplayer shooter, Super Animal Royale
  • How parallax scrolling gives 2D games the illusion of depth
  • The platformer from El Salvador that survived an attack of the clones

All this, and a variety of news, previews, and reviews covering everything from triple-A releases to dinky, loveable indie games.

Buy Wireframe issue 3

Print copies of Wireframe are available now in WHSmith, Tesco, and all good independent UK newsagents. Or you can buy Wireframe directly from us — worldwide delivery is available. And if you’d like a handy digital version of the magazine, you can also download a free PDF.

Subscription options!

Whether you want to sample six print issues for a bargain price, subscribe for a full year, or get a regular digital edition sent directly to your device, we have some superb deals for you to choose from! To find out how you can save up to 49% on Wireframe, head to wfmag.cc/subscribe.

Or you can get the digital edition directly to your smart device via our Android and iOS apps.

See you in a fortnight!

The post Wireframe 3: Phoenix Point, modders going pro, and more appeared first on Raspberry Pi.

Wireframe 2: The Blackout Club, Battlefield V anxiety, and more

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/wireframe-2/

Momentum firmly established, we’re back with our brilliant second issue of Wireframe — the magazine that lifts the lid on video games.

And yes, we are continuing to write ‘video games’ as two words.

Blacking out

In our sophomore edition, you’ll discover all manner of great features, guides, reviews, and everything else you could wish for. In an exclusive interview, BioShock 2 director Jordan Thomas talks about The Blackout Club, his new co-operative horror game – which also features on our fantastic front cover! With inspiration coming from the likes of Stranger Things, you just know The Blackout Club is going to be something special.

We also hear from Battlefield V’s Creative Director Lars Gustavsson in a candid discussion about his own personal excitement — and apprehension — surrounding the launch of DICE’s latest in its nearly 20-year-old series.

And a lot more

Is that all? Of course not. Thomas Was Alone and Subsurface Circular creator Mike Bithell shares his personal perspective on the ever-changing shape of video games.

Issue 2 also takes an extended look at an RPG’s journey from tabletop to screen: it’s not easy to bring the likes of Cyberpunk 2020 to the world of video games, and CD Projekt Red, Chris Avellone, and others tell us just why that is.

We’re just spoiling you now, but there’s plenty more besides, such as:

  • The maths behind matchmaking and video game economics
  • The changing face of Mega Man, an enduring 8-bit icon
  • An indie game’s path from Japanese restaurant to Nintendo eShop
  • The simple yet effective AI behind Galaxian’s angry aliens

All of this is joined by news, previews, and reviews of everything gaming has to offer.

Buy Wireframe issue 2

Physical copies of Wireframe are available now in WHSmith, Tesco, and all good independent UK newsagents. Of course, we don’t like to limit your choices, so you’re able to buy direct from us, with worldwide delivery available.

There’s also the option to download issue 2 as a free PDF if you’d like a handy digital version.

Subscription options!

Fancy putting your feet up and letting Wireframe come directly to you? In that case, you should take a look at our subscription options: pick up a sample six issues for a bargain price, subscribe for a full year, or get the digital edition directly to your smart device via our Android and iOS apps. To find out how to save up to 49% on Wireframe’s print edition, head to wfmag.cc/subscribe.

wireframe magazine

See you again in two weeks!

A wild HackSpace magazine appeared

HackSpace magazine issue 13 is also out today, and it’s pretty sweet. Check it out here!

HackSpace issue 13 front cover

The post Wireframe 2: The Blackout Club, Battlefield V anxiety, and more appeared first on Raspberry Pi.

Wireframe issue 1 is out now!

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/wireframe-issue-1/

Wireframe is our new twice-monthly magazine that lifts the lid on video games. In Wireframe, we look at how games are made, who makes them, and how you can make games of your own. And today, we’re releasing our very first issue!

Wireframe: the new magazine that lifts the lid on video games


The inaugural issue

In issue 1, Far Cry 4 director Alex Hutchinson talks to us about going indie. We look back at the British games industry’s turbulent early years; we explore how curves and probabilities shape the games we play; and we get hands-on with Nomada Studio’s forthcoming ethereal platformer, Gris.

Wireframe magazine

Plus:

  • Jessica Price on the state of game criticism
  • Portal squeezed onto the Commodore 64
  • Treasure — the iconic game studio at 25
  • Gone Home’s Kate Craig on indie game design workarounds
  • And much, much more…

About Wireframe magazine

Cutting through the hype, Wireframe takes a more indie-focused, left-field angle than traditional games magazines. As well as news, reviews, and previews, we bring you in-depth features that uncover the stories behind your favourite games.

Wireframe magazine

And on top of all that, we also help you create your own games! Our dedicated Toolbox section is packed with detailed tutorials and tips to guide you in your own game development projects.

wireframe issue 1 cover

Raspberry Pi is all about making computing accessible to everyone, and in Wireframe, we show you how programming, art, music, and design come together to make the video games you love to play — and how you can use these elements to build games yourself.

Free digital edition

We want everyone to enjoy Wireframe and learn more about creating video games, so from today, you’ll also be able to download a digital copy of issue 1 of Wireframe for free. Get all the features, guides, and lively opinion pieces of our paper-and-ink edition as a handy PDF from our website.

Wireframe in the wild

You can find the print edition of Wireframe issue 1 in select UK newsagents and supermarkets from today, priced at just £3. Subscribers also save money on the cover price, with an introductory offer of twelve issues for just £12.

For more information, and to find out how to order Wireframe from outside the UK, visit wfmag.cc.

The post Wireframe issue 1 is out now! appeared first on Raspberry Pi.

Raspberry Pi aboard Pino, the smart sailboat

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/pino-smart-sailing-boat/

As they sail aboard their floating game design studio Pino, Rekka Bellum and Devine Lu Linvega are starting to explore the use of Raspberry Pis. As part of an experimental development tool and a weather station, Pis are now aiding them on their nautical adventures!

Mar 2018: A Smart Sailboat

Pino is on its way to becoming a smart sailboat! Raspberry Pi is the ideal device for sailors, we hope to make many more projects with it. Also the projects continue still, but we have windows now yay!

Barometer

Using a haul of Pimoroni tech, including the Enviro pHAT, Scroll pHAT HD, and Mini Black HAT Hack3r, Rekka and Devine have been experimenting with using a Raspberry Pi Zero as an onboard barometer for their sailboat. On their Hundred Rabbits YouTube channel and website, the pair has documented their experimental setups. They have also built another Raspberry Pi rig for distraction-free work and development.

Hundred Rabbits Pino onboard Raspberry Pi workstation and barometer

The official Raspberry Pi 7″ touch display, a Raspberry Pi 3B+, a Pimoroni Blinkt!, and a Poker II Keyboard make up Pino‘s experimental development station.

“The Pi computer is currently used only as an experimental development tool aboard Pino, but could readily be turned into a complete development platform, would our principal computers fail,” they explain, before going into the build process for the Raspberry Pi–powered barometer.

Hundred Rabbits Pino onboard Raspberry Pi workstation and barometer

The use of solderless headers makes this weather station an ideal build wherever space and tools are limited.

The barometer uses the sensor power of the Pimoroni Enviro pHAT to measure atmospheric pressure, and a Raspberry Pi Zero displays this data on the Scroll pHAT HD. It thus advises the two travellers of oncoming storms. By taking advantage of the solderless header provided by the Sheffield-based pirates, the Hundred Rabbits team was able to put the device together with relative ease. They provide all information for the build here.

Hundred Rabbits Pino onboard Raspberry Pi workstation and barometer

All aboard Pino

If you’d like to follow the journey of Rekka Bellum and Devine Lu Linvega as they continue to travel the oceans aboard Pino, you can follow them on YouTube or Twitter, and via their website.

We are Hundred Rabbits

This is us, this is what we do, and these are our intentions! We live and work from our sailboat Pino. Traveling helps us stay creative, and we feed what we see back into our work. We make games, art, books and music under the studio name ‘Hundred Rabbits.’

 

The post Raspberry Pi aboard Pino, the smart sailboat appeared first on Raspberry Pi.

Build a Flick-controlled marble maze

Post Syndicated from Alex Bate original https://www.raspberrypi.org/blog/flick-marble-maze/

Wiggle your fingers to guide a ball through a 3D-printed marble maze using the Pi Supply Flick board for Raspberry Pi!

Wiggle, wiggle, wiggle, wiggle, yeah

Using the Flick, previously seen in last week’s gesture-controlled holographic visualiser from Hacker House, South Africa–based Tom Van den Bon has created a touch-free marble maze. He was motivated, if his Twitter is any indication, by his love of game-making and 3D printing.

Tom Van den Bon on Twitter

Day 172 of #3dprint365. #3dprinted Raspberry PI Controlled Maze Thingie Part 3 #3dprint #3dprinter #thingiverse #raspberrypi #pisupply

All non-electronic parts of this build are 3D printed. The marble maze sits atop a motorised structure which moves along two axes thanks to servo motors. Tom controls the movement using gestures which are picked up by the Flick Zero, a Pi Zero–sized 3D-tracking board that can detect movement up to 15cm away.

Find the code for the maze, which takes advantage of the Flick library, on Tom’s GitHub account.

Make your own games

Our free resources are a treasure trove of fun home-brew games that you can build with your friends and family.

If you like physical games such as Tom’s gesture-controlled maze, you should definitely check out our Python quick reaction game! In it, players are pitted against each other to react as quickly as possible when an LED lights up at a random moment.

raspberry pi marble maze

You can also play solo with our Lights out game, where it’s you against four erratic lights eager to remain lit.

For games you can build on your computer with no need for any extra tech, Scratch games such as our button-smashing Olympic weightlifter and Hurdler projects are perfect — you can play them just using a keyboard and browser!

raspberry pi marble maze

And if you’d like to really get stuck into learning about game development, then you’re in luck! CoderDojo’s Make your own game book guides you through all the steps of building a game in JavaScript, from creating the world to designing characters.

Cover of CoderDojo Nano Make your own game

And because I just found this while searching for image content for today’s blog, here is a photo of Eben and Liz’s cat Mooncake with a Raspberry Pi on her head. Enjoy!

A cat with a Raspberry Pi pin on its head — raspberry pi marble maze

Ras-purry Pi?

The post Build a Flick-controlled marble maze appeared first on Raspberry Pi.

Coaxing 2D platforming out of Unity

Post Syndicated from Eevee original https://eev.ee/blog/2017/10/13/coaxing-2d-platforming-out-of-unity/

An anonymous donor asked a question that I can’t even begin to figure out how to answer, but they also said anything else is fine, so here’s anything else.

I’ve been avoiding writing about game physics, since I want to save it for ✨ the book I’m writing ✨, but that book will almost certainly not touch on Unity. Here, then, is a brief run through some of the brick walls I ran into while trying to convince Unity to do 2D platforming.

This is fairly high-level — there are no blocks of code or helpful diagrams. I’m just getting this out of my head because it’s interesting. If you want more gritty details, I guess you’ll have to wait for ✨ the book ✨.

The setup

I hadn’t used Unity before. I hadn’t even used a “real” physics engine before. My games so far have mostly used LÖVE, a Lua-based engine. LÖVE includes box2d bindings, but for various reasons (not all of them good), I opted to avoid them and instead write my own physics completely from scratch. (How, you ask? ✨ Book ✨!)

I was invited to work on a Unity project, Chaos Composer, that someone else had already started. It had basic movement already implemented; I taught myself Unity’s physics system by hacking on it. It’s entirely possible that none of this is actually the best way to do anything, since I was really trying to reproduce my own homegrown stuff in Unity, but it’s the best I’ve managed to come up with.

Two recurring snags were that you can’t ask Unity to do multiple physics updates in a row, and sometimes getting the information I wanted was difficult. Working with my own code spoiled me a little, since I could invoke it at any time and ask it anything I wanted; Unity, on the other hand, is someone else’s black box with a rigid interface on top.

Also, wow, Googling for a lot of this was not quite as helpful as expected. A lot of what’s out there is just the first thing that works, and often that’s pretty hacky and imposes severe limits on the game design (e.g., “this won’t work with slopes”). Basic movement and collision are the first thing you do, which seems to me like the worst time to be locking yourself out of a lot of design options. I tried very (very, very, very) hard to minimize those kinds of constraints.

Problem 1: Movement

When I showed up, movement was already working. Problem solved!

Like any good programmer, I immediately set out to un-solve it. Given a “real” physics engine like the one Unity prominently features, you have two options: ⓐ treat the player as a physics object, or ⓑ don’t. The existing code went with option ⓑ, like I’d done myself with LÖVE, and like I’d seen countless people advise: using a physics sim makes for bad platforming.

But… why? I believed it, but I couldn’t concretely defend it. I had to know for myself. So I started a blank project, drew some physics boxes, and wrote a dozen-line player controller.

Ah! Immediate enlightenment.

If the player was sliding down a wall, and I tried to move them into the wall, they would simply freeze in midair until I let go of the movement key. The trouble is that the physics sim works in terms of forces — moving the player involves giving them a nudge in some direction, like a giant invisible hand pushing them around the level. Surprise! If you press a real object against a real wall with your real hand, you’ll see the same effect — friction will cancel out gravity, and the object will stay in midair.

Platformer movement, as it turns out, doesn’t make any goddamn physical sense. What is air control? What are you pushing against? Nothing, really; we just have it because it’s nice to play with, because not having it is a nightmare.

I looked to see if there were any common solutions to this, and I only really found one: make all your walls frictionless.

Game development is full of hacks like this, and I… don’t like them. I can accept that minor hacks are necessary sometimes, but this one makes an early and widespread change to a fundamental system to “fix” something that was wrong in the first place. It also imposes an “invisible” requirement, something I try to avoid at all costs — if you forget to make a particular wall frictionless, you’ll never know unless you happen to try sliding down it.

And so, I swiftly returned to the existing code. It wasn’t too different from what I’d come up with for LÖVE: it applied gravity by hand, tracked the player’s velocity, computed the intended movement each frame, and moved by that amount. The interesting thing was that it used MovePosition, which schedules a movement for the next physics update and stops the movement if the player hits something solid.

It’s kind of a nice hybrid approach, actually; all the “physics” for conscious actors is done by hand, but the physics engine is still used for collision detection. It’s also used for collision rejection — if the player manages to wedge themselves several pixels into a solid object, for example, the physics engine will try to gently nudge them back out of it with no extra effort required on my part. I still haven’t figured out how to get that to work with my homegrown stuff, which is built to prevent overlap rather than to jiggle things out of it.
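
For concreteness, here’s a minimal sketch of that hybrid shape (not the project’s actual code, just the pattern): a dynamic Rigidbody2D with gravityScale set to 0 and rotation frozen, gravity applied by hand, and MovePosition doing the actual move. Names like walkSpeed are illustrative.

```csharp
using UnityEngine;

// A minimal sketch of the hybrid approach: hand-rolled "physics" for the
// player, with Unity's engine handling collision detection and rejection.
public class PlayerMovement : MonoBehaviour
{
    public float walkSpeed = 6f;  // illustrative tuning value

    Rigidbody2D body;
    Vector2 velocity;

    void Start()
    {
        body = GetComponent<Rigidbody2D>();
    }

    void FixedUpdate()
    {
        // Gravity is applied by hand, since the body itself ignores it
        // (gravityScale is assumed to be 0).
        velocity += Physics2D.gravity * Time.fixedDeltaTime;
        velocity.x = Input.GetAxisRaw("Horizontal") * walkSpeed;

        // MovePosition schedules the move for the next physics pass and
        // stops it early if the player hits something solid.
        body.MovePosition(body.position + velocity * Time.fixedDeltaTime);
    }
}
```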

But wait, what about…

Our player is a dynamic body with rotation lock and no gravity. Why not just use a kinematic body?

I must be missing something, because I do not understand the point of kinematic bodies. I ran into this with Godot, too, which documented them the same way: as intended for use as players and other manually-moved objects. But by default, they don’t even collide with other kinematic bodies or static geometry. What? There’s a checkbox to turn this on, which I enabled, but then I found out that MovePosition doesn’t stop kinematic bodies when they hit something, so I would’ve had to cast along the intended path of movement to figure out when to stop, thus duplicating the same work the physics engine was about to do.

But that’s impossible anyway! Static geometry generally wants to be made of edge colliders, right? They don’t care about concave/convex. Imagine the player is standing on the ground near a wall and tries to move towards the wall. Both the ground and the wall are different edges from the same edge collider.

If you try to cast the player’s hitbox horizontally, parallel to the ground, you’ll only get one collision: the existing collision with the ground. Casting doesn’t distinguish between touching and hitting. And because Unity only reports one collision per collider, and because the ground will always show up first, you will never find out about the impending wall collision.

So you’re forced to either use raycasts for collision detection or decomposed polygons for world geometry, both of which are slightly worse tools for no real gain.

I ended up sticking with a dynamic body.


Oh, one other thing that doesn’t really fit anywhere else: keep track of units! If you’re adding something called “velocity” directly to something called “position”, something has gone very wrong. Acceleration is distance per time squared; velocity is distance per time; position is distance. You must multiply or divide by time to convert between them.

I never even, say, add a constant directly to position every frame; I always phrase it as velocity and multiply by Δt. It keeps the units consistent: time is always in seconds, not in tics.
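
As a tiny illustration (hypothetical code, not from the project), an integration step where the units line up:

```csharp
// Each conversion multiplies by Δt exactly once, so nothing called
// "velocity" is ever added straight to a position.
void Integrate(ref Vector2 position, ref Vector2 velocity,
               Vector2 acceleration, float dt)
{
    velocity += acceleration * dt;  // distance/time² · time → distance/time
    position += velocity * dt;      // distance/time  · time → distance
}
```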

Problem 2: Slopes

Ah, now we start to get off in the weeds.

A sort of pre-problem here was detecting whether we’re on a slope, which means detecting the ground. The codebase originally used a manual physics query of the area around the player’s feet to check for the ground, which seems to be somewhat common, but that can’t tell me the angle of the detected ground. (It’s also kind of error-prone, since “around the player’s feet” has to be specified by hand and may not stay correct through animations or changes in the hitbox.)

I replaced that with what I’d eventually settled on in LÖVE: detect the ground by detecting collisions, and looking at the normal of the collision. A normal is a vector that points straight out from a surface, so if you’re standing on the ground, the normal points straight up; if you’re on a 10° incline, the normal points 10° away from straight up.

Not all collisions are with the ground, of course, so I assumed something is ground if the normal pointed away from gravity. (I like this definition more than “points upwards”, because it avoids assuming anything about the direction of gravity, which leaves some interesting doors open for later on.) That’s easily detected by taking the dot product — if it’s negative, the collision was with the ground, and I now have the normal of the ground.
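
In code, the test is nearly a one-liner. A sketch, with the caveat that a perfectly vertical wall gives a dot product of exactly zero, so a real game might also want a steepness cutoff:

```csharp
// A contact counts as "ground" when its normal points away from gravity,
// which shows up as a negative dot product.
bool IsGround(Vector2 normal)
{
    return Vector2.Dot(normal, Physics2D.gravity) < 0f;
}
```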

Actually doing this in practice was slightly tricky. With my LÖVE engine, I could cram this right into the middle of collision resolution. With Unity, not quite so much. I went through a couple iterations before I really grasped Unity’s execution order, which I guess I will have to briefly recap for this to make sense.

Unity essentially has two update cycles. It performs physics updates at fixed intervals for consistency, and updates everything else just before rendering. Within a single frame, Unity does as many fixed physics updates as it has spare time for (which might be zero, one, or more), then does a regular update, then renders. User code can implement either or both of Update, which runs during a regular update, and FixedUpdate, which runs just before Unity does a physics pass.

So my solution was:

  • At the very end of FixedUpdate, clear the actor’s “on ground” flag and ground normal.

  • During OnCollisionEnter2D and OnCollisionStay2D (which are called from within a physics pass), if there’s a collision that looks like it’s with the ground, set the “on ground” flag and ground normal. (If there are multiple ground collisions, well, good luck figuring out the best way to resolve that! At the moment I’m just taking the first and hoping for the best.)

That means there’s a brief window between the end of FixedUpdate and Unity’s physics pass during which a grounded actor might mistakenly believe it’s not on the ground, which is a bit of a shame, but there are very few good reasons for anything to be happening in that window.
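
Sketched out with illustrative field names, that bookkeeping looks something like this:

```csharp
bool onGround;
Vector2 groundNormal;

void FixedUpdate()
{
    // ... movement happens here ...

    // Cleared last; the physics pass that follows re-sets the flag via
    // the collision callbacks below if we're still touching the ground.
    onGround = false;
}

void OnCollisionEnter2D(Collision2D collision) { CheckGround(collision); }
void OnCollisionStay2D(Collision2D collision)  { CheckGround(collision); }

void CheckGround(Collision2D collision)
{
    foreach (ContactPoint2D contact in collision.contacts)
    {
        if (Vector2.Dot(contact.normal, Physics2D.gravity) < 0f)
        {
            // First ground-like contact wins, as noted above.
            onGround = true;
            groundNormal = contact.normal;
            return;
        }
    }
}
```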

Okay! Now we can do slopes.

Just kidding! First we have to do sliding.

When I first looked at this code, it didn’t apply gravity while the player was on the ground. I think I may have had some problems with detecting the ground as a result, since the player was no longer pushing down against it? Either way, it seemed like a silly special case, so I made gravity always apply.

Lo! I was a fool. The player could no longer move.

Why? Because MovePosition does exactly what it promises. If the player collides with something, they’ll stop moving. Applying gravity means that the player is trying to move diagonally downwards into the ground, and so MovePosition stops them immediately.

Hence, sliding. I don’t want the player to actually try to move into the ground. I want them to move the unblocked part of that movement. For flat ground, that means the horizontal part, which is pretty much the same as discarding gravity. For sloped ground, it’s a bit more complicated!

Okay but actually it’s less complicated than you’d think. It can be done with some cross products fairly easily, but Unity makes it even easier with a couple of calls. There’s a Vector3.ProjectOnPlane function that projects an arbitrary vector onto a plane given by its normal — exactly the thing I want! So I apply that to the attempted movement before passing it along to MovePosition. I do the same thing with the current velocity, to prevent the player from accelerating infinitely downwards while standing on flat ground.

One other thing: I don’t actually use the detected ground normal for this. The player might be touching two ground surfaces at the same time, and I’d want to project on both of them. Instead, I use the player body’s GetContacts method, which returns contact points (and normals!) for everything the player is currently touching. I believe those contact points are tracked by the physics engine anyway, so asking for them doesn’t require any actual physics work.
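
A sketch of that slide step, assuming a Rigidbody2D field called body and a preallocated contact buffer:

```csharp
ContactPoint2D[] contacts = new ContactPoint2D[16];

Vector2 Slide(Vector2 movement)
{
    // GetContacts fills the buffer with everything we're touching right
    // now; project the attempted movement against each ground-like normal.
    int count = body.GetContacts(contacts);
    for (int i = 0; i < count; i++)
    {
        // Only surfaces facing away from gravity, per the caveat below
        // about sloped ceilings.
        if (Vector2.Dot(contacts[i].normal, Physics2D.gravity) < 0f)
        {
            movement = Vector3.ProjectOnPlane(movement, contacts[i].normal);
        }
    }
    return movement;
}
```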

(Looking at the code I have, I notice that I still only perform the slide for surfaces facing upwards — but I’d want to slide against sloped ceilings, too. Why did I do this? Maybe I should remove that.)

(Also, I’m pretty sure projecting a vector on a plane is non-commutative, which raises the question of which order the projections should happen in and what difference it makes. I don’t have a good answer.)

(I note that my LÖVE setup does something slightly different: it just tries whatever the movement ought to be, and if there’s a collision, then it projects — and tries again with the remaining movement. But I can’t ask Unity to do multiple moves in one physics update, alas.)

Okay! Now, slopes. But actually, with the above work done, slopes are most of the way there already.

One obvious problem is that the player tries to move horizontally even when on a slope, and the easy fix is to change their movement from speed * Vector2.right to speed * new Vector2(ground.y, -ground.x) while on the ground. That’s the ground normal rotated a quarter-turn clockwise, so for flat ground it still points to the right, and in general it points rightwards along the ground. (Note that it assumes the ground normal is a unit vector, but as far as I’m aware, that’s true for all the normals Unity gives you.)

Another issue is that if the player stands motionless on a slope, gravity will cause them to slowly slide down it — because the movement from gravity will be projected onto the slope, and unlike flat ground, the result is no longer zero. For conscious actors only, I counter this by adding the opposite factor to the player’s velocity as part of adding in their walking speed. This matches how the real world works, to some extent: when you’re standing on a hill, you’re exerting some small amount of effort just to stay in place.
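
Together, those two adjustments might look something like this. It’s a sketch; the exact way the counter-force is phrased here is one plausible reading, not the project’s actual code, and velocity, walkSpeed, and groundNormal are the illustrative fields from earlier.

```csharp
void WalkOnGround(float input)
{
    // Ground normal rotated a quarter-turn clockwise: flat ground's (0, 1)
    // becomes (1, 0), plain "right"; on a slope it points along the ground.
    Vector2 alongGround = new Vector2(groundNormal.y, -groundNormal.x);
    velocity = input * walkSpeed * alongGround;

    // Slope resistance (an assumed phrasing): cancel the along-slope part
    // of this frame's gravity so a motionless player holds position.
    Vector2 gravityStep = Physics2D.gravity * Time.fixedDeltaTime;
    velocity -= (Vector2)Vector3.ProjectOnPlane(gravityStep, groundNormal);
}
```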

(Note that slope resistance is not the same as friction. Okay, yes, in the real world, virtually all resistance to movement happens as a result of friction, but bracing yourself against the ground isn’t the same as being passively resisted.)

From here there are a lot of things you can do, depending on how you think slopes should be handled. You could make the player unable to walk up slopes that are too steep. You could make walking down a slope faster than walking up it. You could make jumping go along the ground normal, rather than straight up. You could raise the player’s max allowed speed while running downhill. Whatever you want, really. Armed with a normal and awareness of dot products, you can do whatever you want.

But first you might want to fix a few aggravating side effects.

Problem 3: Ground adherence

I don’t know if there’s a better name for this. I rarely even see anyone talk about it, which surprises me; it seems like it should be a very common problem.

The problem is: if the player runs up a slope which then abruptly changes to flat ground, their momentum will carry them into the air. For very fast players going off the top of very steep slopes, this makes sense, but it becomes visible even for relatively gentle slopes. It was a mild nightmare in the original release of our game Lunar Depot 38, which has very “rough” ground made up of lots of shallow slopes — so the player is very frequently slightly off the ground, which meant they couldn’t jump, for seemingly no reason. (I even had code to fix this, but I disabled it because of a silly visual side effect that I never got around to fixing.)

Anyway! The reason this is a problem is that game protagonists are generally not boxes sliding around — they have legs. We don’t go flying off the top of real-world hilltops because we put our foot down until it touches the ground.

Simulating this footfall is surprisingly fiddly to get right, especially with someone else’s physics engine. It’s made somewhat easier by Cast, which casts the entire hitbox — no matter what shape it is — in a particular direction, as if it had moved, and tells you all the hypothetical collisions in order.

So I cast the player in the direction of gravity by some distance. If the cast hits something solid with a ground-like collision normal, then the player must be close to the ground, and I move them down to touch it (and set that ground as the new ground normal).
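
In sketch form, leaving out the guard conditions covered in the wrinkles below (grounded last frame, not deliberately moving upwards):

```csharp
RaycastHit2D[] hits = new RaycastHit2D[8];

void SnapToGround(float snapDistance)  // snapDistance: see wrinkle 4
{
    Vector2 down = Physics2D.gravity.normalized;
    // Cast the whole hitbox downwards by the snap distance.
    int count = body.Cast(down, hits, snapDistance);
    for (int i = 0; i < count; i++)
    {
        if (Vector2.Dot(hits[i].normal, Physics2D.gravity) < 0f)
        {
            // Close enough to ground-like geometry: drop onto it and
            // treat it as the new ground.
            transform.Translate(down * hits[i].distance);
            onGround = true;
            groundNormal = hits[i].normal;
            return;
        }
    }
}
```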

There are some wrinkles.

Wrinkle 1: I only want to do this if the player is off the ground now, but was on the ground last frame, and is not deliberately moving upwards. That latter condition means I want to skip this logic if the player jumps, for example, but also if the player is thrust upwards by a spring or abducted by a UFO or whatever. As long as external code goes through some interface and doesn’t mess with the player’s velocity directly, that shouldn’t be too hard to track.

Wrinkle 2: When does this logic run? It needs to happen after the player moves, which means after a Unity physics pass… but there’s no callback for that point in time. I ended up running it at the beginning of FixedUpdate and the beginning of Update — since I definitely want to do it before rendering happens! That means it’ll sometimes happen twice between physics updates. (I could carefully juggle a flag to skip the second run, but I… didn’t do that. Yet?)

Wrinkle 3: I can’t move the player with MovePosition! Remember, MovePosition schedules a movement, it doesn’t actually perform one; that means if it’s called twice before the physics pass, the first call is effectively ignored. I can’t easily combine the drop with the player’s regular movement, for various fiddly reasons. I ended up doing it “by hand” using transform.Translate, which I think was the “old way” to do manual movement before MovePosition existed. I’m not totally sure if it activates triggers? For that matter, I’m not sure it even notices collisions — but since I did a full-body Cast, there shouldn’t be any anyway.

Wrinkle 4: What, exactly, is “some distance”? I’ve yet to find a satisfying answer for this. It seems like it ought to be based on the player’s current speed and the slope of the ground they’re moving along, but every time I’ve done that math, I’ve gotten totally ludicrous answers that sometimes exceed the size of a tile. But maybe that’s not wrong? Play around, I guess, and think about when the effect should “break” and the player should go flying off the top of a hill.

Wrinkle 5: It’s possible that the player will launch off a slope, hit something, and then be adhered to the ground where they wouldn’t have hit it. I don’t much like this edge case, but I don’t see a way around it either.

This problem is surprisingly awkward for how simple it sounds, and the solution isn’t entirely satisfying. Oh, well; the results are much nicer than the solution. As an added bonus, this also fixes occasional problems with running down a hill and becoming detached from the ground due to precision issues or whathaveyou.

Problem 4: One-way platforms

Ah, what a nightmare.

It took me ages just to figure out how to define one-way platforms. Only block when the player is moving downwards? Nope. Only block when the player is above the platform? Nuh-uh.

Well, okay, yes, those approaches might work for convex players and flat platforms. But what about… sloped, one-way platforms? There’s no reason you shouldn’t be able to have those. If Super Mario World can do it, surely Unity can do it almost 30 years later.

The trick is, again, to look at the collision normal. If it faces away from gravity, the player is hitting a ground-like surface, so the platform should block them. Otherwise (or if the player overlaps the platform), it shouldn’t.

Here’s the catch: Unity doesn’t have conditional collision. I can’t decide, on the fly, whether a collision should block or not. In fact, I think that by the time I get a callback like OnCollisionEnter2D, the physics pass is already over.

I could go the other way and use triggers (which are non-blocking), but then I have the opposite problem: I can’t stop the player on the fly. I could move them back to where they hit the trigger, but I envision all kinds of problems as a result. What if they were moving fast enough to activate something on the other side of the platform? What if something else moved to where I’m trying to shove them back to in the meantime? How does this interact with ground detection and listing contacts, which would rightly ignore a trigger as non-blocking?

I beat my head against this for a while, but the inability to respond to collision conditionally was a huge roadblock. It’s all the more infuriating a problem, because Unity ships with a one-way platform modifier thing. Unfortunately, it seems to have been implemented by someone who has never played a platformer. It’s literally one-way — the player is only allowed to move straight upwards through it, not in from the sides. It also tries to block the player if they’re moving downwards while inside the platform, which invokes clumsy rejection behavior. And this all seems to be built into the physics engine itself somehow, so I can’t simply copy whatever they did.

Eventually, I settled on the following. After calculating attempted movement (including sliding), just at the end of FixedUpdate, I do a Cast along the movement vector. I’m not thrilled about having to duplicate the physics engine’s own work, but I do filter to only things on a “one-way platform” physics layer, which should at least help. For each object the cast hits, I use Physics2D.IgnoreCollision to either ignore or un-ignore the collision between the player and the platform, depending on whether the collision was ground-like or not.

(A lot of people suggested turning off collision between layers, but that can’t possibly work — the player might be standing on one platform while inside another, and anyway, this should work for all actors!)

Again, wrinkles! But fewer this time. Actually, maybe just one: handling the case where the player already overlaps the platform. I can’t just check for that with e.g. OverlapCollider, because that doesn’t distinguish between overlapping and merely touching.

I came up with a fairly simple fix: if I was going to un-ignore the collision (i.e. make the platform block), and the cast distance is reported as zero (either already touching or overlapping), I simply do nothing instead. If I’m standing on the platform, I must have already set it blocking when I was approaching it from the top anyway; if I’m overlapping it, I must have already set it non-blocking to get here in the first place.
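
Put together, the whole platform pass looks roughly like this. Here oneWayFilter is assumed to be a ContactFilter2D configured for the one-way platform layer, and myCollider is the player’s own Collider2D; both are illustrative names.

```csharp
ContactFilter2D oneWayFilter;   // assumed: set to the one-way platform layer
RaycastHit2D[] platformHits = new RaycastHit2D[8];

void UpdateOneWayPlatforms(Vector2 movement)
{
    // Cast along the attempted movement, hitting only one-way platforms.
    int count = body.Cast(movement.normalized, oneWayFilter,
                          platformHits, movement.magnitude);
    for (int i = 0; i < count; i++)
    {
        bool groundLike =
            Vector2.Dot(platformHits[i].normal, Physics2D.gravity) < 0f;

        // Already touching or overlapping a platform we'd otherwise make
        // solid: leave the pair alone rather than trap the player inside.
        if (groundLike && platformHits[i].distance == 0f)
            continue;

        // Block on ground-like hits, pass through everything else.
        Physics2D.IgnoreCollision(myCollider, platformHits[i].collider,
                                  !groundLike);
    }
}
```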

I can imagine a few cases where this might go wrong. Moving platforms, especially, are going to cause some interesting issues. But this is the best I can do with what I know, and it seems to work well enough so far.

Oh, and our player can deliberately drop down through platforms, which was easy enough to implement; I just decide the platform is always passable while some button is held down.

Problem 5: Pushers and carriers

I haven’t gotten to this yet! Oh boy, can’t wait. I implemented it in LÖVE, but my way was hilariously invasive; I’m hoping that having a physics engine that supports a handwaved “this pushes that” will help. Of course, you also have to worry about sticking to platforms, for which the recommended solution is apparently to parent the cargo to the platform, which sounds goofy to me? I guess I’ll find out when I throw myself at it later.

Overall result

I ended up with a fairly pleasant-feeling system that supports slopes and one-way platforms and whatnot, with all the same pieces as I came up with for LÖVE. The code somehow ended up as less of a mess, too, but it probably helps that I’ve been down this rabbit hole once before and kinda knew what I was aiming for this time.

Animation of a character running smoothly along the top of an irregular dinosaur skeleton

Sorry that I don’t have a big block of code for you to copy-paste into your project. I don’t think there are nearly enough narrative discussions of these fundamentals, though, so hopefully this is useful to someone. If not, well, look forward to ✨ my book, that I am writing ✨!

Make your own game with CoderDojo’s new book

Post Syndicated from Nuala McHale original https://www.raspberrypi.org/blog/coderdojo-nano/

The first official CoderDojo book, CoderDojo Nano: Build Your Own Website, was a resounding success: thousands of copies have been bought by aspiring CoderDojo Ninjas, and it’s available in ten languages, including Bulgarian, Czech, Dutch, Lithuanian, Latvian, Portuguese, Spanish, and Slovakian. Now we are delighted to announce the release of the second book in our Create with Code trilogy, titled CoderDojo Nano: Make Your Own Game.

Cover of CoderDojo Nano Make your own game

The paperback book will be available in English from Thursday 7 September (with English flexibound and Dutch versions scheduled to follow in the coming months), enabling young people and adults to learn creative and fun coding skills!

What will you learn?

The new book explains the fundamentals of the JavaScript language in a clear, logical way while supporting you to create your very own computer game.

Pixel image of laptop displaying a jump-and-run game

You will learn how to animate characters, create a world for your game, and use the physics of movement within it. The book is full of clear step-by-step instructions and illustrated screenshots to make reviewing your code easy. Additionally, challenges and open-ended prompts at the end of each section will encourage you to get creative while making your game.

This book is the perfect first step towards understanding game development, particularly for those of you who do not (yet) have a local Dojo. Regardless of where you live, using our books you too can learn to ‘Create with Code’!

Tried and tested

As always, CoderDojo Ninjas from all around the world tested our book, and their reactions have been hugely positive. Here is a selection of their thoughts:

“The book is brilliant. The [game] is simple yet innovative. I personally love it, and want to get stuck in making it right away!”

“What I really like is that, unlike most books on coding, this one properly explains what’s happening, and what each piece of code does and where it comes from.”

“I found the book most enjoyable. The layout is great, with lots of colour, and I found the information very easy to follow. The Ninja Tips are a great help in case you get a bit stuck. I liked that the book represents a mix of boy and girl Ninjas — it really makes coding fun for all.”

“The book is a great guide for both beginners and people who want to do something creative with their knowledge of code. Even people who cannot go to a CoderDojo can learn code using this book!”

Writer Jurie Horneman

Jurie Horneman, the author of CoderDojo Nano: Make Your Own Game, has been working in the game development industry for more than 15 years.

stuffed toy rabbit wearing glasses

Jurie would get on well with Babbage, I think.

He shares how he got into coding, and what he has learnt while creating this awesome book:

“I’ve been designing and programming games since 1991, starting with ancient home computers, and now I’m working with PCs and consoles. As a game designer, it’s my job to teach players the rules of the game in a fun and playful manner — that gave me some useful experience for writing the book.

I believe that, if you want to understand something properly, you have to teach it to others. Therefore, writing this book was very educational for me, as I hope reading it will be for learners.”

Asked what his favorite thing about the book is, Jurie said he loves the incredible pixel art design: “The artist (Gary J Lucken, Army of Trolls) did a great job to help explain some of the abstract concepts in the book.”

Pixel image of a landscape with an East Asian temple on a lonely mountain

Gary’s art is also just gorgeous.

How can you get your copy?

You can pre-order CoderDojo Nano: Make Your Own Game here. It is initially priced at £9.99 (around €11), and discounted copies with free international delivery are available here.

The post Make your own game with CoderDojo’s new book appeared first on Raspberry Pi.