Tag Archives: python

Code Jetpac’s rocket building action | Wireframe #40

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-jetpacs-rocket-building-action-wireframe-40/

Pick up parts of a spaceship, fuel it up, and take off in Mark Vanstone’s Python and Pygame Zero rendition of a ZX Spectrum classic

The original Jetpac, in all its 8-bit ZX Spectrum glory

For ZX Spectrum owners, there was something special about waiting for a game to load, with the sound of zeros and ones screeching from the cassette tape player next to the computer. When the loading screen – an image of an astronaut and Ultimate Play the Game’s logo – appeared, you knew the wait was going to be worthwhile. Created by brothers Chris and Tim Stamper in 1983, Jetpac was one of the first hits for their studio, Ultimate Play the Game. The game features the hapless astronaut Jetman, who must build and fuel a rocket from the parts dotted around the screen, all the while avoiding or shooting swarms of deadly aliens.

This month’s code snippet will provide the mechanics of collecting the ship parts and fuel to get Jetman’s spaceship to take off. We can use the in-built Pygame Zero Actor objects for all the screen elements and the Actor collision routines to deal with gravity and picking up items. To start, we need to initialise our Actors. We’ll need our Jetman, the ground, some platforms, the three parts of the rocket, some fire for the rocket engines, and a fuel container. The way each Actor behaves will be determined by a set of lists. We have a list for objects with gravity, objects that are drawn each frame, a list of platforms, a list of collision objects, and the list of items that can be picked up.
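As a rough sketch of that setup (the names, positions, and image files below are illustrative, not necessarily those in Mark’s code), the Actors and behaviour lists might be initialised like this:

# A rough sketch of the setup; names and positions are illustrative.
jetman = Actor("jetman", center=(400, 100))
ground = Actor("ground", topleft=(0, 550))
platform1 = Actor("platform", topleft=(150, 350))
platform2 = Actor("platform", topleft=(500, 250))
rocket_parts = [Actor("rocket" + str(n), topleft=(600, 500 - n * 50)) for n in range(3)]
fuel = Actor("fuel", topleft=(100, 500))

# Behaviour lists: membership decides how each Actor is treated every frame.
gravity_list = [jetman] + rocket_parts + [fuel]      # things that fall
platform_list = [platform1, platform2]               # things to stand on
collide_list = [ground] + platform_list              # things that stop falling
pickup_list = rocket_parts + [fuel]                  # things Jetman can carry
draw_list = collide_list + pickup_list + [jetman]    # everything drawn each frame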

Jetman jumps inside the rocket and is away. Hurrah!

Our draw() function is straightforward as it loops through the list of items in the draw list and then has a couple of conditional elements being drawn after. The update() function is where all the action happens: we check for keyboard input to move Jetman around, apply gravity to all the items on the gravity list, check for collisions with the platform list, pick up the next item if Jetman is touching it, apply any thrust to Jetman, and move any items that Jetman is holding to move with him. When that’s all done, we can check if refuelling levels have reached the point where Jetman can enter the rocket and blast off.

If you look at the helper functions checkCollisions() and checkTouching(), you’ll see that they use different methods of collision detection, the first being checking for a collision with a specified point so we can detect collisions with the top or bottom of an actor, and the touching collision is a rectangle or bounding box collision, so that if the bounding box of two Actors intersect, a collision is registered. The other helper function applyGravity() makes everything on the gravity list fall downward until the base of the Actor hits something on the collide list.
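In sketch form (the exact signatures in the downloadable code may differ), those helpers could look like this:

def checkCollisions(point, colliders):
    # Point collision: test a single point (such as the top or base of an
    # Actor) against each solid object.
    for c in colliders:
        if c.collidepoint(point):
            return c
    return None

def checkTouching(actor, items):
    # Bounding-box collision: a hit registers when two Actors' rects intersect.
    for item in items:
        if actor.colliderect(item):
            return item
    return None

def applyGravity(gravity_list, collide_list):
    # Everything on the gravity list falls until its base hits something solid.
    for actor in gravity_list:
        base = (actor.x, actor.y + actor.height / 2)
        if checkCollisions(base, collide_list) is None:
            actor.y += 2  # fall speed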

So that’s about it: assemble a rocket, fill it with fuel, and lift off. The only thing that needs adding is a load of pesky aliens and a way to zap them with a laser gun.

Here’s Mark’s Jetpac code. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 40

You can read more features like this one in Wireframe issue 40, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 40 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code Jetpac’s rocket building action | Wireframe #40 appeared first on Raspberry Pi.

Be a better Scrabble player with a Raspberry Pi High Quality Camera

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/be-a-better-scrabble-player-with-a-raspberry-pi-high-quality-camera/

One of our fave makers, Wayne from Devscover, got a bit sick of losing at Scrabble (and his girlfriend was likely raging at being stuck in lockdown with a lesser opponent). So he came up with a Raspberry Pi–powered solution!

Using a Raspberry Pi High Quality Camera and a bit of Python, you can quickly figure out the highest-scoring word your available Scrabble tiles allow you to play.

Hardware

  • Raspberry Pi 3B
  • Compatible touchscreen
  • Raspberry Pi High Quality Camera
  • Power supply for the touchscreen and Raspberry Pi
  • Scrabble board

You don’t have to use a Raspberry Pi 3B, but you do need a model that has both display and camera ports. Wayne also chose to use an official Raspberry Pi Touch Display because it can power the computer, but any screen that can talk to your Raspberry Pi should be fine.

Software

Firstly, the build takes a photo of your Scrabble tiles using raspistill.

Next, a Python script processes the image of your tiles and then relays the highest-scoring word you can play to your touchscreen.

The key bit of code here is twl, a Python script that contains every possible word you can play in Scrabble.
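As a rough illustration (not Wayne’s actual script), the word search can be sketched with twl.py’s anagram() helper, which I’m assuming yields every dictionary word spellable from a set of letters, paired with a hand-rolled letter-score table:

import twl  # the twl.py Scrabble dictionary module

# Standard Scrabble letter values.
SCORES = {**dict.fromkeys("eaionrtlsu", 1), **dict.fromkeys("dg", 2),
          **dict.fromkeys("bcmp", 3), **dict.fromkeys("fhvwy", 4),
          "k": 5, **dict.fromkeys("jx", 8), **dict.fromkeys("qz", 10)}

def word_score(word):
    return sum(SCORES[letter] for letter in word)

def best_word(rack):
    # anagram() yields every playable word from the given letters.
    return max(twl.anagram(rack.lower()), key=word_score, default=None)

print(best_word("qizefun"))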

From 4.00 minutes into his build video, Wayne walks you through what each bit of code does and how he made it work for this project, including how he installed and used the Scrabble dictionary.

Fellow Scrabble-strugglers have suggested sneaky upgrades in the comments of Wayne’s YouTube video, such as having the build relay answers to a more discreet smartwatch.

No word yet on how the setup deals with the blank Scrabble tiles; those things are like gold dust.

In case you haven’t met the Raspberry Pi High Quality Camera yet, Wayne also did this brilliant unboxing and tutorial video for our newest piece of hardware.

And for more projects from Devscover, check out this great Amazon price tracker using a Raspberry Pi Zero W, and make sure to subscribe to the channel for more content.

The post Be a better Scrabble player with a Raspberry Pi High Quality Camera appeared first on Raspberry Pi.

Learning with Raspberry Pi — robotics, a Master’s degree, and beyond

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/learning-with-raspberry-pi-robotics-a-masters-degree-and-beyond/

Meet Callum Fawcett, who shares his journey from tinkering with the first Raspberry Pi while he was at school, to a Master’s degree in computer science and a real-life job in programming. We also get to see some of the awesome projects he’s made along the way.

I first decided to get a Raspberry Pi at the age of 14. I had already started programming a little bit before and found that I really enjoyed the language Python. At the time the first Raspberry Pi came out, my History teacher told us about them and how they would be a great device to use to learn programming. I decided to ask for one to help me learn more. I didn’t really know what I would use it for or how it would even work, but after a little bit of help at the start, I quickly began making small programs in Python. I remember some of my first programs being very simple dictionary-type programs in which I would match English words to German to help with my German homework.

Learning Linux, C++, and Python

Most of my learning was done through two sources. I learnt Linux and how the terminal worked using online resources such as Stack Overflow. I would have a problem that I needed to solve, look up solutions online, and try out commands that I found. This was perhaps the hardest part of learning how to use a Raspberry Pi, as it was something I had never done before, but it really helped me in later years when I would use Linux more than Windows. For learning programming, I preferred to use books. I had a book for C++ and a book for Python that I would work through. These were game-based books, so many of the fun projects that I did were simple text-based games where you typed in responses to questions.

A family robotics project

The first robot Callum made using a Raspberry Pi

By far the coolest project I did with the Raspberry Pi was to build a small robot (shown above). This was a joint project between myself and my dad. He sorted out the electronics and I programmed the robot. It was a great opportunity to learn about robotics and refine my programming skills. By the end, the robot was capable of moving around by itself, driving into objects, and then reversing and trying a new direction. It was almost like an unintelligent Roomba that couldn’t hoover, but I spent many hours improving small bits and pieces to make it as easy to use as possible. My one wish that I never managed to achieve with my robot was allowing it to map out its surroundings. This was a very ambitious project at the time, since I was still quite inexperienced in programming. The biggest problem with this was calibrating the robot’s turning circle, which was never consistent so it was very hard to have the robot know where in the room it was.

Sense HAT maze game

Another fun project that I worked on used the Sense HAT developed for the Astro Pi computers for use on the International Space Station. Using this, I was able to make a memory maze game (shown below), in which a player is shown a maze for several seconds and then has to navigate that maze from memory by shaking the device. This was my first introduction to using more interactive types of input, and this eventually led to my final-year project, which used these interesting interactions to develop another way of teaching.

Learning programming without formal lessons

I have now just finished my Master’s degree in computer science at the University of Bristol. Before going to university, I had no experience of being taught programming in a formal environment. It was not a taught subject at my secondary school or sixth form. I wanted to get more people at my school interested in this area of study though, which I did by running a coding club for people. I would help others debug their code and discuss interesting problems with them. The reason that I chose to study computer science is largely because of my experiences with Raspberry Pi and other programming I did in my own time during my teenage years. I likely would have studied history if it weren’t for the programming I had done by myself making robots and other games.

Raspberry Pi has continued to play a part in my degree and extra-curricular activities; I used them in two large projects during my time at university and used a similar device in my final project. My robot experience also helped me to enter my university’s ‘Robot Wars’ competition which, though we never won, was a lot of fun.

A tool for learning and a device for industry

Having a Raspberry Pi is always useful during a hackathon, because it’s such a versatile component. Tech like Raspberry Pi will always be useful for beginners to learn the basics of programming and electronics, but these computers are also becoming more and more useful for people with more experience to make fun and useful projects. I could see tech like Raspberry Pi being used in the future to help quickly prototype many types of electronic devices and, as they become more powerful, even being used as an affordable way of controlling many types of robots, which will become more common in the future.

Our guest blogger Callum

Now I am going on to work on programming robot control systems at Ocado Technology. My experiences of robot building during my years before university played a large part in this decision. Already, robots are becoming a huge part of society, and I think they are only going to become more prominent in the future. Automation through robots and artificial intelligence will become one of the most important tools for humanity during the 21st century, and I look forward to being a part of that process. If it weren’t for learning through Raspberry Pi, I certainly wouldn’t be in this position.

Cheers for your story, Callum! Has tinkering with our tiny computer inspired your educational or professional choices? Let us know in the comments below. 

The post Learning with Raspberry Pi — robotics, a Master’s degree, and beyond appeared first on Raspberry Pi.

Code Gauntlet’s four-player co-op mode | Wireframe #39

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-gauntlets-four-player-co-op-mode-wireframe-39/

Four players dungeon crawling at once? Mark Vanstone shows you how to recreate Gauntlet’s co-op mode in Python and Pygame Zero.

Players collected items while battling their way through dungeons. Shooting food was a definite faux pas.

Atari’s Gauntlet was an eye-catching game, not least because it allowed four people to explore its dungeons together. Each player could choose one of four characters, each with its own abilities – there was a warrior, a Valkyrie, a wizard, and an elf – and surviving each dungeon required slaughtering enemies and the constant gathering of food, potions, and keys that unlocked doors and exits.

Designed by Ed Logg, and loosely based on the tabletop RPG Dungeons & Dragons, as well as John Palevich’s 1983 dungeon crawler, Dandy, Gauntlet was a big success. It was ported to most of the popular home systems at the time, and Atari released a sequel arcade machine, Gauntlet II, in 1986.

Atari’s original arcade machine featured four joysticks, but our example will mix keyboard controls and gamepad inputs. Before we deal with the movement, we’ll need some characters and dungeon graphics. For this example, we can make our dungeon from a large bitmap image and use a collision map to prevent our characters from clipping through walls. We’ll also need graphics for the characters moving in eight different directions. Each direction has three frames of walking animation, which makes a total of 24 frames per character. We can use a Pygame Zero Actor object for each character and add a few extra properties to keep track of direction and the current animation frame. If we put the character Actors in a list, we can loop through the list to check for collisions, move the player, or draw them to the screen.

We now test input devices for movement controls using the built-in Pygame keyboard object to test if keys are pressed. For example, keyboard.left will return True if the left arrow key is being held down. We can use the arrow keys for one player and the WASD keys for the other keyboard player. If we register x and y movements separately, then if two keys are pressed – for example, up and left – we can read that as a diagonal movement. In this way, we can get all eight directions of movement from just four keys.
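A sketch of that check for the arrow-key player (the WASD player works the same way with different keys):

def get_keyboard_movement():
    # Reading x and y separately means two held keys give a diagonal.
    dx = dy = 0
    if keyboard.left:
        dx -= 1
    if keyboard.right:
        dx += 1
    if keyboard.up:
        dy -= 1
    if keyboard.down:
        dy += 1
    return dx, dy  # eight directions (plus standing still) from four keys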

For joystick or gamepad movement, we need to import the joystick module from Pygame. This provides us with methods to count the number of joystick or gamepad devices attached to the computer, and then initialise them for input. When we check for input from these devices, we just need to get the x-axis and y-axis values and turn each into an integer. Joysticks and gamepads should return a number between -1 and 1 on each axis, so if we round that number, we get the movement value we need.
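In outline (a sketch rather than the exact code from the magazine), that looks like this:

import pygame

pygame.joystick.init()
sticks = [pygame.joystick.Joystick(i) for i in range(pygame.joystick.get_count())]
for stick in sticks:
    stick.init()

def get_joystick_movement(stick):
    # Axes report floats between -1 and 1; rounding gives -1, 0, or 1.
    return round(stick.get_axis(0)), round(stick.get_axis(1))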

We can work out the direction (and the image we need to use) of the character with a small lookup table of x and y values, something like the sketch below, and translate that to a frame number, cycling through those three frames of animation as the character walks. Then all we need to do before we move the character is check they aren’t going to collide with a wall or another character. And that’s it – we now have a four-player control system. As for adding enemy spawners, loot, and keys – well, that’s a subject for another time.
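Here is one way that lookup table might look; the sprite naming scheme is an assumption of this sketch:

# Map an (x, y) movement to a direction index (0-7) in the sprite sheet.
DIRECTIONS = {(0, -1): 0, (1, -1): 1, (1, 0): 2, (1, 1): 3,
              (0, 1): 4, (-1, 1): 5, (-1, 0): 6, (-1, -1): 7}

def animate(player, dx, dy):
    if (dx, dy) in DIRECTIONS:
        player.direction = DIRECTIONS[(dx, dy)]
        player.frame = (player.frame + 1) % 3  # cycle the three walk frames
        player.image = "player%d_%d" % (player.direction, player.frame)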

Here’s Mark’s code snippet. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, go here.

Get your copy of Wireframe issue 39

You can read more features like this one in Wireframe issue 39, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 39 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code Gauntlet’s four-player co-op mode | Wireframe #39 appeared first on Raspberry Pi.

Code Robotron: 2084’s twin-stick action | Wireframe #38

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-robotron-2084s-twin-stick-action-wireframe-38/

News flash! Before we get into our Robotron: 2084 code, we have some important news to share about Wireframe: as of issue 39, the magazine will be going monthly.

The new 116-page issue will be packed with more in-depth features, more previews and reviews, and more of the guides to game development that make the magazine what it is. The change means we’ll be able to bring you new subscription offers, and generally make the magazine more sustainable in a challenging global climate.

As for existing subscribers, we’ll be emailing you all to let you know how your subscription is changing, and we’ll have some special free issues on offer as a thank you for your support.

The first monthly issue will be out on 4 June, and subsequent editions will be published on the first Thursday of every month after that. You’ll be able to order a copy online, or you’ll find it in selected supermarkets and newsagents if you’re out shopping for essentials.

We now return you to our usual programming…

Move in one direction and fire in another with this Python and Pygame re-creation of an arcade classic. Raspberry Pi’s own Mac Bowley has the code.

Robotron: 2084 is often listed on ‘best game of all time’ lists, and has been remade and re-released for numerous systems over the years.

Robotron: 2084

Released back in 1982, Robotron: 2084 popularised the concept of the twin-stick shooter. It gave players two joysticks which allowed them to move in one direction while also shooting at enemies in another. Here, I’ll show you how to recreate those controls using Python and Pygame. We don’t have access to any sticks, only a keyboard, so we’ll be using the arrow keys for movement and WASD to control the direction of fire.

The movement controls use a global variable, a few if statements, and two built-in Pygame Zero functions: on_key_down and on_key_up. The on_key_down function is called when a key on the keyboard is pressed, so when the player presses the right arrow key, for example, I add 1 to the player’s x direction rather than setting it outright. The on_key_up function is called when a key is released. A key being released means the player doesn’t want to travel in that direction any more, so we do the opposite of what we did earlier – we take away the 1 or -1 we applied in the on_key_down function.

We repeat this process for each arrow key. Moving the player in the update() function is the last part of my movement; I apply a move speed and then use a playArea rect to clamp the player’s position.
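A minimal sketch of that scheme (player is a Pygame Zero Actor and playArea a Rect, both assumed to exist elsewhere):

MOVE_SPEED = 4
move_x = move_y = 0  # current movement direction

def on_key_down(key):
    global move_x, move_y
    if key == keys.RIGHT:
        move_x += 1
    elif key == keys.LEFT:
        move_x -= 1
    elif key == keys.DOWN:
        move_y += 1
    elif key == keys.UP:
        move_y -= 1

def on_key_up(key):
    # Undo the change made when the key went down.
    global move_x, move_y
    if key == keys.RIGHT:
        move_x -= 1
    elif key == keys.LEFT:
        move_x += 1
    elif key == keys.DOWN:
        move_y -= 1
    elif key == keys.UP:
        move_y += 1

def update():
    # Apply the move speed, then clamp to the play area.
    player.x = max(playArea.left, min(playArea.right, player.x + move_x * MOVE_SPEED))
    player.y = max(playArea.top, min(playArea.bottom, player.y + move_y * MOVE_SPEED))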

The arena background and tank sprites were created in Piskel. Separate sprites for the tank allow the turret to rotate separately from the tracks.

Turn and fire

Now for the aiming and rotating. When my player aims, I want them to set the direction the bullets will fire, which functions like the movement. The difference this time is that when a player hits an aiming key, I set the direction directly rather than adjusting the values. If my player aims up, and then releases that key, the shooting will stop. Our next challenge is changing this direction into a rotation for the turret.

Actors in Pygame Zero can be rotated in degrees, so I have to find a way of turning a pair of x and y directions into a rotation. To do this, I use the math module’s atan2 function to find the arc tangent of two points. The function returns a result in radians, so it needs to be converted. (You’ll also notice I had to adjust mine by 90 degrees. If you want to avoid having to do this, create a sprite that faces right by default.)
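A sketch of that conversion; the exact sign flips depend on your sprite’s default facing and on Pygame’s y-axis pointing down:

import math

def aim_to_angle(aim_x, aim_y):
    # atan2 works in radians, so convert the result to degrees.
    angle = math.degrees(math.atan2(-aim_y, aim_x))
    return angle - 90  # adjust for a turret sprite drawn facing up

Setting turret.angle to the returned value then points the sprite along the aim direction.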

To fire bullets, I’m using a flag called ‘shooting’ which, when set to True, causes my turret to turn and fire. My bullets are dictionaries; I could have used a class, but the only thing I need to keep track of is an actor and the bullet’s direction.

Here’s Mac’s code snippet, which creates a simple twin-stick shooting mechanic in Python. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, go here.

You can look at the update function and see how I’ve implemented a fire rate for the turret as well. You can edit the update function to take a single parameter, dt, which stores the time since the last frame. By adding these up, you can trigger a bullet at precise intervals and then reset the timer.
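As a sketch of that timing pattern (shooting and fire_bullet are stand-ins for the real flag and function):

FIRE_RATE = 0.25  # seconds between bullets
shot_timer = 0

def update(dt):
    global shot_timer
    shot_timer += dt  # accumulate the time since the last frame
    if shooting and shot_timer >= FIRE_RATE:
        fire_bullet()
        shot_timer = 0  # reset and wait for the next interval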

This code is just a start – you could add enemies and maybe other player weapons to make a complete shooting experience.

Get your copy of Wireframe issue 38

You can read more features like this one in Wireframe issue 38, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 38 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code Robotron: 2084’s twin-stick action | Wireframe #38 appeared first on Raspberry Pi.

Learn at home: a guide for parents #2

Post Syndicated from Katie Gouskos original https://www.raspberrypi.org/blog/digital-making-at-home-parents-guide-python/

With millions of schools still in lockdown, parents have been telling us that they need help to support their children with learning computing at home. As well as providing loads of great content for young people, we’ve been working on support tutorials specifically for parents who want to understand and learn about the programmes used in schools and our resources.

If you don’t know your Scratch from your Trinket and your Python, we’ve got you!

Glen, Web Developer at the Raspberry Pi Foundation, and Maddie, aged 8


What are Python and Trinket all about?

In our last blog post for parents, we talked to you about Scratch, the programming language used in most primary schools. This time Mark, Youth Programmes Manager at the Raspberry Pi Foundation, takes you through how to use Trinket. Trinket is a free online platform that lets you write and run your code in any web browser. This is super useful because it means you don’t have to install any new software.

A parents’ introduction to Trinket


Trinket also lets you create public web pages and projects that can be viewed by anyone with the link to them. That means your child can easily share their coding creation with others, and for you that’s a good opportunity to talk to them about staying safe online and not sharing any personal information.

Lincoln, aged 10

Getting to know Python

We’ve also got an introduction to Python for you, from Mac, a Learning Manager on our team. He’ll guide you through what to expect from Python, which is a widely used text-based programming language. For many learners, Python is their first text-based language, because it’s very readable, and you can get things done with fewer lines of code than in many other programming languages. In addition, Python has support for ‘Turtle’ graphics and other features that make coding more fun and colourful for learners. Turtle is simply a Python feature that works like a drawing board, letting you control a turtle to draw anything you like using code.
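For example, a few lines of Turtle code are enough to draw a colourful star; this uses only Python’s standard turtle module, so it runs in Trinket too:

import turtle

t = turtle.Turtle()
t.pencolor("purple")
for i in range(36):
    t.forward(100)  # draw a line 100 units long
    t.left(170)     # turning 170 degrees each time traces out a star
turtle.done()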

A parents’ introduction to Python


Why not try out Mac’s suggestions of Hello world, Countdown timer, and Outfit recommender for yourself?

Python is used in lots of real-world software applications in industries such as aerospace, retail banking, insurance and healthcare, so it’s very useful for your children to learn it!

Parent diary: juggling homeschooling and work

Olympia is Head of Youth Programmes at the Raspberry Pi Foundation and also a mum to two girls aged 9 and 11. She is currently homeschooling them as well as working (and hopefully having the odd evening to herself!). Olympia shares her own experience of learning during lockdown and how her family are adapting to their new routine.

Parent diary: Juggling homeschooling and work


Digital Making at Home

To keep young people entertained and learning, we launched our Digital Making at Home series, which is free and accessible to everyone. New code-along videos are released every Monday, with different themes and projects for all levels of experience.

Code along live with the team on Wednesday 6 May at 14:00 BST / 9:00 EDT for a special session of Digital Making at Home

Sarah and Ozzy, aged 13

We want your feedback

We’ve been asking parents what they’d like to see as part of our initiative to support young people and parents. We’ve had some great suggestions so far! If you’d like to share your thoughts, you can email us at [email protected].

Sign up for our bi-weekly emails, tailored to your needs

Sign up now to start receiving free activities suitable to your child’s age and experience level, straight to your inbox. And let us know what you as a parent or guardian need help with, and what you’d like more or less of from us. 

PS: All of our resources are completely free. This is made possible thanks to the generous donations of individuals and organisations. Learn how you can help too!


The post Learn at home: a guide for parents #2 appeared first on Raspberry Pi.

Build a serverless Martian weather display with CircuitPython and AWS Lambda

Post Syndicated from Moheeb Zara original https://aws.amazon.com/blogs/compute/build-a-serverless-martian-weather-display-with-circuitpython-and-aws-lambda/

Build a standalone digital weather display of Mars showing the latest images from the Mars Curiosity Rover.

This project uses an Adafruit PyPortal, an open-source IoT touch display. Traditionally, a microcontroller is programmed with firmware compiled using various specific toolchains. Fortunately, the PyPortal is programmed using CircuitPython, a lightweight version of Python that works on embedded hardware. You just copy your code to the PyPortal like you would to a thumb drive and it runs.

I deploy the backend, the part in the cloud that does all the heavy lifting, using the AWS Serverless Application Repository (SAR). The code on the PyPortal makes a REST call to the backend to handle the requests to the NASA Mars Rover Photos API and InSight: Mars Weather Service API. It then converts and resizes the image before returning the information to the PyPortal for display.

An Adafruit PyPortal displaying the latest images from the Mars Curiosity Rover and weather data from InSight Mars Lander.


Prerequisites

You need the following to complete the project:

Deploy the backend application

An architecture diagram of the serverless backend.


Using a serverless backend reduces the load on the PyPortal. The PyPortal makes a call to the backend API and receives a small JSON object with the relevant data. This allows you to change the logic of where and how to get the image and weather data without needing physical access to the device.

The backend API consists of an AWS Lambda function, written in Python, behind an Amazon API Gateway endpoint. When invoked, the FetchMarsData function makes requests to two separate NASA APIs. First it fetches the latest images from the Mars Curiosity Rover, typically from the previous day, and picks one at random. It resizes and converts the image to bitmap format before uploading to Amazon S3 with public read permissions. The PyPortal downloads the image from S3 later.

The function then calls the InSight: Mars Weather Service API. It retrieves the average air temperature, wind speed, pressure, season, solar day (sol), as well as the first and last timestamp of daily sampling. The API returns these values and the S3 image URL as a JSON object.
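To illustrate the image step, the resize-and-upload helper inside the Lambda function could be sketched like this; it’s a guess at the internals rather than the repository’s exact code, and the bucket name is a placeholder:

import io

import boto3
from PIL import Image

s3 = boto3.client("s3")

def resize_image(img_data, bucket="my-mars-images", key="latest.bmp"):
    # Shrink the photo to the PyPortal's 320x240 screen and convert to bitmap.
    image = Image.open(io.BytesIO(img_data))
    image.thumbnail((320, 240))
    buffer = io.BytesIO()
    image.convert("RGB").save(buffer, format="BMP")
    buffer.seek(0)
    s3.put_object(Bucket=bucket, Key=key, Body=buffer,
                  ACL="public-read", ContentType="image/bmp")
    return "https://%s.s3.amazonaws.com/%s" % (bucket, key)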

I use the AWS Serverless Application Model (SAM) to create the backend. While it can be deployed using the AWS SAM CLI, you can also deploy from the AWS Management Console:

  1. Generate a free NASA API key at api.nasa.gov. This is required to gain access to the NASA data APIs.
  2. Navigate to the aws-serverless-pyportal-mars-weather-display application in the Serverless Application Repository.
  3. Choose Deploy.
  4. On the next page, under Application Settings, enter the NasaApiKey parameter.

  5. Once complete, choose View CloudFormation Stack.

  6. Select the Outputs tab and make a note of the MarsApiUrl. This is required for configuring the PyPortal.

  7. Navigate to the MarsApiKey URL listed in the Outputs tab.

  8. Click Show to reveal the API key. Make a note of this. This is required for authenticating requests from the PyPortal to the MarsApiUrl.

PyPortal setup

  1. Follow these instructions from Adafruit to install the latest version of the CircuitPython bootloader. At the time of writing, the latest version is 5.2.0.
  2. Follow these instructions to install the latest Adafruit CircuitPython library bundle. I use bundle version 5.x.
  3. Insert the microSD card in the slot located on the back of the device.
  4. Optionally install the Mu Editor, a multi-platform code editor and serial debugger compatible with Adafruit CircuitPython boards. This can help if you need to troubleshoot issues.
  5. Optionally if you have a 3D printer at home, you can print a case for your PyPortal. This can protect your project while also being a great way to display it on a desk.

Code PyPortal

As with regular Python, CircuitPython does not need to be compiled to execute. Flashing new firmware on the PyPortal is as simple as copying a Python file and necessary assets over to a mounted volume. The bootloader runs code.py anytime the device starts or any files are updated.

  1. Use a USB cable to plug the PyPortal into your computer and wait until a new mounted volume CIRCUITPY is available.
  2. Download the project from GitHub. Inside the project, copy the contents of /circuit-python on to the CIRCUITPY volume.
  3. Inside the volume, open and edit the secrets.py file. Include your Wi-Fi credentials along with the MarsApiKey and MarsApiUrl API Gateway endpoint, which can be found under Outputs in the AWS CloudFormation stack created by the Serverless Application Repository.
  4. Save the file, and the device restarts. It takes a moment to connect to Wi-Fi and make the first request.
    Optionally, if you installed the Mu Editor, you can click on “Serial” to follow along with the device log.

An animated GIF of the PyPortal device displaying a Mars rover image and Mars weather data.

Understanding how CircuitPython calls API Gateway

The main CircuitPython file is code.py. At the end of the file, the while loop periodically performs the operations necessary to display the photos from the Curiosity Rover and the InSight Mars lander weather data.

while True:
    data = callAPIEndpoint(secrets['mars_api_url'])
    downloadImage(data['image_url'])
    showDisplay(data['insight'], displayTime=60*interval_minutes)

First, it calls the API Gateway endpoint using the URL from the secrets.py file, and passes the returned JSON to helper functions. The callAPIEndpoint(url) function passes the MarsApiKey in the header and a timeout of 30 seconds to the wifi.get() method. The timeout is required for integrations with services like Lambda and API Gateway. Remember, the CircuitPython code is running on a microcontroller and sometimes must wait longer when making requests.

def callAPIEndpoint(mars_api_url):
    headers = {"x-api-key": secrets['mars_api_key']}
    response = wifi.get(mars_api_url, headers=headers, timeout=30)
    data = response.json()
    print("JSON Response: ", data)
    response.close()
    return data

The JSON object that is received by the PyPortal is defined in the handler of the Lambda function. In the GitHub project downloaded earlier, see src/app.py.

def lambda_handler(event, context):
    url = fetchRoverImage()
    imgData = fetchImageData(url)
    image_s3_url = resize_image(imgData)
    weatherData = getMarsInsightWeather()

    return {
        "statusCode": 200,
        "body": json.dumps({
            "image_url": image_s3_url,
            "insight": weatherData
        })
    }

Similar to the CircuitPython code, this uses helper functions to perform all the various operations needed to retrieve and craft the data. At completion, the returned JSON is passed as the response to the PyPortal.

A quick way to add a new property is to edit the Lambda function directly through the AWS Lambda Console. Here, a key “hello” is added with a value “world”:

In the CircuitPython code.py file, the key is now available in the JSON response from API Gateway. The following prints the key value, which can be seen using the Mu Editor Serial debugger.

data = callAPIEndpoint(secrets['mars_api_url'])
print(data['hello'])

The Lambda function is packaged with the AWS Python SDK, boto3, which provides methods for interacting with a variety of AWS services. The Python Requests library is also included to make calls to the NASA APIs. Try exploring how to incorporate other services or APIs into your project. To understand how to modify the visual display on the PyPortal itself, see the displayio guide from Adafruit.

Conclusion

I show how to build a “live” Martian weather display using an Adafruit PyPortal, CircuitPython, and AWS Serverless technologies. Whether this is your first time using hardware or a serverless backend in the AWS Cloud, this project is simplified by the use of CircuitPython and the Serverless Application Model.

I also show how to make a request to API Gateway from the PyPortal. I then craft a response in Lambda for the PyPortal. Since both use variants of the Python programming language, much of the syntax stays the same.

To learn more, explore other devices supported by CircuitPython and the variety of community contributed libraries. Combined with the breadth of AWS services, you can push the boundaries of creativity.

Code a homage to Lunar Lander | Wireframe #37

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-a-homage-to-lunar-lander-wireframe-37/

Shoot for the moon in our Python version of the Atari hit, Lunar Lander. Mark Vanstone has the code.

Atari’s cabinet featured a thrust control, two buttons for rotating, and an abort button in case it all went horribly wrong.

Lunar Lander

First released in 1979 by Atari, Lunar Lander was based on a concept created a decade earlier. The original 1969 game (actually called Lunar) was a text-based affair that involved controlling a landing module’s thrust to guide it safely down to the lunar surface; a later program, Moonlander, turned the same idea into a more visual experience on the DEC GT40 graphics terminal.

Given that it appeared at the height of the late-seventies arcade boom, though, it was Atari’s coin-op that became the most recognisable version of Lunar Lander, arriving just after the tenth anniversary of the Apollo 11 moon landing. Again, the aim of the game was to use rotation and thrust controls to guide your craft, and gently set it down on a suitably flat platform. The game required efficient control of the lander, and extra points were awarded for parking successfully on more challenging areas of the landscape.

The arcade cabinet was originally going to feature a normal joystick, but this was changed to a double-stalked up-down lever providing variable levels of thrust. The player had to land the craft against the clock, with a finite amount of fuel, using the Altitude, Horizontal Speed, and Vertical Speed readouts at the top of the screen as a guide. Four levels of difficulty were built into the game, with adjustments to landing controls and landing areas.

Our homage to the classic Lunar Lander. Can you land without causing millions of dollars’ worth of damage?

Making the game

To write a game like Lunar Lander with Pygame Zero, we can replace the vector graphics with a nice pre-drawn static background and use that as a collision detection mechanism and altitude meter. If our background is just black where the Lander can fly and a different colour anywhere the landscape is, then we can test pixels using the Pygame function image.get_at() to see if the lander has landed. We can also test a line of pixels from the Lander down the Y-axis until we hit the landscape, which will give us the lander’s altitude.

The rotation controls of the lander are quite simple, as we can capture the left and right arrow keys and increase or decrease the rotation of the lander; however, when thrust is applied (by pressing the up arrow) things get a little more complicated. We need to remember which direction the thrust came from so that the craft will continue to move in that direction even if it is rotated, so we have a direction property attached to our lander object. A little gravity is applied to the position of the lander, and then we just need a little bit of trigonometry to work out the movement of the lander based on its speed and direction of travel.
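In sketch form (the property names and constants here are illustrative, not Mark’s exact code), that movement update might read:

import math

GRAVITY = 0.02
THRUST = 0.05

def update_lander(lander):
    if keyboard.up:
        # Thrust pulls the remembered direction towards the current angle.
        lander.direction = lander.angle
        lander.speed += THRUST
    lander.fall += GRAVITY  # gravity always pulls straight down
    rad = math.radians(lander.direction)
    lander.x += math.sin(rad) * lander.speed
    lander.y += -math.cos(rad) * lander.speed + lander.fall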

To judge if the lander has been landed safely or rammed into the lunar surface, we look at the downward speed and angle of the craft as it reaches an altitude of 1. If the speed is sufficiently slow and the angle is near vertical, then we trigger the landed message, and the game ends. If the lander reaches zero altitude without these conditions met, then we register a crash. Other elements that can be added to this sample are things like a limited fuel gauge and variable difficulty levels. You might even try adding the sounds of the rocket booster noise featured on the original arcade game.
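The touchdown test itself only needs those two values; the thresholds below are guesses you would tune by playing:

def check_landing(lander, altitude):
    if altitude <= 1:
        if lander.fall < 0.5 and abs(lander.angle) < 10:
            return "landed"   # slow and near vertical: safe touchdown
        return "crashed"      # too fast or too tilted
    return "flying"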

Engage

The direction of thrust could be done in several ways. In this case, we’ve kept it simple, with one directional value which gradually moves in a new direction when an alternative thrust is applied. You may want to try making an X- and Y-axis direction calculation for thrust so that values are a combination of the two dimensions. You could also add joystick control to provide variable thrust input.

Here’s Mark’s code snippet, which recreates Lunar Lander’s landing mechanics in Python. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, go here.

Get your copy of Wireframe issue 37

You can read more features like this one in Wireframe issue 37, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 37 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code a homage to Lunar Lander | Wireframe #37 appeared first on Raspberry Pi.

Track your cat’s activity with a homemade speedometer

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/track-your-cats-activity-with-a-homemade-speedometer/

Firstly, hamster wheels for cats are (still) a thing. Secondly, Bengal cats run far. And Shawn Nunley on reddit is the latest to hit on this solution for kitty exercise and bonus cat stats.

Here is the wheel itself. That part was shop-bought. (Apparently it’s a ZiggyDoo Ferris Cat Wheel.)

Smol kitty in big wheel

Shawn has created a speedometer that tracks distance and speed. Every time a magnet mounted on the wheel passes a fixed sensor, a Raspberry Pi Zero writes to a log file so he can see how far and fast his felines have travelled. The wheel has six sensors, which each record 2.095 ft of travel. This project revealed the cats do about 4-6 miles per night on their wheel, and they reach speeds of 14 miles an hour.

Here’s your shopping list:

  • Raspberry Pi
  • Reed switch (Shawn got these)
  • Jumper wires
  • Ferris cat wheel

The tiny white box sticking out at the base of the wheel is the sensor

Shawn soldered a 40-pin header to his Raspberry Pi Zero and used jumper wires to connect to the sensor. He mounted the sensor to the cat wheel using hot glue and a pill box cut in half, which provided the perfect offset so it could accurately detect the magnets passing by. The code is written in Python.
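In outline, the logging code might look like this sketch, which assumes a reed switch on GPIO 17 and a log path of Shawn’s choosing (both guesses), and uses the gpiozero library:

from datetime import datetime
from signal import pause

from gpiozero import Button

FEET_PER_PULSE = 2.095  # each sensor pass records this much travel
sensor = Button(17)     # reed switch wired between GPIO 17 and ground
last_pulse = None

def log_pulse():
    global last_pulse
    now = datetime.now()
    speed_mph = 0.0
    if last_pulse is not None:
        seconds = (now - last_pulse).total_seconds()
        # Convert feet per second into miles per hour.
        speed_mph = (FEET_PER_PULSE / seconds) * 3600 / 5280
    last_pulse = now
    with open("/home/pi/wheel.log", "a") as log:
        log.write("%s,%.3f,%.2f\n" % (now.isoformat(), FEET_PER_PULSE, speed_mph))

sensor.when_pressed = log_pulse
pause()  # keep the script running and wait for wheel pulses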

Upcoming improvements include adding RFID so the wheel can distinguish between the cats in this two-kitty household.

Shawn also plans to calculate how much energy the Bengals are expending, and he’ll soon be connecting the Raspberry Pi to their Google Cloud Platform account so you can all keep up with the cats’ stats.

The stats are currently available only locally

And, get this – this was Shawn’s first ever time doing anything with Raspberry Pi or Python. OK, so as an ex-programmer he had a bit of a head start, but he assures us he hasn’t touched the stuff since the 1990s. He explains: “I was totally shocked at how easy it was once I figured out how to get the Raspberry Pi to read a sensor.” Start to finish, the project took him just one week.

The post Track your cat’s activity with a homemade speedometer appeared first on Raspberry Pi.

Building well-architected serverless applications: Understanding application health – part 2

Post Syndicated from Julian Wood original https://aws.amazon.com/blogs/compute/building-well-architected-serverless-applications-understanding-application-health-part-2/

This series of blog posts uses the AWS Well-Architected Tool with the Serverless Lens to help customers build and operate applications using best practices. In each post, I address the nine serverless-specific questions identified by the Serverless Lens along with the recommended best practices. See the Introduction post for a table of contents and an explanation of the example application.

Question OPS1: How do you evaluate your serverless application’s health?

This post continues part 1 of this Operational Excellence question. Previously, I covered using Amazon CloudWatch out-of-the-box standard metrics and alerts, and structured and centralized logging with the Embedded Metrics Format.

Best practice: Use application, business, and operations metrics

Identifying key performance indicators (KPIs), including business, customer, and operations outcomes, in addition to application metrics, helps to show a higher-level view of how the application and business are performing.

Business KPIs measure your application performance against business goals. For example, if fewer flight reservations are flowing through the system, the business would like to know.

Customer experience KPIs highlight the overall effectiveness of how customers use the application over time. Examples are perceived latency, time to find a booking, make a payment, etc.

Operational metrics help to see how operationally stable the application is over time. Examples are continuous integration/delivery/deployment feedback time, mean-time-between-failure/recovery, number of on-call pages and time to resolution, etc.

Custom Metrics

The Embedded Metrics Format can also emit custom metrics to help you understand how your workload’s health affects the business.

The airline booking service uses AWS Step Functions, AWS Lambda, Amazon SNS, and Amazon DynamoDB.

In the confirm booking module function handler, I add a new namespace and dimension to associate this set of logs with this application and service.

metrics.set_namespace("ServerlessAirlineEMF")
metrics.put_dimensions({"service":"confirm_booking"})

Within the try block, I emit a metric for a successful booking:

metrics.put_metric("BookingSuccessful", 1, "Count")

And within the except block, I emit a metric for a failed booking:

metrics.put_metric("BookingFailed", 1, "Count")
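Pieced together, the pattern looks roughly like this sketch, using the aws-embedded-metrics library’s metric_scope decorator; the booking logic itself is elided:

from aws_embedded_metrics import metric_scope

@metric_scope
def lambda_handler(event, context, metrics):
    metrics.set_namespace("ServerlessAirlineEMF")
    metrics.put_dimensions({"service": "confirm_booking"})
    try:
        pass  # the real booking confirmation logic goes here
        metrics.put_metric("BookingSuccessful", 1, "Count")
    except Exception:
        metrics.put_metric("BookingFailed", 1, "Count")
        raise
    return {"statusCode": 200}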

Once I make a booking, within the CloudWatch console, I navigate to Logs | Log groups and select the Airline-ConfirmBooking-develop group. I select a log stream and find the custom metric as part of the structured log entry.


Custom metric structured log entry

I can also create a custom metric graph. Within the CloudWatch console, I navigate to Metrics. I see the ServerlessAirlineEMF Custom Namespace is available.


Custom metric namespace

I select the Namespace and the available metric.


Available metric

I select a Line graph, add a name, and see the successful booking now plotted on the graph.


Plotted CloudWatch metric

I can also visualize and analyze this metric using CloudWatch Insights.

Once a booking is made, within the CloudWatch console, I navigate to Logs | Insights. I select the /aws/lambda/Airline-ConfirmBooking-develop log group. I choose Run query, which shows a list of discovered fields on the right of the console.

I can search for discovered booking related fields.

I then enter the following in the query pane to search the logs and plot the sum of BookingReference, and choose Run query:

fields @timestamp, @message
| stats sum(@BookingReference)


CloudWatch Insights query

There are a number of other component metrics that are important to measure. Track interactions between upstream and downstream components such as message queue length, integration latency, and throttling.

Improvement plan summary:

  1. Identify user journeys and metrics from each customer transaction.
  2. Create custom metrics asynchronously instead of synchronously for improved performance, cost, and reliability outcomes.
  3. Emit business metrics from within your workload to measure application performance against business goals.
  4. Create and analyze component metrics to measure interactions with upstream and downstream components.
  5. Create and analyze operational metrics to assess the health of your continuous delivery pipeline and operational processes.

Good practice: Use distributed tracing and code is instrumented with additional context

Logging provides information on the individual point in time events the application generates. Tracing provides a wider continuous view of an application. Tracing helps to follow a user journey or transaction through the application.

AWS X-Ray is an example of a distributed tracing solution; there are a number of third-party options as well. X-Ray collects data about the requests that your application serves and also builds a service graph, which shows all the components of an application. This provides visibility to understand how upstream/downstream services may affect workload health.

For the most comprehensive view, enable X-Ray across as many services as possible and include X-Ray tracing instrumentation in code. This is the list of AWS Services integrated with X-Ray.

X-Ray receives data from services as segments. Segments for a common request are then grouped into traces. Segments can be further broken down into more granular subsegments. Custom data key-value pairs are added to segments and subsegments with annotations and metadata. Traces can also be grouped, which helps with filter expressions.

AWS Lambda instruments incoming requests for all supported languages. Lambda application code can be further instrumented to emit information about its status, correlation identifiers, and business outcomes to determine transaction flows across your workload.

X-Ray tracing for a Lambda function is enabled in the Lambda Console. Select a Lambda function. I select the Airline-ReserveBooking-develop function. In the Configuration pane, I select to enable X-Ray Active tracing.


X-Ray tracing enabled

X-Ray can also be enabled via CloudFormation with the following code:

TracingConfig:
  Mode: Active

Lambda IAM permissions to write to X-Ray are added automatically when active tracing is enabled via the console. When using CloudFormation, allow the actions xray:PutTraceSegments and xray:PutTelemetryRecords.
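In a SAM template, that could be granted with an inline policy statement along these lines (a sketch, not the airline application’s actual template):

Policies:
  - Statement:
      - Effect: Allow
        Action:
          - xray:PutTraceSegments
          - xray:PutTelemetryRecords
        Resource: "*"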

It is important to understand which invocations X-Ray actually traces. X-Ray applies a sampling algorithm. If an upstream service, such as API Gateway with X-Ray tracing enabled, has already sampled a request, the Lambda function request is also sampled. Without an upstream request, X-Ray traces data for the first Lambda invocation each second, and then 5% of additional invocations.

For the airline application, X-Ray tracing is initiated within the shared library with the code:

from aws_xray_sdk.core import models, patch_all, xray_recorder

Segments, subsegments, annotations, and metadata are added to functions with the following example code:

segment = xray_recorder.begin_segment('segment_name')
# Start a subsegment
subsegment = xray_recorder.begin_subsegment('subsegment_name')
# Add metadata and annotations
segment.put_metadata('key', dict, 'namespace')
subsegment.put_annotation('key', 'value')
# Close the subsegment and segment
xray_recorder.end_subsegment()
xray_recorder.end_segment()

For example, within the collect payment module, an annotation is added for a successful payment with:

tracer.put_annotation("PaymentStatus", "SUCCESS")

CloudWatch ServiceLens

Once a booking is made and payment is successful, the tracing is available in the X-Ray console.

I explore how Amazon CloudWatch ServiceLens connects metrics, logs, and X-Ray traces. Within the CloudWatch console, I navigate to ServiceLens | Service Map.

I can visualize all application resources and dependencies where X-Ray is enabled. I can trace performance or availability issues. If there was an issue connecting to SNS for example, this would be shown.

I select the Airline-CollectPayment-develop node and can view the out-of-the-box standard Lambda metrics.

I can select View Logs to jump to the CloudWatch Logs Insights console.


CloudWatch Insights Service map

I select View dashboard to see the function metrics, node map, and function details.


CloudWatch Insights Service Map dashboard

I select View traces and can filter by the custom annotation PaymentStatus. I select SUCCESS and choose Add to filter. I then select a trace.


CloudWatch Insights Filtered traces

I see the full trace details, which show the full application transaction of a payment collection.


Segments timeline

Selecting the Lambda handler subsegment – ## lambda_handler – I can view the trace Annotations and Metadata, which include the business transaction details such as Customer and PaymentStatus.


X-Ray annotations

Trace groups are another feature of X-Ray and ServiceLens. Trace groups use filter expressions such as Annotation.PaymentStatus = "FAILED" which are used to view traces that match the particular group. Service graphs can also be viewed, and CloudWatch alarms created based on the group.

CloudWatch ServiceLens provides powerful capabilities to understand application performance bottlenecks and issues, helping determine how users are impacted.

Improvement plan summary:

  1. Identify common business context and system data that are commonly present across multiple transactions.
  2. Instrument SDKs and requests to upstream/downstream services to understand the flow of a transaction across system components.

Recent announcements

There have been a number of recent announcements for X-Ray and CloudWatch to improve how to evaluate serverless application health.

Conclusion

Evaluating application health helps you identify which services should be optimized to improve your customer’s experience. In part 1, I cover out-of-the-box standard metrics and alerts, as well as structured and centralized logging. In this post, I explore custom metrics and distributed tracing and show how to use ServiceLens to view logs, metrics, and traces together.

In an upcoming post, I will cover the next Operational Excellence question from the Well-Architected Serverless Lens – Approaching application lifecycle management.

Raspberry Pi puts the heart back in mid-noughties nostalgia tech

Post Syndicated from Ashley Whittaker original https://www.raspberrypi.org/blog/raspberry-pi-puts-the-heart-back-in-mid-noughties-nostalgia-tech/

Is it still the Easter holidays? Can anyone tell? Does it matter, when we have nostalgic tech bunny pets to share with you?

These little bunnies can now do much more than when they first appeared. But they’re still incredibly cute – just look at that little lopsided-ear thing they do.

The original Nabaztag bunnies were to us in the mid-noughties what Tamagotchis were to eleven-year-olds everywhere in the 1990s. They communicated through colour, light, and sound. But now (and here’s the best bit), with a simple bit of surgery and the help of a new Raspberry Pi heart, your digital desk pet will be smarter than ever. It will be able to tell you what the weather is like, and offer local speech recognition as well as “ear-based Tai Chi”. No, we’re not sure either, but we are sure that it sounds cool. And very calming.

Part of the custom kit that will breathe new life into your bunny

The design team have created what they call the TagTagTag kit. Here are the main components of said kit:

This new venture had its first outing at the Paris Maker Faire in 2018, and it looks like we’re already too late to buy one of the limited number of ready-made upgraded bunnies. However, those of you who kept hold of your original bunny might be able to source one of Nabaztag’s custom boards to upgrade it yourself if you’re prepared to be patient – head over to the project’s funding page. You’ll also need a Raspberry Pi Zero W and a microSD card. The video below is in French, but it’s captioned.

Nabaztag’s funding page also shares all of the tech specs, schematics, and open source Python code you’re going to need.

We know this might be a tricky project for which to source all the parts, but it’s just. So. Cute. Follow the rabbit on Twitter to find out when you might be able to get your hands on a custom board.

The post Raspberry Pi puts the heart back in mid-noughties nostalgia tech appeared first on Raspberry Pi.

Make a Side Pocket-esque pool game | Wireframe #36

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/make-a-side-pocket-esque-pool-game-wireframe-36/

Recreate the arcade pool action of Data East’s Side Pocket. Raspberry Pi’s own Mac Bowley has the code.

In the original Side Pocket, the dotted line helped the player line up shots, while additional functions on the UI showed where and how hard you were striking the cue ball.

Created by Data East in 1986, Side Pocket was an arcade pool game that challenged players to sink all the balls on the table and achieve a minimum score to progress. As the levels went on, players faced more balls in increasingly difficult locations on the table.

Here, I’ll focus on three key aspects from Side Pocket: aiming a shot, moving the balls, and handling collisions for balls and pockets. This project is great for anyone who wants to dip their toe into 2D game physics. I’m going to use Pygame’s built-in collision system as much as possible, to keep the code readable and short wherever I can.

Making a pool game

Before thinking about aiming and moving balls, I need a table to play on. I created both a border and a play area sprite using piskelapp.com; originally, this was one sprite, and I used a rect to represent the play area (see Figure 1). Changing to two sprites and making the play area an actor made all the collisions easier to handle and made everything much easier to place.

Figure 1: Our table with separate border. You could add some detail to your own table, or even adapt a photograph to make it look even more realistic.

For the balls, I made simple 32×32 sprites in varying colours. I need to be able to keep track of some information about each ball on the table, such as its position, a sprite, movement, and whether it’s been pocketed or not – once a ball’s pocketed, it’s removed from play. Each ball will have similar functionality as well – moving and colliding with each other. The best way to do this is with a class: a blueprint for each ball that I will make copies of when I need a new ball on the table.

class Ball:
    def __init__(self, image, pos):
        self.actor = Actor(image, center=pos, anchor=("center", "center"))
        self.movement = [0, 0]
        self.pocketed = False

    def move(self):
        self.actor.x += self.movement[0]
        self.actor.y += self.movement[1]
        if self.pocketed == False:
            if self.actor.y < playArea.top + 16 or self.actor.y > playArea.bottom - 16:
                self.movement[1] = -self.movement[1]
                self.actor.y = clamp(self.actor.y, playArea.top + 16, playArea.bottom - 16)
            if self.actor.x < playArea.left + 16 or self.actor.x > playArea.right - 16:
                self.movement[0] = -self.movement[0]
                self.actor.x = clamp(self.actor.x, playArea.left + 16, playArea.right - 16)
        else:
            self.actor.x += self.movement[0]
            self.actor.y += self.movement[1]
        self.resistance()

    def resistance(self):
        # Slow the ball down
        self.movement[0] *= 0.95
        self.movement[1] *= 0.95

        if abs(self.movement[0]) + abs(self.movement[1]) < 0.4:
            self.movement = [0, 0]

The best part about using a class is that I only need to make one piece of code to move a ball, and I can reuse it for every ball on the table. I’m using an array to keep track of the ball’s movement – how much it will move each frame. I also need to make sure it bounces off the sides of the play area if it hits them. I’ll use an array to hold all the balls on the table.

To start with, I need a cue ball:

balls = []
cue_ball = Ball("cue_ball.png", (WIDTH//2, HEIGHT//2))
balls.append(cue_ball)

Aiming the shot

In Side Pocket, players control a dotted line that shows where the cue ball will go when they take a shot. Using the joystick or arrow buttons rotated the shot and moved the line, so players could aim to get the balls in the pockets (see Figure 2). To achieve this, we have to dive into our first bit of maths, converting a rotation in degrees to a pair of x and y movements. I decided my rotation would be at 0 degrees when pointing straight up; the player can then press the right and left arrow to increase or decrease this value.

Figure 2: The dotted line shows the trajectory of the ball. Pressing the left or right arrows rotates the aim.

Pygame Zero has some built-in attributes for checking the keyboard, which I’m taking full advantage of.

shot_rotation = 270.0 # Start pointing up table
turn_speed = 1
line = [] # To hold the points on my line
line_gap = 1/12
max_line_length = 400

def update():
    global shot_rotation

    ## Rotate your aim
    if keyboard[keys.LEFT]:
        shot_rotation -= 1 * turn_speed
    if keyboard[keys.RIGHT]:
        shot_rotation += 1 * turn_speed

    # Make the rotation wrap around
    if shot_rotation > 360:
        shot_rotation -= 360
    if shot_rotation < 0:
        shot_rotation += 360

At 0 degrees, my cue ball’s movement should be 0 in the x direction and -1 in y. When the rotation is 90 degrees, my x movement would be 1 and y would be zero; anything in between should be a fraction between the two numbers. I could use a lot of ‘if-elses’ to set this, but an easier way is to use sin and cos on my angle – I sin the rotation to get my x value and cos the rotation to get the y movement.

# The built-in functions need radians
rot_radians = shot_rotation * (math.pi/180)

x = math.sin(rot_radians)
y = -math.cos(rot_radians)
if not shot:
    current_x = cue_ball.actor.x
    current_y = cue_ball.actor.y
    length = 0
    line = []
    while length < max_line_length:
        hit = False
        if current_y < playArea.top or current_y > playArea.bottom:
            y = -y
            hit = True
        if current_x < playArea.left or current_x > playArea.right:
            x = -x
            hit = True
        if hit == True:
            line.append((current_x-(x*line_gap), current_y-(y*line_gap)))
        length += math.sqrt(((x*line_gap)**2)+((y*line_gap)**2))
        current_x += x*line_gap
        current_y += y*line_gap
    line.append((current_x-(x*line_gap), current_y-(y*line_gap)))

I can then use those x and y co-ordinates to create a series of points for my aiming line.
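As a minimal sketch of how those points might be rendered (the background image name and draw order here are assumptions, not the article’s exact code), each point can be drawn as a small dot inside Pygame Zero’s draw() function:

# Sketch only: draw the table, the aiming dots, and the balls each frame.
def draw():
    screen.clear()
    screen.blit("table", (0, 0))  # assumed background image name
    for point in line:
        screen.draw.filled_circle(point, 2, "white")
    for ball in balls:
        ball.actor.draw()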

Shooting the ball

To keep things simple, I’m only going to have a single shot speed – you could improve this design by allowing players to load up a more powerful shot over time, but I won’t do that here.

shot = False
ball_speed = 30


## Inside update
## Shoot the ball with the space bar
if keyboard[keys.SPACE] and not shot:
    shot = True
    cue_ball.movement = [x*ball_speed, y*ball_speed]

When the shot variable is True, I’m going to move all the balls on my table – at the beginning, this is just the cue ball – but this code will move the other balls too once I add them.

# Shoot the ball and move all the balls on the table
else:
    shot = False
    balls_pocketed = []
    collisions = []
    for b in balls:
        # Move each ball
        b.move()
        if abs(b.movement[0]) + abs(b.movement[1]) > 0:
            shot = True

Each time I move the balls, I check whether they still have some movement left. I made a resistance function inside the ball class that will slow them down.

Collisions

Now for the final problem: getting the balls to collide with each other and the pockets. I need to add more balls and some pocket actors to my game in order to test the collisions.

balls.append(Ball("ball_1.png", (WIDTH//2 - 75, HEIGHT//2)))
balls.append(Ball("ball_2.png", (WIDTH//2 - 150, HEIGHT//2)))

pockets = []
pockets.append(Actor("pocket.png", topleft=(playArea.left, playArea.top), anchor=("left", "top")))
# I create one of these actors for each pocket; they are not drawn

Each ball needs to be able to collide with the others, and when that happens, the direction and speed of the balls will change. Each ball will be responsible for changing the direction of the ball it has collided with, and I add a new function to my ball class:

def collide(self, ball):
    collision_normal = [ball.actor.x - self.actor.x, ball.actor.y - self.actor.y]
    ball_speed = math.sqrt(collision_normal[0]**2 + collision_normal[1]**2)
    self_speed = math.sqrt(self.movement[0]**2 + self.movement[1]**2)
    if self.movement[0] == 0 and self.movement[1] == 0:
        ball.movement[0] = -ball.movement[0]
        ball.movement[1] = -ball.movement[1]
    elif ball_speed > 0:
        collision_normal[0] *= 1/ball_speed
        collision_normal[1] *= 1/ball_speed
        ball.movement[0] = collision_normal[0] * self_speed
        ball.movement[1] = collision_normal[1] * self_speed

When a collision happens, the other ball should move in the opposite direction to the collision. This is what allows you to line up slices and knock balls diagonally into the pockets. Unlike the collisions with the edges, I can’t just reverse the x and y movement. I need to change its direction, and then give it a part of the current ball’s speed. Above, I’m using a normal to find the direction of the collision. You can think of this as the direction to the other ball as they collide.

Our finished pool game. See if you can expand it with extra balls and maybe a scoring system.

Handling collisions

I need to add to my update loop to detect and store the collisions to be handled after each set of movement.

# Check for collisions
for other in balls:
    if other != b and b.actor.colliderect(other.actor):
        collisions.append((b, other))
# Did it sink in the hole?
in_pocket = b.actor.collidelistall(pockets)
if len(in_pocket) > 0 and b.pocketed == False:
    if b != cue_ball:
        b.movement[0] = (pockets[in_pocket[0]].x - b.actor.x) / 20
        b.movement[1] = (pockets[in_pocket[0]].y - b.actor.y) / 20
        b.pocket = pockets[in_pocket[0]]
        b.pocketed = True  # flag checked in move() so the ball stops bouncing
        balls_pocketed.append(b)
    else:
        b.actor.x = WIDTH//2
        b.actor.y = HEIGHT//2

First, I use the colliderect() function to check if any of the balls collide this frame – if they do, I add them to a list. This is so I handle all the movement first and then the collisions. Otherwise, I’m changing the momentum of balls that haven’t moved yet. I detect whether a pocket was hit as well; if so, I change the momentum so that the ball heads towards the pocket and doesn’t bounce off the walls anymore.

When all my balls have been moved, I can handle the collisions with both the other balls and the pockets:

for col in collisions:
    col[0].collide(col[1])
if shot == False:
    for b in balls_pocketed:
        balls.remove(b)

And there you have it: the beginnings of an arcade pool game in the Side Pocket tradition. You can get the full code and assets right here.

Get your copy of Wireframe issue 36

You can read more features like this one in Wireframe issue 36, available directly from Raspberry Pi Press — we deliver worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 36 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Make a Side Pocket-esque pool game | Wireframe #36 appeared first on Raspberry Pi.

Introducing a new generation of AWS Elastic Beanstalk platforms

Post Syndicated from David LaBissoniere original https://aws.amazon.com/blogs/compute/introducing-a-new-generation-of-aws-elastic-beanstalk-platforms/

In my last post I discussed AWS Elastic Beanstalk’s new public roadmap on GitHub. Today I want to talk about our new generation of Elastic Beanstalk platforms built on top of Amazon Linux 2 (AL2).

Late last year we launched a public beta of a new Elastic Beanstalk platform for Amazon Corretto — Amazon’s no-cost, production-ready distribution of the Open Java Development Kit (OpenJDK). This is also our first platform based on AL2. This year we have launched two more beta AL2 platforms: Docker and Python. More beta platforms are arriving soon, followed by generally available platform releases.

A sample application using the new Python 3.7 beta platform

I want to dive a little deeper on what we are doing with these platforms. Elastic Beanstalk was publicly launched in 2011, and announced in a blog post by Jeff Barr. Back then there were few enough AWS services that they were all listed as tabs along the top of the AWS Management Console. At launch, we supported only Apache Tomcat applications. Over time, we added support for many other runtimes and began using the term “platform” to describe our offerings. Today we support a wide variety of platforms for popular web application frameworks, such as Ruby on Rails, PHP, and Node.js, as well as generic Docker-based platforms. In the years since we launched each platform, the underlying communities have continued to evolve. Elastic Beanstalk is an opinionated service, especially when it comes to our platforms. As the service evolves, the opinions baked into our platforms must evolve as well.

With our AL2 platforms, we are refreshing each platform based on feedback we’ve gotten from customers. For example, with Java we heard concerns from many customers about long-term support and licensing of OpenJDK. That’s why in AL2 we are using Amazon’s own Corretto distribution, which includes committed long-term support. It also has performance and scalability improvements learned from Amazon’s years of experience running Java across thousands of production services — such as the Elastic Beanstalk service itself. For more details, see this section of our Java platform documentation.

Our Python AL2 platform has also been modernized. Previously we only supported serving applications through Apache and mod_wsgi. Now we are using NGINX as a reverse proxy in front of Gunicorn, with the flexibility to use another Web Server Gateway Interface (WSGI) server if you prefer. We also took this opportunity to add support for Pipenv and Pipfile, more modern and powerful Python dependency management tools. Learn more in our Python platform documentation.

The Docker AL2 platform is rewritten internally, but provides largely the same customer experience. It does offer improved I/O performance by using the OverlayFS storage driver. This is a change from the previous Docker platform, which used the older and slower Device Mapper storage driver and required an extra Amazon EBS volume.

We are hard at work on another set of beta platforms including PHP, Ruby, and Node.js, which are expected to launch soon. Each of these has been modernized and improved. For a full list of differences between our existing platforms and their Amazon Linux 2 equivalents, check out our documentation. In the next section I want to take a closer look at one new feature that applies to all of the new platforms: platform hooks.

Platform hooks

With our AL2 platforms, we are offering a simplified model for on-instance customization. We’ve long supported configuration files called ebextensions that allow customization of environment options, resources, and on-instance behavior. These have enabled customers to extend their environments in ways we never dreamed of. But we’ve also heard customer feedback about the difficulty of writing complex shell scripts embedded within YAML or JSON. And as they are, ebextensions don’t provide any straightforward mechanism to execute custom code after an application deployment is completed. Customers have pointed out many use cases where they want to do this – for example to enable third party monitoring tools.

With our new generation of Linux platforms, we are introducing platform hooks. Platform hooks are a set of directories inside the application bundle that you can populate with scripts. These scripts are executed at defined points in the on-instance application deployment lifecycle. These hooks are reminiscent of custom platform hooks, but are simplified and easier to manage and version because they are part of the application bundle.

For example, a Corretto application bundle might look like:

├── .platform
│   ├── hooks
│   │   ├── prebuild
│   │   │   ├── 01_set_secrets.sh
│   │   │   └── 10_install_dependencies.sh
│   │   ├── predeploy
│   │   │   └── 01_configure_corretto.sh
│   │   └── postdeploy
│   │       └── 99_log_deployment_complete.sh
│   └── nginx
│       └── conf.d
│           └── custom.conf
├── Procfile
└── application.jar

The files in each of the .platform/hooks/ subdirectories are executed in lexicographical order at predefined points in the deployment process.

  1. prebuild hooks are executed after the application is downloaded and extracted, but before we try to configure anything.
  2. predeploy hooks are run after the application is configured and staged, but before it is deployed.
  3. postdeploy hooks are run at the very end — after the application is deployed and running.
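
A hook is just an executable file, so a small script is enough. Here is a minimal sketch of a postdeploy hook (the file name and log location are illustrative, and the file must have execute permission):

#!/usr/bin/env python3
# .platform/hooks/postdeploy/99_log_deployment_complete.py (illustrative name)
# Appends a timestamp once the application is deployed and running.
from datetime import datetime, timezone

with open("/tmp/deployments.log", "a") as log:
    log.write(f"deployment completed at {datetime.now(timezone.utc).isoformat()}\n")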

Finally, take note of the .platform/nginx/ directory as well. This can be used to provide custom configuration additions or overrides for the on-instance NGINX proxy server. You can either override the provided configuration file completely, or just add a new configuration file that is imported by NGINX. Because all of the AL2 platforms use NGINX and the same base configuration, these customizations are now more portable across platforms. For a full explanation of platform hooks and related functionality, see our Extending Linux Platforms documentation page.

We’re excited to launch this new generation of Elastic Beanstalk platforms, and to hear feedback from you about how we can make them even better. If you have feedback about one of the AL2 beta platforms, please add a comment to the relevant issue on the public roadmap on GitHub. For example, here is the issue for the Corretto platform. Keep an eye on the roadmap and our release notes for announcements of the remaining platforms over the coming weeks.


Building well-architected serverless applications: Understanding application health – part 1

Post Syndicated from Julian Wood original https://aws.amazon.com/blogs/compute/building-well-architected-serverless-applications-understanding-application-health-part-1/

This series of blog posts uses the AWS Well-Architected Tool with the Serverless Lens to help customers build and operate applications using best practices. In each post, I address the nine serverless-specific questions identified by the Serverless Lens along with the recommended best practices. See the Introduction post for a table of contents and an explanation of the example application.

Question OPS1: How do you evaluate your serverless application’s health?

Evaluating your metrics, distributed tracing, and logging gives you insight into business and operational events, and helps you understand which services should be optimized to improve your customer’s experience. By understanding the health of your serverless application, you will know whether it is functioning as expected, and can proactively react to any signals that indicate it is becoming unhealthy.

Required practice: Understand, analyze, and alert on metrics provided out of the box

It is important to understand metrics for every AWS service used in your application so you can decide how to measure its behavior. AWS services provide a number of out-of-the-box standard metrics to help monitor the operational health of your application.

As these metrics are generated automatically, they offer a simple way to start monitoring your application, and can later be augmented with custom metrics.

The first stage is to identify which services the application uses. The airline booking component uses AWS Step Functions, AWS Lambda, Amazon SNS, and Amazon DynamoDB.

When I make a booking, as shown in the Introduction post, AWS services emit metrics to Amazon CloudWatch. These are processed asynchronously without impacting the application’s performance.

There are two default CloudWatch dashboards to visualize key metrics quickly: per service and cross service.

Per service

To view the per service metrics dashboard, I open the CloudWatch console.

I select a service where Overview is shown, such as Lambda. Now I can view the metrics for all Lambda functions in the account.

Cross service

To see an overview of key metrics across all AWS services, open the CloudWatch console and choose View cross service dashboard.

I see a list of all services with one or two key metrics displayed. This provides a good overview of all services your application uses.

Alerting

The next stage is to identify the key metrics for comparison and set up alerts for under- and over-performing services. Here are some recommended metrics to alarm on for a number of AWS services.

Alerts can be configured manually or via infrastructure as code tools such as the AWS Serverless Application Model, AWS CloudFormation, or third-party tools.

To configure a manual alert for Lambda function errors using CloudWatch Alarms:

  1. I open the CloudWatch console, select Alarms, and select Create Alarm.
  2. I choose Select metric and, from AWS Namespaces, select Lambda, Across All Functions. I select Errors, then choose Select metric.
  3. I change the Statistic to Sum and the Period to 1 minute.
  4. Under Conditions, I select a Static threshold Greater than 1 and select Next.

Alarms can also be created using anomaly detection rather than static values if there is a discernible pattern or trend. Anomaly detection looks at past metric data and uses machine learning to create a model of expected values. Alerts can then be configured to fire when metrics fall outside this band of “normal” values. I use a Static threshold for this alarm.

  1. For the notification, I set the alarm to trigger an existing SNS topic with my email address subscribed, then choose Next.
  2. I enter a descriptive alarm name such as serverlessairline-lambda-prod-errors > 1, select Next, and choose Create alarm.

I have now manually set up an alarm.

Use CloudWatch composite alarms to combine multiple alarms to reduce noise and focus on critical issues. For example, a single alarm could trigger if there are both Lambda function errors as well as high Lambda concurrent executions.

It is simpler and more scalable to include alerting within infrastructure as code. Here is an example of alerting programmatically using CloudFormation.
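
As a minimal illustrative sketch (the alarm name and SNS topic ARN below are placeholders, not from the airline application), the same alarm can also be defined programmatically from Python with boto3:

import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when the account-wide Lambda Errors metric sums above 1 in a minute.
cloudwatch.put_metric_alarm(
    AlarmName="serverlessairline-lambda-prod-errors",
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Statistic="Sum",
    Period=60,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder ARN
)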

I view the out of the box standard metrics and in this example, manually create an alarm for Lambda function errors.

Improvement plan summary:

  1. Understand what metrics and dimensions each managed service used provides.
  2. Configure alerts on relevant metrics for when services are unhealthy.

Good practice: Use structured and centralized logging

Central logging provides a single place to search and analyze logs. Structured logging means selecting a consistent log format and content structure to simplify querying across multiple components.

To identify a business transaction across components, such as a particular flight booking, log operational information from upstream and downstream services. Add information such as customer_id along with business outcomes such as order=accepted or order=confirmed. Make sure you are not logging any sensitive or personal identifying data in any logs.

Use JSON as your logging output format. Log multiple fields in a single object or dictionary rather than many one-line messages, for simpler searching.

Here is an example of a structured logging format.
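
This sketch (the field names are illustrative, not taken from the airline application) shows a single log line emitted as one JSON object:

import json
import time

log_entry = {
    "timestamp": time.time(),
    "level": "INFO",
    "service": "booking",
    "customer_id": "12345",
    "booking_id": "abc-123",
    "order": "confirmed",
    "message": "Flight booking confirmed",
}
print(json.dumps(log_entry))  # one searchable JSON object per line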

The airline booking component, which is written in Python, currently uses a shared library with a separate log processing stack.

Embedded Metrics Format is a simpler mechanism to replace the shared library and use structured logging. CloudWatch Embedded Metrics adds environmental metadata such as Lambda Function version and also automatically extracts custom metrics so you can visualize and alarm on them. There are open-source client libraries available for Node.js and Python.

I then add embedded metrics to the individual confirm booking module with the following steps:

  1. I install the aws-embedded-metrics library using the instructions.
  2. In the function init code, I import the module and create a metric_scope with the following code:

from aws_embedded_metrics import metric_scope
@metric_scope

  3. In the function handler, I log the generated bookingReference with the following code.

metrics.set_property("BookingReference", ret["bookingReference"])

In this example I also log the entire incoming event details.

metrics.set_property("event", event)

It is best practice to only log what is required to avoid unnecessary costs. Ensure the event does not have any sensitive or personal identifying data which is available to anyone who has access to the logs.

To avoid the duplicate logging in this example airline application which adds cost, I remove the existing shared library logger.*() lines.

When I make a booking, the CloudWatch log message is in structured JSON format. It contains the properties I set (event and BookingReference), as well as function metadata.

I can then search for all log activity related to a specific booking across multiple functions with booking_id. I can track customer activity across multiple bookings using customer_id.

Logging is often created as a shared library resource which all functions reference. Another option is using Lambda Layers, which lets functions import additional code such as external libraries. Multiple functions can share this code.

Improvement plan summary:

  1. Log request identifiers from downstream services, component name, component runtime information, unique correlation identifiers, and information that helps identify a business transaction.
  2. Use JSON as the logging output. Prefer logging entire objects/dictionaries rather than many one-line messages. Mask or remove sensitive data when logging.
  3. Keep logged debugging information to a minimum, as it can both incur costs and increase the noise-to-signal ratio.

Conclusion

Evaluating serverless application health helps understand which services should be optimized to improve your customer’s experience. I cover out of the box metrics and alerts, as well as structured and centralized logging.

This well-architected question will be continued in an upcoming post where I look at custom metrics and distributed tracing.

Code Hyper Sports’ shooting minigame | Wireframe #35

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-hyper-sports-shooting-minigame-wireframe-35/

Gun down the clay pigeons in our re-creation of a classic minigame from Konami’s Hyper Sports. Take it away, Mark Vanstone


Hyper Sports’ Japanese release was tied in with the 1984 Summer Olympics.

Hyper Sports

Konami’s sequel to its 1983 arcade hit, Track & Field, Hyper Sports offered seven games – or events – in which up to four players could participate. Skeet shooting was perhaps the most memorable game in the collection, and required just two buttons: fire left and fire right.

The display showed two target sights, and each moved up and down to come into line with the next clay disc’s trajectory. When the disc was inside the red target square, the player pressed the fire button, and if their timing was correct, the clay disc exploded. Points were awarded for being on target, and every now and then, a parrot flew across the screen, which could be gunned down for a bonus.

Making our game

To make a skeet shooting game with Pygame Zero, we need a few graphical elements. First, a static background of hills and grass, with two clay disc throwers each side of the screen, and a semicircle where our shooter stands – this can be displayed first, every time our draw() function is called.

We can then draw our shooter (created as an Actor) in the centre near the bottom of the screen. The shooter has three images: one central while no keys are pressed, and two for the directions left and right when the player presses the left or right keys. We also need to have two square target sights to the left and right above the shooter, which we can create as Actors.

When the clay targets appear, the player uses the left and right buttons to shoot either the left or right target respectively.

To make the clay targets, we create an array to hold disc Actor objects. In our update() function we can trigger the creation of a new disc based on a random number, and once created, start an animation to move it across the screen in front of the shooter. We can add a shadow to the discs by tracking a path diagonally across the screen so that the shadow appears at the correct Y coordinate regardless of the disc’s height – this is a simple way of giving our game the illusion of depth. While we’re in the update() function, looping around our disc object list, we can calculate the distance of the disc to the nearest target sight frame, and from that, work out which is the closest.

When we’ve calculated which disc is closest to the right-hand sight, we want to move the sight towards the disc so that their paths intersect. All we need to do is take the difference of the Y coordinates, divide by two, and apply that offset to the target sight. We also do the same for the left-hand sight. If the correct key (left or right arrows) is pressed at the moment a disc crosses the path of the sight frame, we register a hit and cycle the disc through a sequence of exploding frames. We can keep a score and display this with an overlay graphic so that the player knows how well they’ve done.
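
As a rough sketch of that tracking logic (the names here are assumptions, not Mark’s code), the right-hand sight halves its vertical distance to the closest disc each update, and a hit registers if the fire key is pressed while the two overlap:

# Sketch only: ease the right sight towards the closest disc, then test for a hit.
if closest_disc is not None:
    right_sight.y += (closest_disc.y - right_sight.y) / 2  # halve the gap
    if keyboard.right and right_sight.colliderect(closest_disc):
        closest_disc.status = "exploding"  # start the explosion frames
        score += 100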

And that’s it! You may want to add multiple players and perhaps a parrot bonus, but we’ll leave that up to you.

Here’s Mark’s code snippet, which creates a simple shooting game in Python. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, go here.

Get your copy of Wireframe issue 35

You can read more features like this one in Wireframe issue 35, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 35 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code Hyper Sports’ shooting minigame | Wireframe #35 appeared first on Raspberry Pi.

Building a Raspberry Pi telepresence robot using serverless: Part 1

Post Syndicated from Moheeb Zara original https://aws.amazon.com/blogs/compute/building-a-raspberry-pi-telepresence-robot-using-serverless-part-1/

A Pimoroni STS-Pi Robot Kit connected to AWS for remote control and viewing.

A telepresence robot allows you to explore remote environments from the comfort of your home through live stream video and remote control. These types of robots can improve the lives of the disabled, elderly, or those that simply cannot be with their coworkers or loved ones in person. Some are used to explore off-world terrain and others for search and rescue.

This guide walks through building a simple telepresence robot using a Pimoroni STS-PI Raspberry Pi robot kit. A Raspberry Pi is a small low-cost device that runs Linux. Add-on modules for Raspberry Pi are called “hats”. You can substitute this kit with any mobile platform that uses two motors wired to an Adafruit Motor Hat or a Pimoroni Explorer Hat.

The sample serverless application uses AWS Lambda and Amazon API Gateway to create a REST API for driving the robot. A Python application running on the robot uses AWS IoT Core to receive drive commands and authenticate with Amazon Kinesis Video Streams with WebRTC using an IoT Credentials Provider. In the next blog I walk through deploying a web frontend to both view the livestream and control the robot via the API.

Prerequisites

You need the following to complete the project:

A Pimoroni STS-Pi robot kit, Explorer Hat, Raspberry Pi, camera, and battery.

Estimated Cost: $120

There are three major parts to this project. First deploy the serverless backend using the AWS Serverless Application Repository. Then assemble the robot and run an installer on the Raspberry Pi. Finally, configure and run the Python application on the robot to confirm it can be driven through the API and is streaming video.

Deploy the serverless application

In this section, use the Serverless Application Repository to deploy the backend resources for the robot. The resources to deploy are defined using the AWS Serverless Application Model (SAM), an open-source framework for building serverless applications using AWS CloudFormation. To understand in more depth how this application is built, look at the SAM template in the GitHub repository.

An architecture diagram of the AWS IoT and Amazon Kinesis Video Stream resources of the deployed application.

The Python application that runs on the robot requires permissions to connect as an IoT Thing and subscribe to messages sent to a specific topic on the AWS IoT Core message broker. The following policy is created in the SAM template:

RobotIoTPolicy:
      Type: "AWS::IoT::Policy"
      Properties:
        PolicyName: !Sub "${RobotName}Policy"
        PolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Action:
                - iot:Connect
                - iot:Subscribe
                - iot:Publish
                - iot:Receive
              Resource:
                - !Sub "arn:aws:iot:*:*:topicfilter/${RobotName}/action"
                - !Sub "arn:aws:iot:*:*:topic/${RobotName}/action"
                - !Sub "arn:aws:iot:*:*:topic/${RobotName}/telemetry"
                - !Sub "arn:aws:iot:*:*:client/${RobotName}"

To transmit video, the Python application runs the amazon-kinesis-video-streams-webrtc-sdk-c sample in a subprocess. Instead of using separate credentials to authenticate with Kinesis Video Streams, a Role Alias policy is created so that IoT credentials can be used.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "iot:Connect",
        "iot:AssumeRoleWithCertificate"
      ],
      "Resource": "arn:aws:iot:Region:AccountID:rolealias/robot-camera-streaming-role-alias",
      "Effect": "Allow"
    }
  ]
}

When the above policy is attached to a certificate associated with an IoT Thing, it can assume the following role:

 KVSCertificateBasedIAMRole:
      Type: 'AWS::IAM::Role'
      Properties:
        AssumeRolePolicyDocument:
          Version: '2012-10-17'
          Statement:
          - Effect: 'Allow'
            Principal:
              Service: 'credentials.iot.amazonaws.com'
            Action: 'sts:AssumeRole'
        Policies:
        - PolicyName: !Sub "KVSIAMPolicy-${AWS::StackName}"
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
            - Effect: Allow
              Action:
                - kinesisvideo:ConnectAsMaster
                - kinesisvideo:GetSignalingChannelEndpoint
                - kinesisvideo:CreateSignalingChannel
                - kinesisvideo:GetIceServerConfig
                - kinesisvideo:DescribeSignalingChannel
              Resource: "arn:aws:kinesisvideo:*:*:channel/${credentials-iot:ThingName}/*"

This role grants access to connect and transmit video over WebRTC using the Kinesis Video Streams signaling channel deployed by the serverless application.

An architecture diagram of the API endpoint in the deployed application.

A deployed API Gateway endpoint, when called with valid JSON, invokes a Lambda function that publishes to an IoT message topic, RobotName/action. The Python application on the robot subscribes to this topic and drives the motors based on any received message that maps to a command.
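
The Lambda function itself can be very small. This sketch (the environment variable and return shape are assumptions, not the repository’s exact code) forwards the request body to the robot’s topic using the AWS IoT data plane API:

import json
import os

import boto3

iot = boto3.client("iot-data")

def lambda_handler(event, context):
    topic = f"{os.environ['ROBOT_NAME']}/action"  # assumed environment variable
    iot.publish(topic=topic, qos=0, payload=event["body"])
    return {"statusCode": 200, "body": json.dumps({"message": "published"})}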

  1. Navigate to the aws-serverless-telepresence-robot application in the Serverless Application Repository.
  2. Choose Deploy.
  3. On the next page, under Application Settings, fill out the parameter, RobotName.
  4. Choose Deploy.
  5. Once complete, choose View CloudFormation Stack.
  6. Select the Outputs tab. Copy the ApiURL and the EndpointURL for use when configuring the robot.

Create and download the AWS IoT device certificate

The robot requires an AWS IoT root CA (fetched by the install script), certificate, and private key to authenticate with AWS IoT Core. The certificate and private key are not created by the serverless application since they can only be downloaded on creation. Create a new certificate and attach the IoT policy and Role Alias policy deployed by the serverless application.

  1. Navigate to the AWS IoT Core console.
  2. Choose Manage, Things.
  3. Choose the Thing that corresponds with the name of the robot.
  4. Under Security, choose Create certificate.
  5. Choose Activate.
  6. Download the Private Key and Thing Certificate. Save these securely, as this is the only time you can download this certificate.
  7. Choose Attach Policy.
  8. Two policies are created and must be attached. From the list, select
    <RobotName>Policy
    AliasPolicy-<AppName>
  9. Choose Done.

Flash an operating system to an SD card

The Raspberry Pi single-board Linux computer uses an SD card as the main file system storage. Raspbian Buster Lite is an officially supported Debian Linux operating system that must be flashed to an SD card. Balena.io has created an application called balenaEtcher for the sole purpose of accomplishing this safely.

  1. Download the latest version of Raspbian Buster Lite.
  2. Download and install balenaEtcher.
  3. Insert the SD card into your computer and run balenaEtcher.
  4. Choose the Raspbian image. Choose Flash to burn the image to the SD card.
  5. When flashing is complete, balenaEtcher dismounts the SD card.

Configure Wi-Fi and SSH headless

Typically, a keyboard and monitor are used to configure Wi-Fi or to access the command line on a Raspberry Pi. Since it is on a mobile platform, configure the Raspberry Pi to connect to a Wi-Fi network and enable remote access headless by adding configuration files to the SD card.

  1. Re-insert the SD card to your computer so that it mounts as a volume named boot.
  2. Create a file in the boot volume of the SD card named wpa_supplicant.conf.
  3. Paste in the following contents, substituting your Wi-Fi credentials.
    ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
            update_config=1
            country=<Insert country code here>
    
            network={
             ssid="<Name of your WiFi>"
             psk="<Password for your WiFi>"
            }

  4. Create an empty file without a file extension in the boot volume named ssh. At boot, the Raspbian operating system looks for this file and enables remote access if it exists. This can be done from a command line:
    cd path/to/volume/boot
    touch ssh

  5. Safely eject the SD card from your computer.

Assemble the robot

For this section, you can use the Pimoroni STS-Pi robot kit with a Pimoroni Explorer Hat, along with a Raspberry Pi Model 3 B+ or newer, and a camera module. Alternatively, you can use any two motor robot platform that uses the Explorer Hat or Adafruit Motor Hat.

  1. Follow the instructions in this video to assemble the Pimoroni STS-Pi robot kit.
  2. Place the SD card in the Raspberry Pi.
  3. Since the installation may take some time, power the Raspberry Pi using a USB 5V power supply connected to a wall plug rather than a battery.

Connect remotely using SSH

Use your computer to gain remote command line access of the Raspberry Pi using SSH. Both devices must be on the same network.

  1. Open a terminal application with SSH installed. It is already built into Linux and macOS; to enable SSH on Windows, follow these instructions.
  2. Enter the following to begin a secure shell session as user pi on the default local hostname raspberrypi, which resolves to the IP address of the device using mDNS:
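    ssh pi@raspberrypi.local
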
  3. If prompted to add an SSH key to the list of known hosts, type yes.
  4. When prompted for a password, type raspberry. This is the default password and can be changed using the raspi-config utility.
  5. Upon successful login, you now have shell access to your Raspberry Pi device.

Enable the camera using raspi-config

A built-in utility, raspi-config, provides an easy to use interface for configuring Raspbian. You must enable the camera module, along with I2C, a serial bus used for communicating with the motor driver.

  1. In an open SSH session, type the following to open the raspi-config utility:
    sudo raspi-config

  2. Using the arrows, choose Interfacing Options.
  3. Choose Camera. When prompted, choose Yes to enable the camera module.
  4. Repeat the process to enable the I2C interface.
  5. Select Finish and reboot.

Run the install script

An installer script is provided for building and installing the Kinesis Video Stream WebRTC producer, AWSIoTPythonSDK and Pimoroni Explorer Hat Python libraries. Upon completion, it creates a directory with the following structure:

├── /home/pi/Projects/robot
│  └── main.py // The main Python application
│  └── config.json // Parameters used by main.py
│  └── kvsWebrtcClientMasterGstSample //Kinesis Video Stream producer
│  └── /certs
│     └── cacert.pem // Amazon SFSRootCAG2 Certificate Authority
│     └── certificate.pem // AWS IoT certificate placeholder
│     └── private.pem.key // AWS IoT private key placeholder
  1. Open an SSH session on the Raspberry Pi.
  2. (Optional) If using the Adafruit Motor Hat, run this command; otherwise, the script defaults to the Pimoroni Explorer Hat.
    export MOTOR_DRIVER=adafruit  

  3. Run the following command to fetch and execute the installer script.
    wget -O - https://raw.githubusercontent.com/aws-samples/aws-serverless-telepresence-robot/master/scripts/install.sh | bash

  4. While the script installs, proceed to the next section.

Configure the code

The Python application on the robot subscribes to AWS IoT Core to receive messages. It requires the certificate and private key created for the IoT thing to authenticate. These files must be copied to the directory where the Python application is stored on the Raspberry Pi.

It also requires that the IoT Credentials endpoint be added to the file config.json to assume the permissions necessary to transmit video to Amazon Kinesis Video Streams.

  1. Open an SSH session on the Raspberry Pi.
  2. Open the certificate.pem file with the nano text editor and paste in the contents of the certificate downloaded earlier.
    cd /home/pi/Projects/robot/certs
    nano certificate.pem

  3. Press CTRL+X and then Y to save the file.
  4. Repeat the process with the private.pem.key file.
    nano private.pem.key

  5. Open the config.json file.
    cd /home/pi/Projects/robot
    nano config.json

  6. Provide the following information:
    IOT_THINGNAME: The name of your robot, as set in the serverless application.
    IOT_CORE_ENDPOINT: This is found under the Settings page in the AWS IoT Core console.
    IOT_GET_CREDENTIAL_ENDPOINT: Provided by the serverless application.
    ROLE_ALIAS: This is already set to match the Role Alias deployed by the serverless application.
    AWS_DEFAULT_REGION: Corresponds to the Region the application is deployed in.
  7. Save the file using CTRL+X and Y.
  8. To start the robot, run the command:
    python3 main.py

  9. To stop the script, press CTRL+C.

View the Kinesis video stream

The following steps create a WebRTC connection with the robot to view the live stream.

  1. Navigate to the Amazon Kinesis Video Streams console.
  2. Choose Signaling channels from the left menu.
  3. Choose the channel that corresponds with the name of your robot.
  4. Open the Media Playback card.
  5. After a moment, a WebRTC peer to peer connection is negotiated and live video is displayed.
    An animated gif demonstrating a live video stream from the robot.

Sending drive commands

The serverless backend includes an Amazon API Gateway REST endpoint that publishes JSON messages to the Python script on the robot.

The robot expects a message:

{ "action": <direction> }

Where direction can be “forward”, “backwards”, “left”, or “right”.

  1. While the Python script is running on the robot, open another terminal window.
  2. Run this command to tell the robot to drive forward. Replace <API-URL> using the endpoint listed under Outputs in the CloudFormation stack for the serverless application.
    curl -d '{"action":"forward"}' -H "Content-Type: application/json" -X POST https://<API-URL>/publish

    An animated gif demonstrating the robot being driven from a REST request.

Conclusion

In this post, I show how to build and program a telepresence robot with remote control and a live video feed in the cloud. I did this by installing a Python application on a Raspberry Pi robot and deploying a serverless application.

The Python application uses AWS IoT credentials to receive remote commands from the cloud and transmit live video using Kinesis Video Streams with WebRTC. The serverless application deploys a REST endpoint using API Gateway and a Lambda function. Any application that can connect to the endpoint can drive the robot.

In part two, I build on this project by deploying a web interface for the robot using AWS Amplify.

A preview of the web frontend built in the next blog.

A preview of the web frontend built in the next blog.


Recreate Flappy Bird’s flight mechanic | Wireframe #29

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/recreate-flappy-birds-flight-mechanic-wireframe-29/

From last year’s issue 29 of Wireframe magazine: learn how to create your own version of the simple yet addictive side-scroller Flappy Bird. Raspberry Pi’s Rik Cross shows you how.

Flappy Bird: ridiculously big in 2014, at least for a while.

Flappy Bird was released by programmer Dong Nguyen in 2013, and made use of a straightforward game mechanic to create an addictive hit. Tapping the screen provided ‘lift’ to the main character, which is used strategically to navigate through a series of moving pipes. A point is scored for each pipe successfully passed. The idea proved so addictive that Nguyen eventually regretted his creation and removed it from the Google and Apple app stores. In this article, I’ll show you how to recreate this simple yet time-consuming game, using Python and Pygame Zero.

The player’s motion is very similar to that employed in a standard platformer: falling down towards the bottom of the screen under gravity. See the article ‘Super Mario-style jumping physics’ in Wireframe #7 for more on creating this type of movement. Pressing a button (in our case, the SPACE bar) gives the player some upward thrust by setting its velocity to a negative value (i.e. upwards) larger than the value of gravity acting downwards. I’ve adapted and used two different images for the sprite (made by Imaginary Perception and available on opengameart.org), so that it looks like it’s flapping its wings to generate lift and move upwards.

Pressing the SPACE bar gives the bird ‘lift’ against gravity, allowing it to navigate through moving pipes.

Sets of pipes are set equally spaced apart horizontally, and move towards the player slowly each frame of the game. These pipes are stored as two lists of rectangles, top_pipes and bottom_pipes, so that the player can attempt to fly through gaps between the top and bottom pipes. Once a pipe in the top_pipes list reaches the left side of the screen past the player’s position, a score is incremented and the top and corresponding bottom pipes are removed from their respective lists. A new set of pipes is created at the right edge of the screen, creating a continuous challenge for the player. The y-position of the gap between each newly created pair of pipes is decided randomly (between minimum and maximum limits), which is used to calculate the position and height of the new pipes.

The game stops and a ‘Game over’ message appears if the player collides with either a pipe or the ground. The collision detection in the game uses the player.colliderect() method, which checks whether two rectangles overlap. As the player sprite isn’t exactly rectangular, it means that the collision detection isn’t pixel-perfect, and improvements could be made by using a different approach. Changing the values for GRAVITY, PIPE_GAP, PIPE_SPEED, and player.flap_velocity through a process of trial and error will result in a game that has just the right amount of frustration! You could even change these values as the player’s score increases, to add another layer of challenge.
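
As a minimal sketch of that flight mechanic (the constant values are illustrative, not Rik’s exact code): gravity nudges the velocity down every frame, and a flap overrides it with a negative burst:

GRAVITY = 0.3  # illustrative value

def update():
    player.velocity += GRAVITY
    player.y += player.velocity

def on_key_down(key):
    if key == keys.SPACE:
        player.velocity = player.flap_velocity  # e.g. -7: a burst of lift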

Here’s Rik’s code, which gets an homage to Flappy Bird running in Python. To get it working on your system, you’ll first need to install Pygame Zero. And to download the full code, go here.

If you’d like to read older issues of Wireframe magazine, you can find the complete back catalogue as free PDF downloads.

The latest issue of Wireframe is available in print to buy online from the Raspberry Pi Press store, with older physical issues heavily discounted too. You can also find Wireframe at local newsagents, but we should all be staying home as much as possible right now, so why not get your copy online and save yourself the trip?

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. And subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Recreate Flappy Bird’s flight mechanic | Wireframe #29 appeared first on Raspberry Pi.

Code a homage to Marble Madness | Wireframe #34

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-a-homage-to-marble-madness-wireframe-34/

Code the map and movement basics of the innovative marble-rolling arcade game. Mark Vanstone shows you how.


Each of Marble Madness’ six levels got progressively harder to navigate and had to be completed within a time limit.

Marble Madness

Hitting arcades in 1984, Atari’s Marble Madness presented a rather different control mechanism than other games of the time. The original arcade cabinet provided players with a trackball controller rather than a conventional joystick, and the aim was to guide a marble through a three-dimensional course in the fastest possible time. This meant that a player could change the angle and speed of the marble as it rolled and avoid various obstacles and baddies.

During development, designer Mark Cerny had to shelve numerous ideas for Marble Madness, since the hardware just wasn’t able to achieve the level of detail and interaction he wanted. The groundbreaking 3D display was one idea that made it through to the finished game: its pre-rendered, ray-traced isometric levels.

Marble Madness was the first game to use Atari’s System 1 upgradeable hardware platform, and also boasted the first use of an FM sound chip produced by Yamaha to create its distinctive stereo music. The game was popular in arcades to start with, but interest appeared to drop off after a few months – something Cerny attributed to the fact that the game didn’t take long to play. Marble Madness’s popularity endured in the home market, though, with ports made for most computers and consoles of the time – although inevitably, most of these didn’t support the original’s trackball controls.


In our sample level, you can control the movement of the marble using the left and right arrow keys.

Making our game

For our version of Marble Madness, we’re going to use a combination of a rendered background and a heightmap in Pygame Zero, and write some simple physics code to simulate the marble rolling over the terrain’s flats and slopes. We can produce the background graphic using a 3D modelling program such as Blender. The camera needs to be set to Orthographic to get the forced perspective look we’re after. The angle of the camera is also important, in that we need an X rotation of 54.7 degrees and a Y rotation of 45 degrees to get the lines of the terrain correct. The heightmap can be derived from an overhead view of the terrain, but you’ll probably want to draw the heights of the blocks in a drawing package such as GIMP to give you precise colour values on the map.

The ball rolling physics are calculated from the grey-shaded heightmap graphic. We’ve left a debug mode in the code; by changing the debug variable to True, you can see how the marble moves over the terrain from the overhead viewpoint of the heightmap. The player can move the marble left and right with the arrow keys – on a level surface it will gradually slow down if no keys are pressed. If the marble is on a gradient on the heightmap, it will increase speed in the direction of the gradient. If the marble hits a section of black on the heightmap, it falls out of play, and we stop the game.

That takes care of the movement of the marble in two dimensions, but now we have to translate this to the rendered background’s terrain. The way we do this is to translate the Y coordinate of the marble as if the landscape was all at the same level – we multiply it by 0.6 – and then move it down the screen according to the heightmap data, which in this case moves the marble down 1.25 pixels for each shade of colour. We can use an overlay for items the marble always rolls behind, such as the finish flag. And with that, we have the basics of a Marble Madness level.
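
That translation is compact enough to sketch (the function and variable names are assumptions): flatten the marble’s map-space Y, then push it down the screen by the sampled heightmap shade:

# Sketch only: map-space position plus heightmap shade -> screen position.
def terrain_to_screen(map_x, map_y, shade):
    screen_x = map_x
    screen_y = (map_y * 0.6) + (shade * 1.25)  # 1.25 pixels per colour step
    return screen_x, screen_y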


Here’s Mark’s code snippet, which creates a Marble Madness level in Python. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code, go here.

Module Madness

We use the image module from Pygame to sample the colour of the pixel directly under the marble on the heightmap. We also take samples from the left diagonal and the right diagonal to see if there is a change of height. We are only checking for left and right movement, but this sample could be expanded to deal with the two other directions and moving up the gradients, too. Other obstacles and enemies can be added using the same heightmap translations used for the marble, and other overlay objects can be added to the overlay graphic.

Get your copy of Wireframe issue 34

You can read more features like this one in Wireframe issue 34, available now at Tesco, WHSmith, all good independent UK newsagents, and the Raspberry Pi Store, Cambridge.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 34 for free in PDF format.


Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code a homage to Marble Madness | Wireframe #34 appeared first on Raspberry Pi.

Code a Zaxxon-style axonometric level | Wireframe #33

Post Syndicated from Ryan Lambie original https://www.raspberrypi.org/blog/code-a-zaxxon-style-axonometric-level-wireframe-33/

Fly through the space fortress in this 3D retro forced-scrolling arcade sample. Mark Vanstone has the details.


Zaxxon was the first arcade game to use an axonometric viewpoint, which made it look very different from its 2D rivals.

Zaxxon

When Zaxxon was first released by Sega in 1982, it was hailed as a breakthrough thanks to its pseudo-3D graphics. This axonometric projection ensured that Zaxxon looked unlike any other shooter around in arcades.

Graphics aside, Zaxxon offered a subtly different twist on other shooting games of the time, like Defender and Scramble; the player flew over either open space or a huge fortress, where they had to avoid obstacles of varying heights. Players could tell how high they were flying with the aid of an altimeter, and also the shadow beneath their ship (shadows were another of Zaxxon’s innovations). The aim of the game was to get to the end of each level without running out of fuel or getting shot down; if the player did this, they’d encounter an area boss called Zaxxon. Points were awarded for destroying gun turrets and fuel silos, and extra lives could be gained as the player progressed through the levels.


Our Zaxxon homage running in Pygame Zero: fly the spaceship through the fortress walls and obstacles with your cursor keys.

Making our level

For this code sample, we can borrow some of the techniques used in a previous Source Code article about Ant Attack (see Wireframe issue 15) since it also used an isometric display. Although the way the map display is built up is very similar, we’ll use a JSON file to store the map data. If you’ve not come across JSON before, it’s well worth learning about, as a number of web and mobile apps use it, and it can be read by Python very easily. All we need to do is load the JSON file, and Python automatically puts the data into a Python dictionary object for us to use.

In the sample, there’s a short run of map data 40 squares long with blocks for the floor, some low walls, higher walls, and a handful of fuel silos. To add more block types, just add data to the blocktypes area of the JSON file. The codes used in the map data are the index numbers of the blocktypes, so the first blocktypes is index 0, the next index 1, and so on. Our drawMap() function takes care of rendering the data into visual form and blits blocks from the top right to the bottom left of the screen. When the draw loop gets to where the ship is, it draws first the shadow and then the ship a little higher up the screen, depending on the altitude of the ship. The equation to translate the ship’s screen coordinates to a block position on the map is a bit simplistic, but in this case, it does the job well enough.

Cursor keys guide the movement of the spaceship, which is limited by the width of the map and a height of 85 pixels. There’s some extra code to display the ship if it isn’t on the map – for example, at the start, before it reaches the map area. To make the code snippet into a true Zaxxon clone, you’ll have to add some laser fire and explosions, a fuel gauge, and a scoring system, but this code sample should provide the basis you’ll need to get started.
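
Reading the map really is a one-liner once the file is open. This sketch (the file name and key names are assumptions based on the description above) pulls the data into a dictionary ready for drawMap():

import json

# Sketch only: load the level into a plain Python dictionary.
with open("map.json") as map_file:
    map_data = json.load(map_file)

blocktypes = map_data["blocktypes"]  # block definitions, indexed by number
level_map = map_data["map"]          # the 40-square run of blocktype indices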

Code for our Zaxxon homage

Here’s Mark’s code snippet, which creates a Zaxxon-style axonometric level in Python. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code, go here.

Get your copy of Wireframe issue 33

You can read more features like this one in Wireframe issue 33, available now at Tesco, WHSmith, all good independent UK newsagents, and the Raspberry Pi Store, Cambridge.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 33 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code a Zaxxon-style axonometric level | Wireframe #33 appeared first on Raspberry Pi.

Generating REST APIs from data classes in Python

Post Syndicated from James Beswick original https://aws.amazon.com/blogs/compute/generating-rest-apis-from-data-classes-in-python/

This post is courtesy of Robert Enyedi – Senior Research Engineer – AI Labs

Implementing and managing public APIs is greatly simplified by API Gateway. Among the various features of API Gateway, the ability to import API definitions in the Open API format is powerful.

In this post, I show how you can automatically generate REST APIs directly from Python data classes. This method includes a highly automated workflow for exposing Python services as public APIs using the API Gateway. Recent changes in the Python language open the door for full automation of API publishing directly from code.

Open API and API Gateway

The Open API specification is a popular mechanism to declare the structure of REST APIs. It’s language-independent and allows you to describe API operations and their data types. Previously called Swagger, it is a standardization effort with benefits for the service developer and service consumer. It reduces repetitive tasks, increases API quality, and removes the guesswork from calling a service.

Examples shown here use data classes, which are supported in Python 3.7 or higher. Backports of data classes to Python 3.6 exist, but they are beyond the scope of this post.

Python standard type annotations

The type hints syntax, introduced by PEP 484 in Python 3.5 and extended to variable annotations by PEP 526 in Python 3.6, allows the declaration of a type for identifiers. This includes local variables, function and method parameters, return types, and class fields. Type hints improve the readability of the code and provide useful information for tools. This allows your IDE to be more effective at auto-completion, semantic error detection, and refactoring.

Code checkers such as Mypy can catch problems at build time, one of the typical advantages of statically typed languages. Because type annotations in Python are optional and a recent addition to the language, not all of a project’s dependencies carry types, so tooling cannot detect every error condition.
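As a quick illustration of the annotation forms involved (an example of ours, not code from the service):

def greet(name: str, times: int = 1) -> str:   # parameter and return types (PEP 484)
    message: str = f'Hello, {name}! ' * times  # variable annotation (PEP 526)
    return message.strip()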

Python data classes

Data classes are an even more recent addition to the language. Described in PEP 557 and introduced in Python 3.7, they allow a simplified declaration of class data structures useful for storing state. Combined with type hints, a class can be declared with the @dataclass decorator:

from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int

Then the Python implementation can generate:

  1. The constructor:
    Person("Joe", 12)
  2. Comparator methods to allow operations such as:
    Person(name="Joe", age=12) == Person(name="Joe", age=12)
  3. The __repr__() implementation to pretty print the object:
    Person(name='Joe', age=12)

Building an API using data classes

Data classes containing fields with type hints lend themselves to automation of API definitions. This solution uses data classes to generate Open API service definitions with AWS extensions and to create API Gateway configurations.

Similar solutions exist for strictly typed languages like Java, C# or Scala. In Python, this level of automation was not available until version 3.7. This code uses the Dataclasses JSON library to automate the serialization of data classes.

1. Start with the entity definition, in this case a person:

from dataclasses import dataclass
from dataclasses_json import dataclass_json

# dataclass_json sits above dataclass, per the library's convention
@dataclass_json
@dataclass
class Person:
    name: str
    age: int

2. Create one class for the request and another for the response to help payload serialization:

@dataclass_json
@dataclass
class CreatePersonRequest:
    person: Person

@dataclass_json
@dataclass
class CreatePersonResponse:
    person_id: int

3. Next, implement the route handler (this example uses the Flask web framework):

import logging
from flask import Flask, request

app = Flask(__name__)
OPERATION_CREATE_PERSON: str = 'create-person'

@app.route(f'/{OPERATION_CREATE_PERSON}', methods=['POST'])
def create_person():
    payload = request.get_json()
    logging.info(f"Incoming payload for {OPERATION_CREATE_PERSON}: {payload}")
    # get_json() returns a dict, so deserialize it with from_dict()
    person = CreatePersonRequest.from_dict(payload)

The payload is deserialized transparently using the schema derived from the data class definition of Person.
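To make that concrete, here is the same deserialization in isolation, with example values:

payload = {'person': {'name': 'Joe', 'age': 12}}
request_obj = CreatePersonRequest.from_dict(payload)   # the nested Person is rebuilt too
assert request_obj.person.age == 12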

4. To generate a corresponding API definition, enter:

from apispec import APISpec
from apispec.ext.marshmallow import MarshmallowPlugin

# A spec object with a to_dict() method is needed here; the title and
# version are placeholder values
spec = APISpec(title='people-service', version='1.0.0',
               openapi_version='3.0.2', plugins=[MarshmallowPlugin()])

generate_operation(path=OPERATION_CREATE_PERSON,
                   request_schema=CreatePersonRequest.schema(),
                   request_schema_name=CreatePersonRequest.__name__,
                   response_schema=CreatePersonResponse.schema(),
                   response_schema_name=CreatePersonResponse.__name__,
                   spec=spec)

spec_dict = spec.to_dict()

The implementation of generate_operation() makes use of the apispec library to programmatically construct the Open API definition.
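The post does not list generate_operation() itself, so here is a rough sketch of how it might be built on apispec. The operation layout, and registering the marshmallow schemas produced by .schema() as components, are assumptions for illustration rather than the author’s actual helper:

def generate_operation(path, request_schema, request_schema_name,
                       response_schema, response_schema_name, spec):
    # Register each payload schema as a reusable component
    spec.components.schema(request_schema_name, schema=request_schema)
    spec.components.schema(response_schema_name, schema=response_schema)
    # Declare a POST operation; the MarshmallowPlugin resolves the schema
    # objects into $ref pointers at the registered components
    spec.path(path=f'/{path}', operations={
        'post': {
            'requestBody': {'content': {'application/json': {
                'schema': request_schema}}},
            'responses': {'200': {
                'description': 'Success',
                'content': {'application/json': {
                    'schema': response_schema}}}},
        }
    })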

spec_dict now holds the complete Open API specification, which is used to either create or update the API definition. You can also run any Open API tools on this definition, such as SDK generators, mock servers, or documentation generators. There’s a comprehensive catalog of tools maintained at https://openapi.tools/.

As a sensible default, the code generates API operations guarded by API keys supplied with the x-api-key header:

"securitySchemes": {
      "api_key": {
        "type": "apiKey",
        "name": "x-api-key",
        "in": "header"
      }
    }

The spec uses API Gateway extensions to include implementation-specific metadata. The most important is the one linking the API definition to the ECS backend:

"x-amazon-apigateway-integration": {
          "passthroughBehavior": "when_no_match",
          "type": "http_proxy",
          "httpMethod": "POST",
          "uri": "http://myecshost-1234567890.us-east-1.elb.amazonaws.com/create-person"
        }

You can use a similar pattern to connect the gateway to a different service, such as AWS Lambda:

"x-amazon-apigateway-integration": {
          "uri": "arn:aws:apigateway:...:lambda:path/.../functions/arn:aws:lambda:...:...:function:yourLambdaFunction/invocations",
          "responses": {
            "default": {
              "statusCode": "200"
            }
          },
          "passthroughBehavior": "when_no_match",
          "httpMethod": "POST",
          "contentHandling": "CONVERT_TO_TEXT",
          "type": "aws"
        }

For more information on the API Gateway extension to Open API, visit the AWS documentation.

Generating the API using API Gateway

This example uses the boto3 API Gateway API to expose a public API.

1. To create the API, enter the following:

api_gateway_client = boto3.client('apigateway')   # assumes boto3 and json are imported
api_definition = json.dumps(spec_dict, indent=2)
api_gateway_client.import_rest_api(body=api_definition)

2. To update the API, merge the changes into a manually modified API definition (mode='merge'), or completely overwrite the API (mode='overwrite'). It is often safer to merge the API, as follows:

api_gateway_client.put_rest_api(body=api_definition,
                                mode='merge',
                                restApiId=find_api_id(api_gateway_client, api_name))

The find_api_id() helper function looks up the API ID based on its name.
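find_api_id() is not shown in the post; one plausible implementation, using boto3’s paginated get_rest_apis call, might be:

def find_api_id(api_gateway_client, api_name):
    # Page through every REST API and match on the display name
    paginator = api_gateway_client.get_paginator('get_rest_apis')
    for page in paginator.paginate():
        for api in page['items']:
            if api['name'] == api_name:
                return api['id']
    raise ValueError(f'No API named {api_name} was found')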

3. Check the API Gateway dashboard in the AWS Management Console for the new API definition. It shows the API and its resources:

API Gateway dashboard

Now you are ready to issue a test call to the external API to validate its security and functionality. The Open API definition of a manually created or modified API can be exported by various means, including from the stage editor.
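For instance, one way to export a deployed stage’s definition is boto3’s get_export call; the stage name below is a placeholder:

export = api_gateway_client.get_export(
    restApiId=find_api_id(api_gateway_client, api_name),
    stageName='prod',        # placeholder stage name
    exportType='oas30',      # Open API 3.0; use 'swagger' for version 2.0
    accepts='application/json')
print(export['body'].read().decode())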

Validate the API

The correct way to call the API is shown in test_get_dubbing_job_status_API() from test/ondemand_test_call_service.py:

response = _send_request(secure=True,
                         host='<<yourapi>>.execute-api.us-east-1.amazonaws.com',
                         service_port=80,
                         path='sample-generated-api',
                         operation=OPERATION_CREATE_PERSON,
                         request=CreatePersonRequest(Person(name='Jane Doe', age=40)),
                         api_key='<<yourapikey>>')

response_obj = CreatePersonResponse.from_json(response)

assert response_obj.person_id is not None

If you call the API without the api_key parameter, it returns an HTTP 403 code and the error message:

{"message":"Forbidden"}

Conclusion

This post shows how to automatically expose Python services as public APIs directly from the code. With the introduction of Python data classes, it is easy to automate JSON serialization.

Now you can fully automate the API generation and deployment tasks for API Gateway. Introducing a new entity is trivial, and adding a new field to your API requires only writing its definition. You can develop a fully functional API based upon these building blocks.

Learn more from this sample repository, and adapt the code for your projects to achieve a high level of automation for your public APIs.