
CoderDojo Coolest Projects 2017

Post Syndicated from Ben Nuttall original https://www.raspberrypi.org/blog/coderdojo-coolest-projects-2017/

When I heard we were merging with CoderDojo, I was delighted. CoderDojo is a wonderful organisation with a spectacular community, and it’s going to be great to join forces with the team and work towards our common goal: making a difference to the lives of young people by making technology accessible to them.

You may remember that last year Philip and I went along to Coolest Projects, CoderDojo’s annual event at which their global community showcase their best makes. It was awesome! This year a whole bunch of us from the Raspberry Pi Foundation attended Coolest Projects with our new Irish colleagues, and as expected, the projects on show were as cool as can be.

Coolest Projects 2017 attendee

Crowd at Coolest Projects 2017

This year’s coolest projects!

Young maker Benjamin demoed his brilliant RGB LED table tennis ball display for us, and showed off his project tutorial website codemakerbuddy.com, which he built with Python and Flask.

Coolest Projects 2017 LED ping-pong ball display
Coolest Projects 2017 Benjamin and Oly

Next up, Aimee showed us a recipes app she’d made with the MIT App Inventor. It was a really impressive and well thought-out project.

Coolest Projects 2017 Aimee's cook book
Coolest Projects 2017 Aimee's setup

This very successful OpenCV face detection program with hardware installed in a teddy bear was great as well:

Coolest Projects 2017 face detection bear
Coolest Projects 2017 face detection interface
Coolest Projects 2017 face detection database

Helen and Oly’s favourite project involved…live bees!

Coolest Projects 2017 live bees

BEEEEEEEEEEES!

Its creator, 12-year-old Amy, said she wanted to do something to help the Earth. Her project uses various sensors to record data on the bee population in the hive. An adjacent monitor displays the data in a web interface:

Coolest Projects 2017 Aimee's bees

Coolest robots

I enjoyed seeing lots of GPIO Zero projects out in the wild, including this robotic lawnmower made by Kevin and Zach:

Raspberry Pi Lawnmower

Kevin and Zach’s Raspberry Pi lawnmower project with Python and GPIO Zero, shown at CoderDojo Coolest Projects 2017

Philip’s favourite make was a Pi-powered robot you can control with your mind! According to the maker, Laura, it worked really well with Philip because he has no hair.

Philip Colligan on Twitter

This is extraordinary. Laura from @CoderDojo Romania has programmed a mind controlled robot using @Raspberry_Pi @coolestprojects

And here are some pictures of even more cool robots we saw:

Coolest Projects 2017 coolest robot no.1
Coolest Projects 2017 coolest robot no.2
Coolest Projects 2017 coolest robot no.3

Games, toys, activities

Oly and I were massively impressed with the work of Mogamad, Daniel, and Basheerah, who programmed a (borrowed) Amazon Echo to make a voice-controlled text-adventure game using Java and the Alexa API. They’ve inspired me to try something similar using the AIY projects kit and adventurelib!

Coolest Projects 2017 Mogamad, Daniel, Basheerah, Oly
Coolest Projects 2017 Alexa text-based game

Christopher Hill did a brilliant job with his Home Alone LEGO house. He used sensors to trigger lights and sounds to make it look like someone’s at home, like in the film. I should have taken a video – seeing it in action was great!

Coolest Projects 2017 Lego home alone house
Coolest Projects 2017 Lego home alone innards
Coolest Projects 2017 Lego home alone innards closeup

Meanwhile, the Northern Ireland Raspberry Jam group ran a DOTS board activity, which turned their area into a conductive paint hazard zone.

Coolest Projects 2017 NI Jam DOTS activity 1
Coolest Projects 2017 NI Jam DOTS activity 2
Coolest Projects 2017 NI Jam DOTS activity 3
Coolest Projects 2017 NI Jam DOTS activity 4
Coolest Projects 2017 NI Jam DOTS activity 5
Coolest Projects 2017 NI Jam DOTS activity 6

Creativity and ingenuity

We really enjoyed seeing so many young people collaborating, experimenting, and taking full advantage of the opportunity to make real projects. And we loved how huge the range of technologies in use was: people employed all manner of hardware and software to bring their ideas to life.

Philip Colligan on Twitter

Wow! Look at that room full of awesome young people. @coolestprojects #coolestprojects @CoderDojo

Congratulations to the Coolest Projects 2017 prize winners, and to all participants. Here are some of the teams that won in the different categories:

Coolest Projects 2017 winning team 1
Coolest Projects 2017 winning team 2
Coolest Projects 2017 winning team 3

Take a look at the gallery of all winners over on Flickr.

The wow factor

Raspberry Pi co-founder and Foundation trustee Pete Lomas came along to the event as well. Here’s what he had to say:

It’s hard to describe the scale of the event, and photos just don’t do it justice. The first thing that hit me was the sheer excitement of the CoderDojo ninjas [the children attending Dojos]. Everyone was setting up for their time with the project judges, and their pure delight at being able to show off their creations was evident in both halls. Time and time again I saw the ninjas apply their creativity to help save the planet or make someone’s life better, and it’s truly exciting that we are going to help that continue and expand.

Even after 8 hours, enthusiasm wasn’t flagging – the awards ceremony was just brilliant, with ninjas high-fiving the winners on the way to the stage. This speaks volumes about the ethos and vision of the CoderDojo founders, where everyone is a winner just by being part of a community of worldwide friends. It was a brilliant introduction, and if this weekend was anything to go by, our merger certainly is a marriage made in Heaven.

Join this awesome community!

If all this inspires you as much as it did us, consider looking for a CoderDojo near you – and sign up as a volunteer! There’s plenty of time for young people to build up skills and start working on a project for next year’s event. Check out coolestprojects.com for more information.


Manage Instances at Scale without SSH Access Using EC2 Run Command

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/manage-instances-at-scale-without-ssh-access-using-ec2-run-command/

The guest post below, written by Ananth Vaidyanathan (Senior Product Manager for EC2 Systems Manager) and Rich Urmston (Senior Director of Cloud Architecture at Pegasystems), shows you how to use EC2 Run Command to manage a large collection of EC2 instances without having to resort to SSH.

Jeff;


Enterprises often have several managed environments and thousands of Amazon EC2 instances. It’s important to manage systems securely, without the headaches of Secure Shell (SSH). Run Command, part of Amazon EC2 Systems Manager, allows you to run remote commands on instances (or groups of instances using tags) in a controlled and auditable manner. It’s been a nice added productivity boost for Pega Cloud operations, which rely daily on Run Command services.

You can control Run Command access through standard IAM roles and policies, define documents that take input parameters, and control the S3 bucket used to return command output. You can also share your documents with other AWS accounts, or with the public. All in all, Run Command provides a nice set of remote management features.

Better than SSH
Here’s why Run Command is a better option than SSH and why Pegasystems has adopted it as their primary remote management tool:

Run Command Takes Less Time – Securely connecting to an instance over SSH requires several steps: jump boxes to connect through, IP addresses to whitelist, and so on. With Run Command, cloud ops engineers can invoke commands directly from their laptops, and never have to find keys or even instance IDs. Instead, system security relies on AWS authentication, IAM roles, and policies.

Run Command Operations are Fully Audited – With SSH, there is no real control over what engineers can do, nor is there an audit trail. With Run Command, every invoked operation is audited in CloudTrail, including information on the invoking user, the instances on which the command was run, the parameters, and the operation status. You have full control and the ability to restrict what functions engineers can perform on a system.

Run Command has no SSH keys to Manage – Run Command leverages standard AWS credentials, API keys, and IAM policies. Through integration with a corporate auth system, engineers can interact with systems based on their corporate credentials and identity.

Run Command can Manage Multiple Systems at the Same Time – Simple tasks, such as checking the status of a Linux service or retrieving a log file across a fleet of managed instances, are cumbersome using SSH. Run Command allows you to specify a list of instances by IDs or tags, and invokes your command, in parallel, across the specified fleet. This provides great leverage when troubleshooting or managing more than the smallest Pega clusters.
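
As a rough sketch of this tag-based fan-out in practice, the following Python (boto3) snippet sends a shell command to every managed instance carrying a given tag. The tag key, tag value, and bucket name are hypothetical placeholders, not values from the Pega environment:

import boto3

ssm = boto3.client('ssm')

# Target all managed instances with a given tag and run a shell command
# across the fleet in parallel via the AWS-RunShellScript document.
response = ssm.send_command(
    Targets=[{'Key': 'tag:Role', 'Values': ['WebTier']}],   # hypothetical tag
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': ['uptime']},
    OutputS3BucketName='my-run-command-output',             # optional output bucket
    Comment='Fleet-wide uptime check',
)
print(response['Command']['CommandId'])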

Run Command Makes Automating Complex Tasks Easier – Standardizing operational tasks requires detailed procedure documents or scripts describing the exact commands. Managing or deploying these scripts across the fleet is cumbersome. Run Command documents provide an easy way to encapsulate complex functions, and handle document management and access controls. When combined with AWS Lambda, documents provide a powerful automation platform to handle any complex task.

Example – Restarting a Docker Container
Here is an example of a simple document used to restart a Docker container. It takes one parameter: the name of the Docker container to restart. It uses the AWS-RunShellScript method to invoke the command. The output is collected automatically by the service and returned to the caller. For an example of the latest document schema, see Creating Systems Manager Documents.

{
  "schemaVersion":"1.2",
  "description":"Restart the specified docker container.",
  "parameters":{
    "param":{
      "type":"String",
      "description":"(Required) name of the container to restart.",
      "maxChars":1024
    }
  },
  "runtimeConfig":{
    "aws:runShellScript":{
      "properties":[
        {
          "id":"0.aws:runShellScript",
          "runCommand":[
            "docker restart {{param}}"
          ]
        }
      ]
    }
  }
}
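
Assuming a document like the one above has been created in your account (the name RestartDockerContainer below is hypothetical), a minimal boto3 sketch of invoking it and collecting the output might look like this. The same calls work unchanged inside an AWS Lambda function, which is how documents can be chained into larger automation:

import boto3

ssm = boto3.client('ssm')

# Invoke the custom document against one instance, passing the container
# name through the 'param' parameter defined in the document above.
result = ssm.send_command(
    InstanceIds=['i-0cf0e84'],              # instance ID from the walkthrough below
    DocumentName='RestartDockerContainer',  # hypothetical name for the document above
    Parameters={'param': ['pega-web']},
)
command_id = result['Command']['CommandId']

# Fetch the collected output once the command has finished.
output = ssm.get_command_invocation(CommandId=command_id, InstanceId='i-0cf0e84')
print(output['Status'], output.get('StandardOutputContent', ''))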

Putting Run Command into practice at Pegasystems
The Pegasystems provisioning system sits on AWS CloudFormation, which is used to deploy and update Pega Cloud resources. Layered on top of it is the Pega Provisioning Engine, a serverless, Lambda-based service that manages a library of CloudFormation templates and Ansible playbooks.

A Configuration Management Database (CMDB) tracks all the configuration details and history of every deployment and update, and lays out its data using a hierarchical directory naming convention. The following diagram shows how the various systems are integrated:

For cloud system management, Pega operations uses a command-line tool called cuttysh and a graphical tool based on the Pega 7 platform, called the Pega Operations Portal. Both tools allow you to browse the CMDB of deployed environments, view configuration settings, and interact with deployed EC2 instances through Run Command.

CLI Walkthrough
Here is a CLI walkthrough for looking into a customer deployment and interacting with instances using Run Command.

Launching the cuttysh tool brings you to the root of the CMDB and a list of the provisioned customers:

% cuttysh
d CUSTA
d CUSTB
d CUSTC
d CUSTD

You interact with the CMDB using standard Linux shell commands, such as cd, ls, cat, and grep. Items prefixed with s are services that have viewable properties. Items prefixed with d are navigable subdirectories in the CMDB hierarchy.

In this example, change directories into customer CUSTB’s portion of the CMDB hierarchy, and then further into a provisioned Pega environment called env1, under the Dev network. The tool displays the artifacts that are provisioned for that environment. These entries map to provisioned CloudFormation templates.

> cd CUSTB
/ROOT/CUSTB/us-east-1 > cd DEV/env1

The ls -l command shows the version of the provisioned resources. These version numbers map back to source control–managed artifacts for the CloudFormation, Ansible, and other components that compose a version of the Pega Cloud.

/ROOT/CUSTB/us-east-1/DEV/env1 > ls -l
s 1.2.5 RDSDatabase 
s 1.2.5 PegaAppTier 
s 7.2.1 Pega7 

Now, use Run Command to interact with the deployed environments. To do this, use the attach command and specify the service with which to interact. In the following example, you attach to the Pega Web Tier. Using the information in the CMDB and instance tags, the CLI finds the corresponding EC2 instances and displays some basic information about them. This deployment has three instances.

/ROOT/CUSTB/us-east-1/DEV/env1 > attach PegaWebTier
 # ID         State  Public Ip    Private Ip  Launch Time
 0 i-0cf0e84 running 52.63.216.42 10.96.15.70 2017-01-16 
 1 i-0043c1d running 53.47.191.22 10.96.15.43 2017-01-16 
 2 i-09b879e running 55.93.118.27 10.96.15.19 2017-01-16 

From here, you can use the run command to invoke Run Command documents. In the following example, you run the docker-ps document against instance 0 (the first one on the list). EC2 executes the command and returns the output to the CLI, which in turn shows it.

/ROOT/CUSTB/us-east-1/DEV/env1 > run 0 docker-ps
. . 
CONTAINER ID IMAGE             CREATED      STATUS        NAMES
2f187cc38c1  pega-7.2         10 weeks ago  Up 8 weeks    pega-web

Using the same command and some of the other documents that have been defined, you can restart a Docker container or even pull back the contents of a file to your local system. When you get a file, Run Command also leaves a copy in an S3 bucket in case you want to pass the link along to a colleague.

/ROOT/CUSTB/us-east-1/DEV/env1 > run 0 docker-restart pega-web
..
pega-web

/ROOT/CUSTB/us-east-1/DEV/env1 > run 0 get-file /var/log/cfn-init-cmd.log
. . . . . 
get-file

Data has been copied locally to: /tmp/get-file/i-0563c9e/data
Data is also available in S3 at: s3://my-bucket/CUSTB/cuttysh/get-file/data

Now, leverage the Run Command ability to do more than one thing at a time. In the following example, you attach to a deployment with three running instances and want to see the uptime for each instance. Using the par (parallel) option for run, the CLI tells Run Command to execute the uptime document on all instances in parallel.

/ROOT/CUSTB/us-east-1/DEV/env1 > run par uptime
 …
Output for: i-006bdc991385c33
 20:39:12 up 15 days, 3:54, 0 users, load average: 0.42, 0.32, 0.30

Output for: i-09390dbff062618
 20:39:12 up 15 days, 3:54, 0 users, load average: 0.08, 0.19, 0.22

Output for: i-08367d0114c94f1
 20:39:12 up 15 days, 3:54, 0 users, load average: 0.36, 0.40, 0.40

Commands are complete.
/ROOT/PEGACLOUD/CUSTB/us-east-1/PROD/prod1 > 

Summary
Run Command improves productivity by giving you faster access to systems and the ability to run operations across a group of instances. Pega Cloud operations has integrated Run Command with other operational tools to provide a clean and secure method for managing systems. This greatly improves operational efficiency, and gives greater control over who can do what in managed deployments. The Pega continual improvement process regularly assesses why operators need access, and turns those operations into new Run Command documents to be added to the library. In fact, their long-term goal is to stop deploying cloud systems with SSH enabled.

If you have any questions or suggestions, please leave a comment for us!

— Ananth and Rich

Pirate Bay Facilitates Piracy and Can be Blocked, Top EU Court Rules

Post Syndicated from Ernesto original https://torrentfreak.com/pirate-bay-facilitates-piracy-and-can-be-blocked-top-eu-court-rules-170614/

In 2014, the Court of The Hague handed down its decision in a long-running case which had previously forced two Dutch ISPs, Ziggo and XS4ALL, to block The Pirate Bay.

The Court ruled against local anti-piracy outfit BREIN, concluding that the blockade was ineffective and restricted the ISPs’ entrepreneurial freedoms.

The Pirate Bay was unblocked by all local ISPs while BREIN took the matter to the Supreme Court, which subsequently referred the case to the EU Court of Justice, seeking further clarification.

After a careful review of the case, the Court of Justice today ruled that The Pirate Bay can indeed be blocked.

While the operators don’t share anything themselves, they knowingly provide users with a platform to share copyright-infringing links. This can be seen as “an act of communication” under the EU Copyright Directive, the Court concludes.

“Whilst it accepts that the works in question are placed online by the users, the Court highlights the fact that the operators of the platform play an essential role in making those works available,” the Court explains in a press release (pdf).

According to the ruling, The Pirate Bay indexes torrents in a way that makes it easy for users to find infringing content while the site makes a profit. The Pirate Bay is aware of the infringements, and although moderators sometimes remove “faulty” torrents, infringing links remain online.

“In addition, the same operators expressly display, on blogs and forums accessible on that platform, their intention of making protected works available to users, and encourage the latter to make copies of those works,” the Court writes.

The ruling means that there are no major obstacles for the Dutch Supreme Court to issue an ISP blockade, but a final decision in the underlying case will likely take a few more months.

A decision at the European level is important, as it may also affect court orders in other countries where The Pirate Bay and other torrent sites are already blocked, including Austria, Belgium, Finland, Italy, and its home turf Sweden.

Despite the negative outcome, the Pirate Bay team is not overly worried.

“Copyright holders will remain stubborn and fight to hold onto a dying model. Clueless and corrupt law makers will put corporate interests before the public’s. Their combined jackassery is what keeps TPB alive,” TPB’s plc365 tells TorrentFreak.

“The reality is that regardless of the ruling, nothing substantial will change. Maybe more ISPs will block TPB. More people will use one of the hundreds of existing proxies, and even more new ones will be created as a result.”

Pirate Bay moderator “Xe” notes that while it’s an extra barrier to access the site, blockades will eventually help people to get around censorship efforts, which are not restricted to TPB.

“They’re an issue for everyone in the sense that they’re an obstacle which has to be overcome. But learning how to work around them isn’t hard and knowing how to work around them is becoming a core skill for everyone who uses the Internet.

“Blockades are not a major issue for the site in the sense that they’re nothing new: we’ve long since adapted to them. We serve the needs of millions of people every day in spite of them,” Xe adds.


UK Police Claim Success in Keeping Gambling Ads off Pirate Sites

Post Syndicated from Andy original https://torrentfreak.com/uk-police-claim-success-in-keeping-gambling-ads-off-pirate-sites-170614/

Over the past several years, there has been a major effort by entertainment industry groups to cut off revenue streams to ‘pirate’ sites. The theory is that if sites cannot generate funds, their operators will eventually lose interest.

Since advertising is a key money earner for any website, significant resources have been expended trying to keep ads off sites that directly or indirectly profit from infringement. It’s been a multi-pronged affair, with agencies being encouraged to do the right thing and brands warned that having their ads appear on pirate sites does nothing for their image.

One sector that has trailed behind most is the gambling industry. Until fairly recently, ads for some of the UK’s largest bookmakers were a regular feature on many large pirate sites, either embedded in pages or, more often than not, appearing via popup or pop-under spreads. Now, however, a significant change is being reported.

According to the City of London Police’s Intellectual Property Crime Unit (PIPCU), over the past 12 months there has been an 87% drop in adverts for licensed gambling operators being displayed on infringing websites.

The research was carried out by whiteBULLET, a brand safety and advertising solutions company which helps advertisers to assess whether placing an advert on a particular URL will cause it to appear on a pirate site.

PIPCU says that licensed gambling operators have an obligation to “keep crime out of gambling” due to their commitments under the Gambling Act 2005. However, the Gambling Commission, the UK’s gambling regulatory body, has recently been taking additional steps to tackle the problem.

In September 2015, the Commission consulted on amendments (pdf) to licensing conditions that would compel licensees to ensure that advertisements “placed by themselves and others” do not appear on websites providing unauthorized access to copyrighted content.

After the consultation was published in May 2016 (pdf), all respondents agreed in principle that gambling operators should not advertise on pirate sites. A month later, the Commission said it would ban the placement of gambling ads on such platforms.

When the new rules came into play last October, 40 gambling companies (including Bet365, Coral and Sky Bet, who had previously been called out for displaying ads on pirate sites) were making use of PIPCU’s ‘Infringing Website List’, a database of sites that police claim are actively involved in piracy.

Speaking yesterday, acting Detective Superintendent Peter Ratcliffe, Head of the Police Intellectual Property Crime Unit (PIPCU), welcomed the ensuing reduction in ad placement on ‘pirate’ domains.

“The success of a strong relationship built between PIPCU and The Gambling Commission can be seen by these figures. This is a fantastic example of a joint working initiative between police and an industry regulator,” Ratcliffe said.

“We commend the 40 gambling companies who are already using the Infringing Website List and encourage others to sign up. We will continue to encourage all UK advertisers to become a member of the Infringing Website List to ensure they’re not inadvertently funding criminal websites.”


How to Deploy Local Administrator Password Solution with AWS Microsoft AD

Post Syndicated from Dragos Madarasan original https://aws.amazon.com/blogs/security/how-to-deploy-local-administrator-password-solution-with-aws-microsoft-ad/

Local Administrator Password Solution (LAPS) from Microsoft simplifies password management by allowing organizations to use Active Directory (AD) to store unique passwords for computers. Typically, an organization might reuse the same local administrator password across the computers in an AD domain. However, this approach represents a security risk because it can be exploited during lateral escalation attacks. LAPS solves this problem by creating unique, randomized passwords for the Administrator account on each computer and storing them encrypted in AD.

Deploying LAPS with AWS Microsoft AD requires the following steps:

  1. Install the LAPS binaries on instances joined to your AWS Microsoft AD domain. The binaries add additional client-side extension (CSE) functionality to the Group Policy client.
  2. Extend the AWS Microsoft AD schema. LAPS requires new AD attributes to store an encrypted password and its expiration time.
  3. Configure AD permissions and delegate the ability to retrieve the local administrator password for IT staff in your organization.
  4. Configure Group Policy on instances joined to your AWS Microsoft AD domain to enable LAPS. This configures the Group Policy client to process LAPS settings and uses the binaries installed in Step 1.

The following diagram illustrates the setup that I will be using throughout this post and the associated tasks to set up LAPS. Note that the AWS Directory Service directory is deployed across multiple Availability Zones, and monitoring automatically detects and replaces domain controllers that fail.

Diagram illustrating this blog post's solution

In this blog post, I explain the prerequisites to set up Local Administrator Password Solution, demonstrate the steps involved to update the AD schema on your AWS Microsoft AD domain, show how to delegate permissions to IT staff and configure LAPS via Group Policy, and demonstrate how to retrieve the password using the graphical user interface or with Windows PowerShell.

This post assumes you are familiar with Lightweight Directory Access Protocol Data Interchange Format (LDIF) files and AWS Microsoft AD. If you need more of an introduction to Directory Service and AWS Microsoft AD, see How to Move More Custom Applications to the AWS Cloud with AWS Directory Service, which introduces working with schema changes in AWS Microsoft AD.

Prerequisites

In order to implement LAPS, you must use AWS Directory Service for Microsoft Active Directory (Enterprise Edition), also known as AWS Microsoft AD. Any instance on which you want to configure LAPS must be joined to your AWS Microsoft AD domain. You also need a Management instance on which you install the LAPS management tools.

In this post, I use an AWS Microsoft AD domain called example.com that I have launched in the EU (London) region. To see the regions in which Directory Service is available, see AWS Regions and Endpoints.

Screenshot showing the AWS Microsoft AD domain example.com used in this blog post

In addition, you must have at least two instances launched in the same region as the AWS Microsoft AD domain. To join the instances to your AWS Microsoft AD domain, you have two options:

  1. Use the Amazon EC2 Systems Manager (SSM) domain join feature. To learn more about how to set up domain join for EC2 instances, see joining a Windows Instance to an AWS Directory Service Domain.
  2. Manually configure the DNS server addresses in the Internet Protocol version 4 (TCP/IPv4) settings of the network card to use the AWS Microsoft AD DNS addresses (172.31.9.64 and 172.31.16.191, for this blog post) and perform a manual domain join.

For the purpose of this post, my two instances are:

  1. A Management instance on which I will install the management tools that I have tagged as Management.
  2. A Web Server instance on which I will be deploying the LAPS binary.

Screenshot showing the two EC2 instances used in this post

Implementing the solution

 

1. Install the LAPS binaries on instances joined to your AWS Microsoft AD domain by using EC2 Run Command

LAPS binaries come in the form of an MSI installer and can be downloaded from the Microsoft Download Center. You can install the LAPS binaries manually, with an automation service such as EC2 Run Command, or with your existing software deployment solution.

For this post, I will deploy the LAPS binaries on my Web Server instance (i-0b7563d0f89d3453a) by using EC2 Run Command:

  1. While signed in to the AWS Management Console, choose EC2. In the Systems Manager Services section of the navigation pane, choose Run Command.
  2. Choose Run a command, and from the Command document list, choose AWS-InstallApplication.
  3. From Target instances, choose the instance on which you want to deploy the LAPS binaries. In my case, I will be selecting the instance tagged as Web Server. If you do not see any instances listed, make sure you have met the prerequisites for Amazon EC2 Systems Manager (SSM) by reviewing the Systems Manager Prerequisites.
  4. For Action, choose Install, and then stipulate the following values:
    • Parameters: /quiet
    • Source: https://download.microsoft.com/download/C/7/A/C7AAD914-A8A6-4904-88A1-29E657445D03/LAPS.x64.msi
    • Source Hash: f63ebbc45e2d080630bd62a195cd225de734131a56bb7b453c84336e37abd766
    • Comment: LAPS deployment

Leave the other options with the default values and choose Run. The AWS Management Console will return a Command ID, which will initially have a status of In Progress. It should take less than 5 minutes to download and install the binaries, after which the Command ID will update its status to Success.

Status showing the binaries have been installed successfully

If the Command ID runs for more than 5 minutes or returns an error, it might indicate a problem with the installer. To troubleshoot, review the steps in Troubleshooting Systems Manager Run Command.

To verify the binaries have been installed successfully, open Control Panel and review the recently installed applications in Programs and Features.

Screenshot of Control Panel that confirms LAPS has been installed successfully

You should see an entry for Local Administrator Password Solution with a version of 6.2.0.0 or newer.
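
The same step can also be scripted. Here is a hedged boto3 sketch that mirrors the console walkthrough, assuming the AWS-InstallApplication document accepts the action, parameters, source, and sourceHash parameters shown above:

import boto3

ssm = boto3.client('ssm')

# Deploy the LAPS MSI to the Web Server instance via Run Command.
response = ssm.send_command(
    InstanceIds=['i-0b7563d0f89d3453a'],
    DocumentName='AWS-InstallApplication',
    Parameters={
        'action': ['Install'],
        'parameters': ['/quiet'],
        'source': ['https://download.microsoft.com/download/C/7/A/C7AAD914-A8A6-4904-88A1-29E657445D03/LAPS.x64.msi'],
        'sourceHash': ['f63ebbc45e2d080630bd62a195cd225de734131a56bb7b453c84336e37abd766'],
    },
    Comment='LAPS deployment',
)
print(response['Command']['CommandId'])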

2. Extend the AWS Microsoft AD schema

In the previous section, I used EC2 Run Command to install the LAPS binaries on an EC2 instance. Now, I am ready to extend the schema in an AWS Microsoft AD domain. Extending the schema is a requirement because LAPS relies on new AD attributes to store the encrypted password and its expiration time.

In an on-premises AD environment, you would update the schema by running the Update-AdmPwdADSchema Windows PowerShell cmdlet with schema administrator credentials. Because AWS Microsoft AD is a managed service, I do not have permissions to update the schema directly. Instead, I will update the AD schema from the Directory Service console by importing an LDIF file. If you are unfamiliar with schema updates or LDIF files, see How to Move More Custom Applications to the AWS Cloud with AWS Directory Service.

To make things easier for you, I am providing you with a sample LDIF file that contains the required AD schema changes. Using Notepad or a similar text editor, open the SchemaChanges-0517.ldif file and update the values of dc=example,dc=com with your own AWS Microsoft AD domain and suffix.

After I update the LDIF file with my AWS Microsoft AD details, I import it by using the AWS Management Console:

  1. On the Directory Service console, select your Microsoft AD directory from the list of directories by choosing its identifier (it will look something like d-534373570ea).
  2. On the Directory details page, choose the Schema extensions tab and choose Upload and update schema.
    Screenshot showing the "Upload and update schema" option
  3. When prompted for the LDIF file that contains the changes, choose the sample LDIF file.
  4. In the background, the LDIF file is validated for errors, and a backup of the directory is created for recovery purposes. Updating the schema might take a few minutes, and the status will change to Updating Schema.

Screenshot showing the schema updates in progress
When the process has completed, the status of Completed will be displayed, as shown in the following screenshot.

Screenshot showing the process has completed

If the LDIF file contains errors or the schema extension fails, the Directory Service console will generate an error code and additional debug information. To help troubleshoot error messages, see Schema Extension Errors.
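
If you prefer to script the import rather than use the console, the Directory Service API exposes the same operation through StartSchemaExtension. The following boto3 sketch assumes the LDIF file sits in the working directory and reuses the directory identifier from this walkthrough:

import boto3

ds = boto3.client('ds')

# Read the LDIF file prepared earlier.
with open('SchemaChanges-0517.ldif') as f:
    ldif = f.read()

response = ds.start_schema_extension(
    DirectoryId='d-534373570ea',               # directory identifier from the console
    CreateSnapshotBeforeSchemaExtension=True,  # take a backup first, as the console does
    LdifContent=ldif,
    Description='Add LAPS attributes (ms-Mcs-AdmPwd, ms-Mcs-AdmPwdExpirationTime)',
)
print(response['SchemaExtensionId'])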

The sample LDIF file triggers AWS Microsoft AD to perform the following actions:

  1. Create the ms-Mcs-AdmPwd attribute, which stores the encrypted password.
  2. Create the ms-Mcs-AdmPwdExpirationTime attribute, which stores the time of the password’s expiration.
  3. Add both attributes to the Computer class.

3. Configure AD permissions

In the previous section, I updated the AWS Microsoft AD schema with the required attributes for LAPS. I am now ready to configure the permissions for administrators to retrieve the password and for computer accounts to update their password attribute.

As part of configuring AD permissions, I grant computers the ability to update their own password attribute and specify which security groups have permissions to retrieve the password from AD. As part of this process, I run Windows PowerShell cmdlets that are not installed by default on Windows Server.

Note: To learn more about Windows PowerShell and the concept of a cmdlet (pronounced “command-let”), go to Getting Started with Windows PowerShell.

Before getting started, I need to set up the required tools for LAPS on my Management instance, which must be joined to the AWS Microsoft AD domain. I will be using the same LAPS installer that I downloaded from the Microsoft LAPS website. In my Management instance, I have manually run the installer by clicking the LAPS.x64.msi file. On the Custom Setup page of the installer, under Management Tools, for each option I have selected Install on local hard drive.

Screenshot showing the required management tools

In the preceding screenshot, the features are:

  • The fat client UI – A simple user interface for retrieving the password (I will use it at the end of this post).
  • The Windows PowerShell module – Needed to run the commands in the next sections.
  • The GPO Editor templates – Used to configure Group Policy objects.

The next step is to grant computers in the Computers OU the permission to update their own attributes. While connected to my Management instance, I go to the Start menu and type PowerShell. In the list of results, right-click Windows PowerShell and choose Run as administrator and then Yes when prompted by User Account Control.

In the Windows PowerShell prompt, I type the following command.

Import-Module AdmPwd.PS

Set-AdmPwdComputerSelfPermission -OrgUnit "OU=Computers,OU=MyMicrosoftAD,DC=example,DC=com"

To grant the administrator group called Admins the permission to retrieve the computer password, I run the following command in the Windows PowerShell prompt I previously started.

Import-Module AdmPwd.PS

Set-AdmPwdReadPasswordPermission -OrgUnit "OU=Computers,OU=MyMicrosoftAD,DC=example,DC=com" -AllowedPrincipals "Admins"

4. Configure Group Policy to enable LAPS

In the previous section, I deployed the LAPS management tools on my management instance, granted the computer accounts the permission to self-update their local administrator password attribute, and granted my Admins group permissions to retrieve the password.

Note: The following section addresses the Group Policy Management Console and Group Policy objects. If you are unfamiliar with or wish to learn more about these concepts, go to Get Started Using the GPMC and Group Policy for Beginners.

I am now ready to enable LAPS via Group Policy:

  1. On my Management instance (i-03b2c5d5b1113c7ac), I have installed the Group Policy Management Console (GPMC) by running the following command in Windows PowerShell.
Install-WindowsFeature -Name GPMC
  2. Next, I have opened the GPMC and created a new Group Policy object (GPO) called LAPS GPO.
  3. In the Local Group Policy Editor, I navigate to Computer Configuration > Policies > Administrative Templates > LAPS. I have configured the settings using the values in the following table.

Setting | State | Options
Password Settings | Enabled | Complexity: large letters, small letters, numbers, specials
Do not allow password expiration time longer than required by policy | Enabled | N/A
Enable local admin password management | Enabled | N/A

  4. Next, I need to link the GPO to an organizational unit (OU) in which my machine accounts sit. In your environment, I recommend testing the new settings on a test OU and then deploying the GPO to production OUs.

Note: If you choose to create a new test organizational unit, you must create it in the OU that AWS Microsoft AD delegates to you to manage. For example, if your AWS Microsoft AD directory name were example.com, the test OU path would be example.com/example/Computers/Test.

  5. To test that LAPS works, I need to make sure the computer has received the new policy by forcing a Group Policy update. While connected to the Web Server instance (i-0b7563d0f89d3453a) using Remote Desktop, I open an elevated administrative command prompt and run the following command: gpupdate /force. I can check whether the policy has been applied by running the command gpresult /r | findstr /c:"LAPS GPO", where LAPS GPO is the name of the GPO created in the second step.
  6. Back on my Management instance, I can then launch the LAPS interface from the Start menu and use it to retrieve the password (as shown in the following screenshot). Alternatively, I can run the Get-ADComputer Windows PowerShell cmdlet to retrieve the password.
Get-ADComputer [YourComputerName] -Properties ms-Mcs-AdmPwd | select name, ms-Mcs-AdmPwd

Screenshot of the LAPS UI, which you can use to retrieve the password

Summary

In this blog post, I demonstrated how you can deploy LAPS with an AWS Microsoft AD directory. I then showed how to install the LAPS binaries by using EC2 Run Command. Using the sample LDIF file I provided, I showed you how to extend the schema, which is a requirement because LAPS relies on new AD attributes to store the encrypted password and its expiration time. Finally, I showed how to complete the LAPS setup by configuring the necessary AD permissions and creating the GPO that starts the LAPS password change.

If you have comments about this post, submit them in the “Comments” section below. If you have questions about or issues implementing this solution, please start a new thread on the Directory Service forum.

– Dragos

EtherApe – Graphical Network Monitor

Post Syndicated from Darknet original http://feedproxy.google.com/~r/darknethackers/~3/DxSK15EgI5k/

EtherApe is a graphical network monitor for Unix modelled after etherman. Featuring link layer, IP and TCP modes, it displays network activity graphically. Hosts and links change in size with traffic, and protocols are colour coded. It supports Ethernet, FDDI, Token Ring, ISDN, PPP, SLIP and WLAN devices, plus several encapsulation formats. It can…

Read the full post at darknet.org.uk

New Features for IAM Policy Summaries – Resource Summaries

Post Syndicated from Joy Chatterjee original https://aws.amazon.com/blogs/security/new-features-for-iam-policy-summaries-resource-summaries/

In March, we introduced policy summaries, which make it easier for you to understand the permissions in your AWS Identity and Access Management (IAM) policies. Today, we added three new features to policy summaries to improve the experience of understanding and troubleshooting your policies. First, we added resource summaries for you to see the resources defined in your policies. Second, you can now see which services and actions are implicitly denied by a policy. This allows you to see the remaining actions available for a service with limited access. Third, it is now easier for you to identify potential typos in your policies because you can now see which services and actions are unrecognized by IAM. Today, Tuesday, and Wednesday, I will demonstrate these three new features. In today’s post, I review resource summaries.

Resource summaries

Policy summaries now show you the resources defined in a policy. Previously, policy summaries displayed either All for all resources, the Amazon Resource Name (ARN) for one resource, or Multiple for multiple resources specified in the policy. Starting today, you can see the resource type, region, and account ID to summarize the list of resources defined for each action in a policy. Let’s review a policy summary that specifies multiple resources.

The following policy grants access to three Amazon S3 buckets with multiple conditions.

{
 "Version":"2012-10-17",
 "Statement":[
   {
     "Effect":"Allow",
     "Action":["s3:PutObject","s3:PutObjectAcl"],
     "Resource":["arn:aws:s3:::Apple_bucket"],
     "Condition":{"StringEquals":{"s3:x-amz-acl":["public-read"]}}
   },{
     "Effect":"Allow",
     "Action":["s3:PutObject","s3:PutObjectAcl"],
     "Resource":["arn:aws:s3:::Orange_bucket"],
     "Condition":{"StringEquals":{"s3:prefix":["custom", "test"]}}
   },{
     "Effect":"Allow",
     "Action":["s3:PutObject","s3:PutObjectAcl"],
     "Resource":["arn:aws:s3:::Purple_bucket"],
     "Condition":{"DateGreaterThan":{"aws:CurrentTime":"2016-10-31T05:00:00Z"}}
   }
 ]
}
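
To compare a summary against the raw JSON without opening the console, you can pull a policy's default version programmatically. Here is a small boto3 sketch, with a hypothetical policy ARN:

import boto3

iam = boto3.client('iam')

# Fetch the default (active) version of a customer managed policy.
arn = 'arn:aws:iam::123456789012:policy/MultiBucketS3Access'  # hypothetical ARN
policy = iam.get_policy(PolicyArn=arn)['Policy']
version = iam.get_policy_version(PolicyArn=arn, VersionId=policy['DefaultVersionId'])
print(version['PolicyVersion']['Document'])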

The policy summary (see the following screenshot) shows Limited: Write, Permissions management actions for S3 on Multiple resources and request conditions. Limited means that some, but not all, of the actions in the Write and Permissions management categories are granted in the policy.

Screenshot of the policy summary

If I choose S3, I see that the actions defined in the policy grant access to multiple resources, as shown in the following screenshot. To see the resource summary, I can choose either PutObject or PutObjectAcl.

Screenshot showing that the actions defined in the policy grant access to multiple resources

I choose PutObjectAcl to see the resources and conditions defined in the policy for this S3 action. If the policy has one condition, I see it in the policy summary. I can view multiple conditions in the JSON.

Screenshot showing the resources and the conditions defined in the policy for this S3 action

As the preceding screenshot shows, the PutObjectAcl action has access to three S3 buckets with respective request conditions.

Summary

Policy summaries make it easy to view and understand the permissions and resources defined in a policy without having to view the associated JSON. To see policy summaries in your AWS account, sign in to the IAM console and navigate to any policy on the Policies page of the IAM console or the Permissions tab on a user’s page. On Tuesday, I will review the benefits of viewing the services and actions not granted in a policy.

If you have comments about this post, submit them in the “Comments” section below. If you have questions about or suggestions for this solution, start a new thread on the IAM forum.

– Joy

Make with Minecraft Pi in The MagPi 58

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/magpi-58/

Hey folks, Rob here! What a busy month it’s been at The MagPi HQ. While we’ve been replying to your tweets, answering questions on YouTube and fiddling with our AIY Voice Project kits, we’ve managed to put together a whole new magazine for you, with issue 58 of the official Raspberry Pi magazine out in stores today.

The front cover of The MagPi 58

The MagPi 58 features our latest Minecraft Pi hacks!

Minecraft Pi

The MagPi 58 is all about making with Minecraft Pi. We’ve got cool projects and hacks that let you take a selfie and display it in the Minecraft world, play music with Steve jumping on a giant piano, and use special cards to switch skins in an instant. It’s the perfect supplement to our Hacking and Making in Minecraft book!

AIY Voice Projects

It’s been great to see everyone getting excited over the last issue of the magazine, and we love seeing your pictures and videos of your AIY Voice projects. In this issue we’ve included loads of ideas to keep you going with the AIY Projects kit. Don’t forget to send us what you’ve made on Twitter!

Issue 57 of The MagPi, showing the Google AIY Voice Projects Kit

Show us what you’ve made with your AIY Voice Projects Kit

The best of the rest in The MagPi 58

We’ve also got our usual selection of reviews, tutorials, and projects. This includes guides to making file servers and electronic instruments, along with our review of Adafruit’s Joy Bonnet handheld gaming kit.

A page from The MagPi 58 showing information on 'Getting Started with GUIs'

You can get started with GUIs in The MagPi 58

You can grab the latest issue in stores in the UK right now, from WHSmith, Sainsburys, Asda, and Tesco. Copies will be arriving very soon in US stores, including Barnes & Noble and Micro Center. You can also get a copy online from our store, or digitally via our Android or iOS app. Don’t forget, there’s always the free PDF as well.

We hope you enjoy the issue! Now if you’ll excuse us, we need a nap after all the excitement!


Now Anyone Can Embed a Pirate Movie in a Website

Post Syndicated from Andy original https://torrentfreak.com/now-anyone-can-embed-a-pirate-movie-in-a-website-170522/

While torrents are still the go-to source for millions of users seeking free online media, people are increasingly seeking the immediacy and convenience of web-based streaming.

As a result, hundreds of websites have appeared in recent years, offering Netflix-inspired interfaces that provide an enhanced user experience over the predominantly text-based approach utilized by most torrent sites.

While there hasn’t been a huge amount of innovation in either field recently, a service that raised its head during recent weeks is offering something new and potentially significant, if it continues to deliver on its promises without turning evil.

Vodlocker.to is the latest in a long list of sites using the Vodlocker name, which is bound to cause some level of confusion. However, what this Vodlocker variant offers is a convenient way for users to not only search for and find movies hosted on the Internet, but stream them instantly – with a twist.

After entering a movie’s IMDb code (the one starting ‘tt’) in a box on the page, Vodlocker quickly searches for the movie on various online hosting services, including Google Drive.

Entering the IMDb code

“We believe the complexity of uploading a video has become unnecessary, so we have created much like Google, an automated crawler that visits millions of pages every day to find all videos on the internet,” the site explains.

As shown in the image above, the site takes the IMDb number and generates code. That allows the user to embed an HTML5 video player in their own website, which plays the movie in question. We tested around a dozen movies with a 100% success rate, with search times ranging from a couple of seconds to around 20 seconds maximum.

A demo on the site shows exactly how the embed code currently performs, with the video player offering the usual controls such as play and pause, with a selector for quality and volume levels. The usual ‘full screen’ button sits in the bottom right corner.

The player can be embedded anywhere

Near the top of the window are options for selecting different sources for the video, should it become unplayable or if a better quality version is required. Interestingly, should one of those sources be Google Video, Vodlocker says its player offers Chromecast and subtitle support.

“Built-in chromecast plugin streams free HD movies/tv shows from your website to your TV via Google Chromecast. Built-in opensubtitles.org plugin finds subtitles in all languages and auto-selects your language,” the site reports.

In addition to a link-checker that aims to exclude broken links (missing sources), the service also pulls movie-related artwork from IMDb, to display while the selected movie is being prepared for streaming.

The site is already boasting a “massive database” of movies, which will make it of immediate use to thousands of websites that might want to embed movies or TV shows in their web pages.

As long as Vodlocker can cope with the load, this could effectively spawn a thousand new ‘pirate’ websites overnight, but the service generally seems more suited to smaller, blog-like sites that might want to display a smaller selection of titles.

That being said, it’s questionable whether a site would seek to become entirely reliant on a service like this. While the videos it indexes are more decentralized, the service itself could be shut down in the blink of an eye, at which point every link stops working.

It’s also worth noting that the service uses IFrame tags, which some webmasters might feel uncomfortable about deploying on their sites due to security concerns.

The New Vodlocker API demo can be found here, for as long as it lasts.


AWS Knowledge Center Video: Preparing to Send a Snowball Back to AWS

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/aws-knowledge-center-video-preparing-to-send-a-snowball-back-to-aws/

Do you know about the AWS Support Knowledge Center? It contains answers to some of the most frequently asked questions and other requests asked of our support team. Many of the answers even include a short video that serves to illustrate the process or to provide additional info on the topic.

For example, I recently stepped in to our studio and created a new video called Preparing to Send a Snowball Back to AWS. In 90 action-packed seconds, this video shows you how to power down the Snowball, stow the cables, lock the back panel, and verify that the proper return address is on the built-in display:

Visit the Knowledge Center to see other videos and to find answers to other questions that you might have about AWS.

Jeff;

 

RIAA Says Artists Don’t Need “Moral Rights,” Artists Disagree

Post Syndicated from Ernesto original https://torrentfreak.com/riaa-says-artists-dont-need-moral-rights-artists-disagree-170521/

Most people who create something like to be credited for their work. Whether you make a video, song, photo, or blog post, it feels ‘right’ to receive recognition.

The right to be credited is part of the so-called “moral rights,” which are baked into many copyright laws around the world, adopted at the international level through the Berne Convention.

However, in the United States, this is not the case. The US didn’t sign the Berne Convention right away and opted out from the “moral rights” provision when they eventually joined it.

Now that the U.S. Copyright Office is looking into ways to improve current copyright law, the issue has been brought to the forefront again. The Government recently launched a consultation to hear the thoughts of various stakeholders, which resulted in several noteworthy contributions.

As it turns out, both the MPAA and RIAA are against the introduction of statutory moral rights for artists. They believe that the current system works well and they fear that it’s impractical and expensive to credit all creators for their contributions.

The MPAA stresses that new moral rights may make it harder for producers to distribute their work and may violate the First Amendment rights of producers, artists, and third parties who wish to use the work of others.

In the movie industry, many employees are not credited for their work. They get paid, but can’t claim any “rights” to the products they create, something the MPAA wants to keep intact.

“Further statutory recognition of the moral rights of attribution and integrity risks upsetting this well-functioning system that has made the United States the unrivaled world leader in motion picture production for over a century,” they stress.

The RIAA has a similar view, although the central argument is somewhat different.

The US record labels say that they do everything they can to generate name recognition for their main artists. However, crediting everyone who’s involved in making a song, such as the writer, is not always a good idea.

“A new statutory attribution right, in addition to being unnecessary, would likely have significant unintended consequences,” the RIAA writes (pdf).

The RIAA explains that the music industry has weathered several dramatic shifts over the past two decades. They argue that the transition from physical to digital music – and later streaming – while being confronted with massive piracy, has taken its toll.

There are signs of improvement now, but if moral rights are extended, the RIAA fears that everything might collapse once again.

“After fifteen years of declining revenues, the recorded music industry outlook is finally showing signs of improvement. This fragile recovery results largely from growing consumer adoption of new streaming models..,” the RIAA writes.

“We urge the Office to avoid legislative proposals that could hamper this nascent recovery by injecting significant additional risk, uncertainty, and complexity into the recorded music business.”

According to the RIAA, it would be costly for streaming services to credit everyone who’s involved in the creative process. In addition, they simply might not have the screen real estate to pull this off.

“If a statutory attribution right suddenly required these services to provide attribution to others involved in the creative process, that would presumably require costly changes to their user interfaces and push them up against the size limitations of their display screens.”

This means less money for the artists and more clutter on the screen, according to the music group. Music fans probably wouldn’t want to see the list of everyone who worked on a song anyway, they claim.

“To continue growing, streaming services must provide a compelling product to consumers. Providing a long list of on-screen attributions would not make for an engaging or useful experience for consumers,” RIAA writes.

The streaming example is just one of the many issues that may arise, in the eyes of the record labels. They also expect problems with tracks that are played on the radio, or in commercials, where full credits are rarely given.

Interestingly, many of the artists the RIAA claims to represent don’t agree with the group’s comments.

Music Creators North America and The Future of Music Coalition, for example, believe that artists should have statutory moral rights. The latter group argues that, currently, small artists are often powerless against large corporations.

“Moral rights would serve to alleviate the powerlessness faced by creators who often must relinquish their copyright to make a living from their work. These creators should still be provided some right of attribution and integrity as these affect a creator’s reputation and ultimately livelihood.”

The Future of Music Coalition disagrees with the paternalistic perspective that the public isn’t interested in detailed information about the creators of music.

“While interest levels may vary, a significant portion of the public has a great interest in understanding who exactly contributed to the creation works of art which they admire,” they write (pdf).

Knowing who’s involved requires attribution, so it’s crucial that this information becomes available, they argue.

“Music enthusiasts revel in the details of music they adore, but when care is not taken to document and preserve that information, those details can often lost over time and eventually unattainable.”

“To argue that the public generally has a homogenously disinterested opinion of creators is insulting both to the public and to creators,” The Future of Music Coalition adds.

The above shows that the rights of artists are clearly not always aligned with the interests of record labels.

Interestingly, the RIAA and MPAA do agree with major tech companies and civil rights groups such as EFF and Public Knowledge. These are also against new moral rights, albeit for different reasons.

It’s now up to the U.S. Copyright Office to determine if change is indeed required, or if everything will remain the same.


Texas Court Orders Temporary ‘Pre-Piracy’ Shutdown of Sports Streaming Sites

Post Syndicated from Ernesto original https://torrentfreak.com/texas-court-orders-temporary-pre-piracy-shutdown-of-sports-streaming-sites-170513/

Copyright holders often complain that they have virtually no means to target pirate sites, especially those run from overseas.

Interestingly, however, in recent months it has become apparent that the US Federal Court system can be used as a prime enforcement tool to shut down pirate domain names.

This is also the path Indian media outfit Times Content Limited (TCL) decided to go down. The company operates the cricket channel Willow TV and owns the US broadcasting rights to the Indian Premier League cricket tournament, which is currently ongoing.

Two weeks ago the company sued several sports streaming sites including smartcric.com and crickethdlive.com. These sites allow users to watch cricket games for free over the Internet, without permission.

To stop this from taking place, the Indian company requested a broad injunction, which the court granted last week.

The preliminary injunction (pdf) orders various third-party providers to stop working with these sites effective immediately, in order to prevent future copyright infringements. This also applies to any new domain names or websites the operators may launch.

“…all service providers whose services will enable or facilitate Defendants’ anticipated infringement are ordered to suspend all services with respect to smartcric.com, smartcric.eu, crickethdlive.com, and crickethdlive.pw, or any other website or domain that is redirected from the Websites and continues to distribute and publicly perform the 2017 IPL,” it reads.

Domain registries and registrars are not the only parties compelled to comply. The order also lists a broad range of intermediaries, including hosting companies, CDN services, advertising outfits, and streaming providers.

Where this order clearly differs from similar injunctions in the US is that it specifically targets “anticipated infringement.” Or put differently, it aims to prevent piracy before it takes place.

From the injunction

What also stands out is that the injunction is temporary: it only applies while the cricket tournament is active. The tournament ends on May 22, after which the parties involved are free to lift or reverse the actions they took.

“For the avoidance of doubt, the Court’s intent is to ensure that Defendants’ Websites be rendered offline, inaccessible and incapable of receiving or displaying audio or video signals between the date of this order and 6:00 am. CDT on May 22, 2017,” the injunction reads.

Over the past few days, several of the seized domain names have been placed in a GoDaddy holding account belonging to the law firm that represents TCL. Per the court order, they will stay there until that date.

That doesn’t mean, however, that the case is over once the tournament ends. In its complaint, TCL also requests damages and other punitive measures, which will have to be decided at a later date.

TorrentFreak spoke to the operator of the streaming sites in question, who says that the lawsuit took him by surprise. After losing his initial domain names he registered several new ones, but these were swiftly taken down as well.

“I moved Smartcric.com to Smartcric.be and Crickethdlive.com to Crickethdlive.pw. However, both domains were suspended as well within a day. Later, I moved Crickethdlive content to Crickethdlive.to however that was suspended yesterday as well,” the operator says.

“It was shocking to see that non-US registries were following the order issued by a US court. It was unfair and unjust to comply with orders of a non-competent court by these registries.”

Interestingly, one of the domain names was registered through the domain name service Njalla, which Pirate Bay co-founder Peter Sunde recently launched. Sunde stresses that the domain was seized beyond their control and that no personal information was shared.

“We’re looking into the case at the moment, but the court took the domain and sent it to a legal firm. We have no way of going above the court and ICANN on this. However, we have of course not sent any information about the customer to anyone,” Sunde says.

The streaming site operator still doubts that he will get his domain names back once the injunction expires. Instead, he’s decided to focus his efforts on finding a domain name that lies beyond the reach of the US courts.

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Julia language for Raspberry Pi

Post Syndicated from Ben Nuttall original https://www.raspberrypi.org/blog/julia-language-raspberry-pi/

Julia is a free and open-source general-purpose programming language made specifically for scientific computing. It combines the ease of writing in high-level languages like Python and Ruby with the technical power of MATLAB and Mathematica and the speed of C. Julia is ideal for university-level scientific programming, and it’s used in research.

Julia language logo

Some time ago Viral Shah, one of the language’s co-creators, got in touch with us at the Raspberry Pi Foundation to say his team was working on a port of Julia to the ARM platform, specifically for the Raspberry Pi. Since then, they’ve done sterling work to add support for ARM. We’re happy to announce that we’ve now added Julia to the Raspbian repository, and that all Raspberry Pi models are supported!

Not only did the Julia team port the language itself to the Pi, but they also added support for GPIO, the Sense HAT and Minecraft. What I find really interesting is that when they came to visit and show us a demo, they took a completely different approach to the Sense HAT than I’d seen before: Simon, one of the Julia developers, started by loading the Julia logo into a matrix within the Jupyter notebook and then displayed it on the Sense HAT LED matrix. He then did some matrix transformations and the Sense HAT showed the effect of these manipulations.
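Here’s a minimal sketch of that idea in Julia – not Simon’s actual notebook code, and with an invented 8x8 0/1 mask standing in for the logo image – showing the kind of matrix transformations involved:

# An invented 8x8 mask standing in for the Julia logo image
logo = [x^2 + y^2 <= 12 ? 1 : 0 for y in -4:3, x in -4:3]

rotl90(logo)      # rotate the image 90 degrees anti-clockwise
transpose(logo)   # reflect it along the leading diagonal

Each transformed matrix can then be pushed to the LED grid with the SenseHat package, so you see the effect of the maths immediately.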

Viral says:

The combination of Julia’s performance and Pi’s hardware unlocks new possibilities. Julia on the Pi will attract new communities and drive applications in universities, research labs and compute modules. Instead of shipping the data elsewhere for advanced analytics, it can simply be processed on the Pi itself in Julia.

Our port to ARM took a while, since we started at a time when LLVM on ARM was not fully mature. We had a bunch of people contributing to it – chipping away for a long time. Yichao did a bunch of the hard work, since he was using it for his experiments. The folks at the Berkeley Race car project also put Julia and JuMP on their self-driving cars, giving a pretty compelling application. We think we will see many more applications.

I organised an Intro to Julia session for the Cambridge Python user group earlier this week, and rather than everyone having to install Julia, Jupyter and all the additional modules on their own laptops, we just set up a room full of Raspberry Pis and prepared an SD card image. This was much easier and also meant we could use the Sense HAT to display output.

Intro to Julia language session at Raspberry Pi Foundation
Getting started with Julia language on Raspbian
Julia language logo on the Sense HAT LED array

Simon kindly led the session, and before long we were using Julia to generate the Mandelbrot fractal and display it on the Sense HAT:

Ben Nuttall on Twitter

@richwareham’s Sense HAT Mandelbrot fractal with @JuliaLanguage at @campython https://t.co/8FK7Vrpwwf

Naturally, one of the attendees, Rich Wareham, progressed to the Julia set – find his code here: gist.github.com/bennuttall/…
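If you’d like to try something similar, here’s a minimal sketch of the Mandelbrot computation in Julia, sized 8x8 to suit the Sense HAT’s LED grid. The escape_time helper is our own illustrative code, not what was shown at the session:

# Count the iterations before z escapes; 0 means we treat c as inside the set
function escape_time(c, maxiter=50)
    z = c
    for i in 1:maxiter
        z = z*z + c
        abs2(z) > 4 && return i
    end
    return 0
end

# Map an 8x8 pixel grid onto the classic viewing window of the complex plane
grid = [escape_time(complex(-2.0 + 3x/7, -1.5 + 3y/7)) for y in 0:7, x in 0:7]

Scale the escape counts to colours and you have an 8x8 image ready for the LED matrix.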

Last year at JuliaCon, there were two talks about Julia on the Pi; both are available to watch on YouTube.

Install Julia on your Raspberry Pi with:

sudo apt update
sudo apt install julia

You can install the Jupyter notebook for Julia with:

sudo apt install julia libzmq3-dev python3-zmq
sudo pip3 install jupyter
julia -e 'Pkg.add("IJulia");'
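Once IJulia is installed, start the notebook server as usual and choose the Julia kernel when you create a new notebook:

jupyter notebook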

And you can easily install extra packages from the Julia console:

Pkg.add("SenseHat")

The Julia team have also created a resources website for getting started with Julia on the Pi: juliaberry.github.io

Julia team visiting Pi Towers

There never was a story of more joy / Than this of Julia and her Raspberry Pi

Many thanks to Viral Shah, Yichao Yu, Tim Besard, Valentin Churavy, Jameson Nash, Tony Kelman, Avik Sengupta and Simon Byrne for their work on the port. We’re all really excited to see what people do with Julia on Raspberry Pi, and we look forward to welcoming Julia programmers to the Raspberry Pi community.

The post Julia language for Raspberry Pi appeared first on Raspberry Pi.

New – USASpending.gov on an Amazon RDS Snapshot

Post Syndicated from Jeff Barr original https://aws.amazon.com/blogs/aws/new-usaspending-gov-on-an-amazon-rds-snapshot/

My colleague Jed Sundwall runs the AWS Public Datasets program. He wrote the guest post below to tell you about an important new dataset that is available as an Amazon RDS Snapshot. In the post, Jed introduces the dataset and shows you how to create an Amazon RDS DB Instance from the snapshot.

Jeff;


I am very excited to announce that, starting today, the entire public USAspending.gov database is available for anyone to copy via Amazon Relational Database Service (RDS). USAspending.gov covers all spending by the federal government, including contracts, grants, loans, employee salaries, and more. The data is available via a PostgreSQL snapshot, which provides bulk access to the entire USAspending.gov database, and is updated nightly. At this time, the database includes all USAspending.gov data for the second quarter of fiscal year 2017; data going back to the year 2000 will be added over the summer. You can learn more about the database and how to access it on its AWS Public Dataset landing page.

Through the AWS Public Datasets program, we work with AWS customers to experiment with ways that the cloud can make data more accessible to more people. Most of our AWS Public Datasets are made available through Amazon S3 because of its tremendous flexibility and ability to scale to serve any volume of any kind of data files. What’s exciting about the USAspending.gov database is that it provides a great example of how Amazon RDS can be used to share an entire relational database quickly and easily. Typically, sharing a relational database requires extract, transform, and load (ETL) processes that require redundant storage capacity, time for data transfer, and often scripts to migrate your database schema from one database engine to another. ETL processes can be so intimidating and cumbersome that they’re effectively impossible for many people to carry out.

By making their data available as a public Amazon RDS snapshot, the team at USAspending.gov has made it easy for anyone to get a copy of their entire production database for their own use within minutes. This will be useful for researchers and businesses who want to work with real data about all US Government spending and quickly combine it with their own data or other data resources.

Deploying the USAspending.gov Database Using the AWS Management Console
Let’s go through the steps involved in deploying the database in your AWS account using the AWS Management Console.

  1. Sign in to the AWS Management Console and select the US East (N. Virginia) region in the menu bar.
  2. Open the Amazon RDS Console and choose Snapshots in the navigation pane.
  3. In the filter for the search bar, select All Public Snapshots and search for 515495268755.
  4. Select the snapshot named arn:aws:rds:us-east-1:515495268755:snapshot:usaspending-db.
  5. Select Snapshot Actions -> Restore Snapshot. Select an instance size, and enter the other details, then click on Restore DB Instance.
  6. You will see that a DB Instance is being created from the snapshot, within your AWS account.
  7. After a few minutes, the status of the instance will change to Available.
  8. You can see the endpoint for your database on the main page, along with other useful info.

Deploying the USAspending.gov Database Using the AWS CLI
You can also install the AWS Command Line Interface (CLI) and use it to create a DB Instance from the snapshot. Here’s a sample command:

$ aws rds restore-db-instance-from-db-snapshot --db-instance-identifier my-test-db-cli \
  --db-snapshot-identifier arn:aws:rds:us-east-1:515495268755:snapshot:usaspending-db \
  --region us-east-1

This will give you an ARN (Amazon Resource Name) that you can use to reference the DB Instance. For example:

$ aws rds describe-db-instances \
  --db-instance-identifier arn:aws:rds:us-east-1:917192695859:db:my-test-db-cli

This command will display the Endpoint.Address that you use to connect to the database.

Connecting to the DB Instance
After following the AWS Management Console or AWS CLI instructions above, you will have access to the full USAspending.gov database within this Amazon RDS DB instance, and you can connect to it using any PostgreSQL client using the following credentials:

  • Username: root
  • Password: password
  • Database: data_store_api

If you use psql, you can access the database using this command:

$ psql -h my-endpoint.rds.amazonaws.com -U root -d data_store_api

You should change the database password after you log in:

ALTER USER "root" WITH ENCRYPTED PASSWORD '{new password}';

If you can’t connect to your instance but think you should be able to, you may need to check your VPC Security Groups and make sure inbound and outbound traffic on the port (usually 5432) is allowed from your IP address.
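For example, a command along these lines opens the PostgreSQL port to a single address – the security group ID and IP address here are placeholders for your own values:

$ aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 5432 --cidr 203.0.113.10/32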

Exploring the Data
The USAspending.gov data is very rich, so it will be hard to do it justice in this blog post, but hopefully these queries will give you an idea of what’s possible. To learn about the contents of the database, please review the USAspending.gov Data Dictionary.

The following query will return the total amount of money the government is obligated to pay for contracts awarded by NASA that include “Mars” or “Martian” in the description of the award:

select sum(total_obligation) from awards, subtier_agency 
  where (awards.description like '% MARTIAN %' OR awards.description like '% MARS %') 
  AND subtier_agency.name = 'National Aeronautics and Space Administration';

As I write this, the result I get for this query is $55,411,025.42. Note that the database is updated nightly and will include more historical data in the coming months, so you may get a different result if you run this query.

Now, here’s the same query, but looking for awards with “Jupiter” or “Jovian” in the description:

select sum(total_obligation) from awards, subtier_agency
  where (awards.description like '%JUPITER%' OR awards.description like '%JOVIAN%') 
  AND subtier_agency.name = 'National Aeronautics and Space Administration';

The result I get is $14,766,392.96.

Questions & Comments
I’m looking forward to seeing what people can do with this data. If you have any questions about the data, please create an issue on the USAspending.gov API’s issue tracker on GitHub.

— Jed

Submission deadline for LPC refereed track proposals extended

Post Syndicated from ris original https://lwn.net/Articles/722186/rss

The deadline for submitting refereed track proposals for the 2017 Linux Plumbers Conference (LPC) has been extended until May 13. “The refereed track will have 50-minute presentations on a specific aspect of Linux “plumbing” (e.g. core libraries, media creation/playback, display managers, init systems, kernel APIs/ABIs, etc.) that are chosen by the LPC committee to be given during all three days of the conference.” LPC will be held September 13-15 in Los Angeles, CA.

US Court Orders Registries to Seize Control of ‘Pirate’ Domains

Post Syndicated from Andy original https://torrentfreak.com/us-court-orders-registries-seize-control-of-pirate-domains-170508/

ABS-CBN is the largest media and entertainment company in the Philippines and it is extremely aggressive when it comes to protecting its intellectual property. In fact, it now targets way more ‘pirate’ sites in the United States than the MPAA.

One of the tactics employed by ABS-CBN is targeting the domains of ‘pirate’ sites. On several occasions, the TV outfit has found courts willing to step in with ex parte orders, based on allegations of copyright and trademark infringement.

The United States District Court for the Southern District of Florida is a popular venue for ABS-CBN and in April the company approached the Court again, this time with allegations against 19 streaming platforms (list below).

“Through their websites operating under the Subject Domain Names, Defendants advertise and hold out to the public that they have ABS-CBN’s copyrighted content and perform ABS-CBN’s copyrighted content over the Internet, in order to illegally profit from ABS-CBN’s intellectual property, without ABS-CBN’s consent,” the company wrote in its complaint.

“Defendants’ entire Internet-based website businesses amount to nothing more than illegal operations established and operated in order to infringe the intellectual property rights of ABS-CBN and others.”

Claiming direct and contributory copyright infringement, trademark infringement and unfair competition, among other things, ABS-CBN demanded maximum statutory damages of $150,000 per infringement, plus injunctive relief to avoid future infringement. Following an ex parte process, the Court responded favorably.

In an order granting a preliminary injunction, the Florida district court agreed that the sites present an ongoing threat to ABS-CBN’s business and it’s likely they’ll continue to deceive the public by illegally using the company’s trademarks and content without a license.

Judge Robert N. Scola Jr. restrained everyone connected to the sites from “advertising, promoting, copying, broadcasting, publicly performing, and/or distributing” any of ABS-CBN’s content and/or abusing its trademarks.

While this is fairly standard for this kind of process, it was also remarkably easy for ABS-CBN to deprive the sites of their domains.

In his order, the judge directed the domain registrars of the ‘pirate’ sites to transfer the domains to a holding account operated by a new registrar of ABS-CBN’s choosing, pending the outcome of the case. If they fail to do so within a single business day, the TLD (top-level domain) registries are instructed to do it for them.

While the case is underway, each domain is ordered to be re-directed away from the pirate sites and towards a new URL displaying copies of the complaint and subsequent orders.

“After the New Registrar has effected this change, the Subject Domain Names shall be placed on lock status, preventing the modification or deletion of the domains by the New Registrar or the Defendants,” the order reads.

While 19 domains are listed, any other domains “properly brought to the Court’s attention” can be seized in the same manner, the order notes.

Since the ‘pirate’ site operators are unlikely to defend the action, the domains are almost certainly out of reach already. ABS-CBN says it now wants $40m in damages, so arguing over the fate of a few domains is probably low on the operators’ agenda.

“We will continue to shut down these pirate sites to protect the public from harm,” said ABS-CBN assistant vice president and head of global anti-piracy Elisha Lawrence.

“There is only one genuine ABS-CBN internet subscription service that is safe for our fans to use and that is TFC and TFC.tv.”

The affected domains

cinesilip.net
pinoychanneltv.me
pinoytambayantv.me
pinoytambayanreplay.net
drembed.com
embeds.me
fullpinoymovies.com
lambingan.ph
magtvna.com
pinoye.com
pinoyteleserye.org
pinoytvnetwork.net
pinoytopmovies.info
teleserye.me
watchpinaytv.com
wildpinoy.net
pinoy-hd.com
pinoytvreplay.ws
pinoychannel.co
wowpinoytambayan.ws
pinoytelebyuwers.se

Source: TF, for the latest info on copyright, file-sharing, torrent sites and ANONYMOUS VPN services.

Pioneers gives you squad goals

Post Syndicated from Olympia Brown original https://www.raspberrypi.org/blog/pioneers-gives-you-squad-goals/

We’re two weeks into the second cycle of Pioneers, our programme to give teenagers a taste of digital making. Teenagers make amazing, ridiculous, awesome things when they are challenged to unleash their creativity using technology. In the first cycle, we had everything from a disco pen to a crotch-soaking water trap. Families and friends can take part, as well as clubs and schools: we call these informal Pioneers teams squads, and we’re hoping that lots will join this second round of the competition.

The creativity on display comes from allowing teenagers to approach a problem from whatever angle they choose. Pioneers has been designed so that it’s flexible and people can take part however they like. As well as making sure the challenge we set is as open as possible, we’re also pretty chilled about how teams participate: when and where the making gets done.

A relaxed-looking polar bear.

We are as chilled as a polar bear in a bucket hat

We’re delighted to see that lots of teenagers have been getting together with their mates, hanging out, and working out how they can best freak out their mum.

Pioneers challenge 1

Make them laugh…

Some of the groups told us that they met at a regular time, and while there was a lot of chat, they’d also find some time to make some cool stuff. Others had some intense sessions over a couple of weekends (certain team members may or may not have been involved with extra bits of tinkering between sessions).

Getting involved in Pioneers

If you’ve got some teenagers lying about the house, why not see if they’d like to challenge themselves to make something linked to the outdoors? We’ve got some starter projects to give them a bit of inspiration, but they can respond to the challenge however they like, as long as they are using tech.

Pioneers: Make it Outdoors

Our challenge for this round of Pioneers: get outdoors!

If you’re mentoring one of these informal Pioneers squads, you are probably mostly there to remind them that they might want to meet up, and to prompt them to make their video in time for the deadline. You don’t need to be a tech expert in order to be a mentor, but if you’d like a confidence-booster, you could watch some of our videos to level up your skills. And if you do get stuck on something technical, you can ask for help on the Raspberry Pi forums.

For more information about working as a squad, or about mentoring one, check out our Pioneers page. We can’t wait to see what you come up with!

The post Pioneers gives you squad goals appeared first on Raspberry Pi.