Robot Police

Will you obey robots?

  • I will bow to tyranny. (Votes: 0, 0.0%)

Total voters: 3

Will E Worm

Conspiracy...
Real-Life 'RoboCop' May Be Coming to a Street Near You


Robot security guards are staples of most futuristic sci-fi movies, video games and TV shows. They exist in real life as well, though the sight of a security robot patrolling the streets is far from common.

The K5 Beta, a just-unveiled prototype from California-based company Knightscope, might change all that.

A bullet-shaped robot that stands about 5 feet (1.5 meters) tall, the K5 looks a lot like the droid R2-D2 from the "Star Wars" films. The K5 might not have all the features of its counterpart from a galaxy far, far away. But what it does have, Knightscope representatives said, is an onboard sensor array that can see, hear, touch and smell its surroundings.


The K5 also combines that sensory data with "existing raw business, government and crowdsourced social data sets," and subsequently "assigns an alert level that determines when a business, community or authorities should be notified of a concern," according to Knightscope's press release.

In other words, the K5 is supposed to combine its observations with public data on the social and financial statistics of its surroundings, and use the information to predict if, when and where a crime is likely to occur.
If the K5 does detect that an "incident" is occurring, it makes all of its sensor data publicly available via Wi-Fi, "to allow the entire community to review the information transparently and contribute additional relevant, real-time information," Knightscope representatives said.
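To make that description concrete, here is a minimal sketch of how a threshold-based alert level could be computed from combined sensor readings and public data. Every field name, weight and threshold below is invented for illustration; Knightscope has not published how its actual scoring works.

# Hypothetical illustration only: not Knightscope's actual model.
# A score is built up from sensor readings and local context, then mapped to an alert level.

def alert_level(sensor, area_stats):
    """Return 'none', 'notify_business' or 'notify_authorities' for one set of readings."""
    score = 0.0

    # On-board sensor signals (field names are assumptions for this example)
    if sensor.get("loud_noise_db", 0) > 90:       # e.g. shouting or breaking glass
        score += 2.0
    if sensor.get("people_detected", 0) > 10:     # unusually large crowd for the spot
        score += 1.0
    if sensor.get("after_hours", False):          # activity when the site should be empty
        score += 1.5

    # Public / crowdsourced context for the area (again, invented)
    score += 0.5 * area_stats.get("recent_incident_reports", 0)

    # Map the combined score to who, if anyone, gets notified
    if score >= 4.0:
        return "notify_authorities"
    if score >= 2.0:
        return "notify_business"
    return "none"

print(alert_level({"loud_noise_db": 95, "after_hours": True},
                  {"recent_incident_reports": 2}))    # -> notify_authorities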

Crowdsourced crime investigations have their pros and cons, as anyone who followed the immediate aftermath of the Boston Marathon bombing knows well. Using Twitter, Boston residents were able to inform others about the events surrounding the attack and help identify the alleged bomber, but not before an innocent person was publicly named as a suspect. Presumably, the K5's software will be able to sift through crowdsourced data to find the most valuable information.

The company was founded in response to another deadly incident that gained national attention: the shooting massacre at Sandy Hook Elementary School in Newtown, Conn., in December 2012.

"Our long-term goal is to show that, with a combination of hardware, software and community involvement, we will, together, be able to cut crime," Knightscope CEO William Santana Li said in a statement.

Many people aren't comfortable with the idea of human police officers carrying video cameras or amassing a private database of footage, so it's unlikely that the public would approve of camera-toting robots. The K5's ability to make its collected data publicly available is intended, in part, to assuage those concerns, as Li told The New York Times. The idea is that people will feel more comfortable if the data the K5 collects is in everyone's hands, and not just the police or a private group.

What's more, Knightscope representatives said the K5 will be able to save money. The robot operates at a cost of about $6.25 per hour, the New York Times reports — more than a dollar below the U.S. minimum hourly wage.

The K5 has just reached beta testing, with no word on when it might become commercially available. However, "initial test deployments" are scheduled for 2014, so the sight of a robotic security guard rolling through a school, mall, museum or city street might not be that far off.


Article

Rey C.

Racing is life... anything else is just waiting.
While I think (know) that robotics and automated data and function processing are among the most fascinating areas of modern manufacturing and business (and the military), I agree that we have to take a closer look at what sort of data is being collected and what happens to that data. I can see the benefits of the K5 and devices like it (lower cost and likely greater efficiency than with human security guards), but the thing I saw in their video that concerned me was this: "Behavioral Analysis."

Several companies (Google, Apple, iRobot and others) are already working on something called predictive analytics. Based on inputs gained from audio and visual observations, they're trying to get devices to predict what you want to do or where you want to go. In other words, based on past habits and your present location, your iPhone might turn the heat up to 72 degrees in your house when you are within 10 miles and traveling in the direction of your house, and turn it down to, say, 64 degrees once you have passed the 10-mile mark and are traveling away from your house. This seems to follow that same overall concept, only it's focused on whether or not you have a crime in mind, I guess.

Where I have a concern is its ability to cause a problem for an innocent person whose actions might be misunderstood by the device. Can it differentiate between a couple having a "normal" argument while leaving a store and a guy who is in the process of assaulting an unknown woman in the parking lot of that same store? Would its default setting cause it to automatically call the police if it's not sure? Would it be better/worse, smarter/dumber than the average human security guard??? :dunno: I guess, like any computer, it'll follow the old GIGO (garbage in/garbage out) rule.
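The thermostat example above could be sketched roughly like this; the coordinates, radius, temperatures and function names are all made up for illustration, and no real phone or thermostat API is involved.

# Rough sketch of the geofence/heading idea described above; all numbers are invented.
import math

HOME = (40.7128, -74.0060)   # hypothetical home location (lat, lon)
RADIUS_MILES = 10

def miles_between(a, b):
    """Approximate great-circle distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(h))

def target_temp(position, heading_toward_home):
    """Warm the house when close and inbound, save energy otherwise."""
    inside_fence = miles_between(position, HOME) <= RADIUS_MILES
    if inside_fence and heading_toward_home:
        return 72   # pre-heat before arrival
    return 64       # setback while away or heading out

print(target_temp((40.80, -74.00), heading_toward_home=True))   # close and inbound -> 72
print(target_temp((41.20, -74.50), heading_toward_home=False))  # far and outbound  -> 64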

Here's the actual video promo for this thing. I don't know enough about what it can do, can't do or might be able to do to say how I feel about it right now. I don't think it's set to be any sort of "robo cop", but more of an unarmed security guard/mall cop. Interesting concept...

 

Jagger69

Three lullabies in an ancient tongue
You'll obey them. You won't have a choice, creep. (even with Polish subtitles, you ain't got a chance) :eek:

 
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

(Isaac Asimov, "Runaround")

As long as these three laws are respected, robots will do fine. But if these rules are broken and we let them be broken, then we're doomed.

This is why I'm quite concerned about drones: drones remotely piloted by humans are fine, and so are drones that are programmed to hit a specific target. But drones given some autonomy to patrol and fire on targets that weren't identified by humans would be quite dangerous...
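For what it's worth, the precedence built into the quoted laws amounts to a simple priority check. A toy sketch of that ordering, with invented fields and not any real safety framework, might look like this:

# Toy illustration of the priority ordering in the quoted laws; invented fields only.

def action_permitted(action):
    """Check a proposed action against the three laws, highest priority first."""
    # First Law: never harm a human, by action or by inaction.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False
    # Second Law: obey human orders, unless obeying would violate the First Law.
    if action.get("disobeys_human_order") and not action.get("order_would_harm_human"):
        return False
    # Third Law: self-preservation counts only when the first two laws are satisfied.
    if action.get("endangers_self") and not action.get("required_by_higher_law"):
        return False
    return True

print(action_permitted({"disobeys_human_order": True}))                  # False
print(action_permitted({"disobeys_human_order": True,
                        "order_would_harm_human": True}))                # True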
 

feller469

Moving to a trailer in Fife, AL.
Aren't the choices for the poll similar to "Would you fuck an obese, AIDS-infected meth-addict woman?" or "Are you gay?"
 