Thursday, March 5, 2020

Australian police lied about the use of AI facial recognition software

There are plenty of facts Australians should know about the place in which they live, but for the purpose of this post let's start off with these two:

- Australia is a colony (subject to a different set of rules from a country on the global stage)

- It is also a police state.

With the police state agenda in full swing, those in government act against the serfs with NO ACCOUNTABILITY as the name of the game.

At the federal level, the social security (alleged) debt fraud committed against welfare recipients was dubbed 'robo-debt', insinuating that it was carried out by a machine and that no 'legal person' is responsible for the action.

'Persons' have died as a result of false debts claimed against them, but what's worse is that there is no 'remedy' (compensation) in sight for their family members.

At a state level, for example in Victoria, property seizures are carried out unlawfully by the so-called sheriff, without the warrants required to take property lawfully.

This action is supported by all those in power when it comes to a serf losing his or her possessions.

These people systematically destroy others' lives with zero repercussions, or even accountability IF ever brought before the courts.

See the following article from 4 Mar 2020 by The Conversation, under the headline:

Australian police are using the Clearview AI facial recognition system with no accountability



Australian police agencies are reportedly using a private, unaccountable facial recognition service that combines machine learning and wide-ranging data-gathering practices to identify members of the public from online photographs.

The service, Clearview AI, is like a reverse image search for faces. You upload an image of someone’s face and Clearview searches its database to find other images that contain the same face. It also tells you where the image was found, which might help you determine the name and other information about the person in the picture.

Clearview AI built this system by collecting several billion publicly available images from the web, including from social media sites such as Facebook and YouTube. Then they used machine learning to make a biometric template for each face and match those templates to the online sources of the images.
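The article's description of biometric templates can be sketched in code. The toy example below (entirely illustrative; it is not Clearview AI's actual method, and `best_match`, the template vectors and the URLs are all hypothetical) shows the general idea: each face is reduced to a numeric vector by a trained model, and a probe face is matched by finding the stored template with the highest cosine similarity, along with the source URL of the original image.

```python
# Illustrative sketch only: matching biometric "templates" (embedding
# vectors) the way the article describes. In a real system the vectors
# would come from a trained neural network; here they are made-up numbers.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery):
    """Return (source_url, score) of the most similar stored template."""
    return max(
        ((url, cosine_similarity(probe, tmpl)) for url, tmpl in gallery.items()),
        key=lambda pair: pair[1],
    )

# Toy 4-dimensional templates; real systems use hundreds of dimensions.
gallery = {
    "https://example.com/photo_a.jpg": [0.9, 0.1, 0.3, 0.2],
    "https://example.com/photo_b.jpg": [0.1, 0.8, 0.2, 0.7],
}
probe = [0.88, 0.12, 0.28, 0.22]  # a new face, close to photo_a's template
url, score = best_match(probe, gallery)
print(url, round(score, 3))
```

The point of returning the source URL, as the article notes, is that the surrounding web page often reveals the person's name and other details.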

It was revealed in January that hundreds of US law enforcement agencies are using Clearview AI, starting a storm of discussion about the system’s privacy implications and the legality of the web-scraping used to build the database.

Australian police agencies initially denied they were using the service. The denial held until a list of Clearview AI’s customers was stolen and disseminated, revealing users from the Australian Federal Police as well as the state police in Queensland, Victoria and South Australia.

Lack of accountability

This development is particularly concerning as the Department of Home Affairs, which oversees the federal police, is seeking to increase the use of facial recognition and other biometric identity systems. (An attempt to introduce new legislation was knocked back last year for not being adequately transparent or privacy-protecting.)

Gaining trust in the proper use of biometric surveillance technology ought to be important for Home Affairs. And being deceptive about the use of these tools is a bad look.




But the lack of accountability may go beyond poor decisions at the top. It may be that management at law enforcement agencies did not know their employees were using Clearview AI. The company offers free trials to “active law enforcement personnel”, but it’s unclear how they verify this beyond requiring a government email address.

Why aren’t law enforcement agencies enforcing rules about which surveillance tools officers can use? Why aren’t their internal accountability mechanisms working?

There are also very real concerns around security when using Clearview AI. It monitors and logs every search, and we know it has already had one data breach. If police are going to use powerful surveillance technologies, there must be systems in place for ensuring those technological tools do what they say they do, and in a secure and accountable way.

Is it even accurate?

Relatively little is known about how the Clearview AI system actually works. To be accountable, a technology used by law enforcement should be tested by a standards body to ensure it is fit for purpose.

Clearview AI, on the other hand, has had its own testing done – and as a result its developers claim it is 100% accurate.

That report does not represent the type of testing that an entity seeking to produce an accountable system would undertake. In the US at least, there are agencies like the National Institute of Standards and Technology that do precisely that kind of accuracy testing. There are also many qualified researchers in universities and labs who could properly evaluate the system.

Instead, Clearview AI gave the task to a trio composed of a retired judge turned private attorney, an urban policy analyst who wrote some open source software in the 1990s, and a former computer science professor who is now a Silicon Valley entrepreneur. There is no discussion of why those individuals were chosen.

The method used to test the system also leaves a lot to be desired. Clearview AI based their testing on a test by the American Civil Liberties Union of Amazon’s Rekognition image analysis tool.

However, the ACLU test was a media stunt. The ACLU ran headshots of 28 members of Congress against a mugshot database. None of the politicians were in the database, meaning any match returned would be an error. However, the test only required the system to be 80% certain of its results, making it quite likely to return a match.




The Clearview AI test also used headshots of politicians taken from the web (front-on, nicely framed, well-lit images), but ran them across their database of several billion images, which did include those politicians.

The hits returned by the system were then confirmed visually by the three report authors as 100% accurate. But what does 100% mean here?

The report states that the first two hits provided by the system were accurate. But we don’t know how many other hits there were, or at what point they stopped being accurate. Politicians have lots of smiling headshots online, so finding two matching images should not be difficult.
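The gap the article points at is the difference between accuracy on the first few results and accuracy overall, a distinction usually measured as precision at k. The sketch below is hypothetical (the `precision_at_k` helper and the result list are invented for illustration; this is not data from the Clearview report): a system can be "100% accurate" on its first two hits while most of its full result list is wrong.

```python
# Sketch of why "the first two hits were accurate" says little on its own:
# precision@k can be perfect for small k while overall precision is poor.
def precision_at_k(hits, k):
    """hits: list of booleans, True where the i-th returned image was correct."""
    top = hits[:k]
    return sum(top) / len(top)

hits = [True, True, False, False, False, False]  # hypothetical result list
p2 = precision_at_k(hits, 2)  # 1.0 -- "100% accurate" on the first two
p6 = precision_at_k(hits, 6)  # only one third correct over the full list
print(p2, round(p6, 2))
```

A report aiming at accountability would state precision over the full ranked list (and at several cut-offs), not just the top of it.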

What’s more, law enforcement agencies are unlikely to be working with nice clean headshots. Poor-quality images taken from strange angles – the kind you get from surveillance or CCTV cameras – would be more like what law enforcement agencies are actually using.

Despite these and other criticisms, Clearview AI CEO Hoan Ton-That stands by the testing, telling Buzzfeed News he believes it is diligent and thorough.

More understanding and accountability are needed

The Clearview AI case shows there is not enough understanding or accountability around how this and other software tools work in law enforcement. Nor do we know enough about the company selling it and their security measures, nor about who in law enforcement is using it or under what conditions.

Beyond the ethical arguments around facial recognition, Clearview AI reveals Australian law enforcement agencies have such limited technical and organisational accountability that we should be questioning their competency even to evaluate, let alone use, this kind of technology.



Wednesday, March 4, 2020

Coronavirus update: Aussies wipe toilet paper from shelves



During a global viral pandemic, the residents of the colony called Australia have decided that toilet paper is the most important item to obtain, so much so that there are enough retarded people to bring the industry down to its knees, or even its ankles if you’re lucky enough.

In these people’s limited mental capacity it’s most important to have a clean sphincter prior to being on the coroner’s table.

If you had shares in the corporations that manufacture toilet paper you’d be saying “that’s the shit” with reference to an excellent business decision.

The authorities are literally laughing at the (dumb) Aussies' priorities with regard to the pandemic.

Hang on a minute, maybe these dumbos are not so stupid, ‘cause they know they’ll be in deeper shit once this hits home...

Only time will tell, where you can sit on it...



1). Face of dumb Aussie blurred, so that his children will not suffer ridicule and insensitive remarks aimed at the deadbeat dad, or

2). Face blurred, so that the retarded gene pool can continue to breed, where some lucky mum-to-be will hit the jackpot with this muppet.


Post Scriptum: Coronavirus does NOT cause diarrhoea!

Sunday, March 1, 2020

Government enacts law to capture real-time data


Centrelink has caused many people harm, and even death in certain circumstances, under an action dubbed robo-debt, a name that alludes to no ‘legal person’ taking responsibility for the action.

The reality is that the so-called government was warned that this action was unlawful and should not be taken, but Centrelink carried it out anyway against vulnerable people dependent on government financial support.

In a ‘normally’ functioning country, one would think that after a federal court ruled this action unlawful, the business would refund the cash stolen from people.

Since Australia is not a country but factually (still) a colony, no unlawfully taken cash will be refunded to the serfs; rather, they must mount a class action lawsuit to recover their costs, and since lawyers (etc.) do not work for ‘free’, it’s another win for [a part of] the system.

As a result of the Centrelink fraud, the government enacted a new law, which people may think is for the benefit of those harmed, but it is instead another ‘win’ for the authorities, who obtain more data and control over the movements of the general population, not just those on benefits.

"That means a person’s income data from the Australian Taxation Office will be automatically uploaded to the government system so bureaucrats can double-check figures."


With every new law enacted, your rights, privileges, benefits and services are being eroded.


Will you ever wake up and do something about it?