For the Greater Good

“With great power comes great responsibility!”

This is a saying we have been taught rather extensively. Whether it comes from our elders, textbooks, or movies, the message has been the same: having power without knowing how to use it properly can be catastrophic, and there is no dearth of examples of that in history. There are also situations where we get to decide how much power something is allowed to have, and when it came to technology, we were hugely generous. After all, it wasn't anything like we had seen before. The promises weren't the only thing that enticed us into investing in it, though. Even if it wasn't quite apparent at the time, somewhere we knew it had the capability to deliver far more than initially envisioned. So, all our efforts began to be directed towards this phenomenon's development, and as planned, it soon grew into a force to be reckoned with.

Nevertheless, with every new milestone, the dazzling world of technology was also making us more and more reliant on it, taking over every aspect of our lives. This began to fuel a debate over whether the technology realm deserves this much trust from us. The debate goes on, and in all honesty, it might never end, but Apple has just dealt a strong blow to the technology doubters.

On Monday, Apple announced that it will be monitoring its users' photo libraries for images that suggest possible child abuse. However, the company has insisted that it won't be infiltrating users' devices; instead, the monitoring will be done only on files stored in iCloud. Another fact worth noting is that the newly developed monitoring system issues an alert only when a certain threshold of flagged images is exceeded, and it is only at that point that a human review begins.
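
To make the threshold behavior described above concrete, here is a minimal, purely illustrative sketch of threshold-based flagging in Python. The fingerprint function, the hash database, the review threshold value, and all names are assumptions for illustration only; Apple has not published its implementation in these terms, and its real system reportedly relies on perceptual hashing and cryptographic threshold techniques rather than the plain hashing shown here.

```python
import hashlib
from dataclasses import dataclass

# Illustrative stand-ins (assumptions, not Apple's actual values or data):
KNOWN_HASH_DATABASE: set[str] = set()  # fingerprints of known abusive images
REVIEW_THRESHOLD = 30                  # assumed number of matches before human review


@dataclass
class CloudAccount:
    account_id: str
    match_count: int = 0


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()


def escalate_for_human_review(account: CloudAccount) -> None:
    """Placeholder for handing the account over to human reviewers."""
    print(f"Account {account.account_id} queued for human review")


def check_upload(account: CloudAccount, image_bytes: bytes) -> None:
    """Compare an uploaded image against the known-hash set and count matches.

    No single match is reported on its own; only when the aggregate count
    crosses the threshold is the account escalated for human review.
    """
    if fingerprint(image_bytes) in KNOWN_HASH_DATABASE:
        account.match_count += 1
    if account.match_count >= REVIEW_THRESHOLD:
        escalate_for_human_review(account)
```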

At the moment, the system focuses solely on image files, but Apple has already made known its plans to expand into videos in the near future. The company has acknowledged that the system currently has no tangible solution for hackers deliberately trying to plant child abuse images in someone's iCloud account. However, Apple has reassured users that such attempts, if detected, will be taken into consideration. The system is set to roll out later this year.
