ITGS Online

‘Hanging out the dirty linen’: delving into the ethics of IT’s role in society.

ITGS Online (weekly)

  • “Ever get the feeling someone is looking over your shoulder at your phone? Well, you might not have to worry about that in the future: Google’s researchers have developed an AI tool that can spot when someone is sneaking a peek at your screen.”

    Tags: ITGS, ai, google, privacy, security

  • “Facebook has detailed the steps it’s taking to get help for people who need it. Which involves using artificial intelligence to ‘detect posts or live videos where someone might be expressing thoughts of suicide,’ identifying appropriate first responders, and then employing more people to ‘review reports of suicide or self harm.’

    The social network has been testing this system in the U.S. for the last month, and ‘worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts.’ In some cases the local authorities were notified in order to help.”

    Tags: ITGS, facebook, ai, privacy, algorithm

  • “The world’s spookiest philosopher is Nick Bostrom, a thin, soft-spoken Swede. Of all the people worried about runaway artificial intelligence, and Killer Robots, and the possibility of a technological doomsday, Bostrom conjures the most extreme scenarios. In his mind, human extinction could be just the beginning.”

    Tags: ITGS, singularity, ai, peopleandmachines, control

  • “Brain scans, however, are quite telling, especially when analyzed with an algorithm, Brent and his colleagues discovered. ‘We’re trying to figure out what’s going on in somebody’s brain when they’re thinking about suicide,’ says Brent.

    These scans, taken using fMRI, or functional magnetic resonance imaging, show that strong words such as ‘death,’ ‘trouble,’ ‘carefree,’ and ‘praise’ trigger different patterns of brain activity in people who are suicidal, compared with people who are not. That means that people at risk of suicide think about those concepts differently than everyone else—evidenced by the levels and patterns of brain activity, or neural signatures.”

    Tags: ITGS, peopleandmachines, ai, algorithms, suicide, anonymity, privacy

Posted from Diigo. The rest of the ITGS Online group’s favorite links are here.
