Behind the Headlines – World Book Student

Posts Tagged ‘artificial intelligence’

LGBTQ+ Pride Month: Turing Honored on British Bill

Wednesday, June 23rd, 2021
The new polymer bank note, shown in an image provided by the Bank of England, was unveiled to the public nearly two years after officials first announced it would honor Turing. Credit: Bank of England

June is LGBTQ+ Pride Month. All month long, Behind the Headlines will feature lesbian, gay, bisexual, transgender, and queer or questioning pioneers in a variety of areas.

On what would have been his 109th birthday, the English mathematician, computer pioneer, and codebreaker Alan Turing is getting a very special gift: a 50-pound (£50) note. It’s not just any old £50 bank note, however. This bank note—and millions of others—will have his face on it.

Following a public nomination process in 2019, Turing was selected to be the new face of the £50 note. His image will replace images of the engineer and scientist James Watt and the industrialist and entrepreneur Matthew Boulton. An image of Elizabeth II will remain on the obverse side of the note, or the side that bears the principal design.

Turing was recognized not only for his important contributions to the development of electronic digital computers, but also for the discrimination he faced as a gay man. After World War II (1939-1945), Turing was prosecuted for his relationship with a man. He was given the choice of either imprisonment or probation with the condition of undergoing female hormone treatment. On June 7, 1954, at the age of 41, Turing took his own life.

In 2009, the British government issued an apology. Four years later, Turing was given a royal pardon, releasing him from the legal penalties for his conviction. In 2017, the Turing Law was passed, pardoning thousands of gay and bisexual men who had been convicted of sexual offenses under laws that have since been repealed.

Alan Turing (far right) was an English mathematician and computer pioneer. He made important contributions to the development of electronic digital computers.
Credit: Heritage-Images/Science Museum, London

Turing was born on June 23, 1912, in London. He studied mathematics at Cambridge University and Princeton University. In 1936, he developed a hypothetical computing machine—now called the Turing machine—that could, in principle, perform any calculation. The device had a long tape divided into squares on which symbols could be written or read. The tape head of the machine could move to the left or to the right. The machine also had a table to tell it the order in which to carry out operations. The Turing machine became an important model for determining what tasks a computer could perform. During World War II, Turing helped crack German codes.
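The workings of such a machine can be sketched in a few lines of modern code. The sketch below is only a hypothetical illustration of the idea described above; the state names, the table format, and the bit-flipping rule are invented for the example and are not Turing’s original notation.

```python
# Minimal sketch of a Turing machine (illustrative example only).
# The "table" maps (state, symbol read) -> (symbol to write, move, next state),
# playing the role of the table of operations described above.

def run_turing_machine(tape, table, state="start"):
    squares = dict(enumerate(tape))  # the tape's squares, indexed by position
    head = 0                         # the tape head starts at the leftmost square
    while state != "halt":
        symbol = squares.get(head, " ")            # read the current square
        new_symbol, move, state = table[(state, symbol)]
        squares[head] = new_symbol                 # write a symbol to the square
        head += 1 if move == "R" else -1           # move the head right or left
    return "".join(squares[i] for i in sorted(squares)).strip()

# Table for a simple machine that flips every 0 to 1 and every 1 to 0,
# then halts when it reads a blank square.
table = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_turing_machine("1011", table))  # prints 0100
```

Even a toy machine like this one shows why the model was so useful: any step-by-step procedure can be expressed as a table of this kind.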

After the war, he worked on a project to build the first British electronic digital computer. In 1950, he proposed a test for determining if machines might be said to “think.” This test, now called the Turing test, is often mentioned in discussions of artificial intelligence (AI).

Tags: alan turing, artificial intelligence, codebreaking, computer, lgbtq+ pride month, lgbtq+ rights, mathematics, world war ii
Posted in Current Events

A. M. Turing Award

Friday, April 19th, 2019

Last month, on March 27, the Association for Computing Machinery (ACM) in New York City named the computer scientists Yoshua Bengio, Geoffrey Hinton, and Yann LeCun as the recipients of the annual A. M. Turing Award. Working both independently and together, Bengio, Hinton, and LeCun are considered fathers of the “Deep Learning Revolution” that has helped usher in a new era of artificial intelligence (AI).

Alan M. Turing (at right) was an English mathematician and computer pioneer. He made important contributions to the development of electronic digital computers.
Credit: Heritage-Images/Science Museum, London

The A. M. Turing Award is given to one or more individuals each year in recognition of contributions of lasting importance in the field of computing. Bengio, Hinton, and LeCun were honored for their conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing. Neural networks are sets of algorithms, modeled loosely on the human brain, that are designed to recognize patterns. Neural networks can thus “learn” to “see,” “hear,” and “think” by differentiating among patterns. The networks are essential parts of driverless car technologies, automatic language translation programs, and automated personal assistants such as Alexa or Siri. They are also used in various forms of robotics as well as in automated stock trading and game-playing programs.
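As a rough illustration of how a network can “learn” a pattern, the toy sketch below trains a single artificial neuron (a perceptron, one of the earliest neural network units) to recognize one very simple pattern. The data, learning rate, and number of training passes are invented for the example; real deep networks stack many layers containing thousands of such units.

```python
# Toy example: one artificial neuron learning to recognize the pattern
# "both inputs are on" (logical AND). Illustrative only.

def train_neuron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # connection weights, adjusted during learning
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0  # neuron's guess
            err = target - out                               # how wrong it was
            w[0] += lr * err * x1                            # nudge the weights
            w[1] += lr * err * x2                            # toward the target
            b += lr * err
    return w, b

# The pattern to learn: output 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)
for (x1, x2), target in data:
    pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
    print((x1, x2), pred)  # the trained neuron matches the target pattern
```

The “deep learning” breakthroughs recognized by the award came from stacking many layers of units like this one and finding practical ways to train them all at once.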

Artificial neural networks were introduced in the 1980’s, but by the early 2000’s, Bengio, Hinton, and LeCun were among a small group who remained committed to furthering the technology. Their work—and the work of many others—has contributed to the recent boom in “deep learning” technology. Bengio is a professor at the University of Montreal and scientific director at Mila, the Quebec Artificial Intelligence Institute. Hinton is a vice president and engineering fellow at Google, chief scientific adviser of the Vector Institute in Toronto, and professor emeritus at the University of Toronto. LeCun is a professor at New York University and vice president and chief AI scientist at Facebook.

The A. M. Turing Award is named after Alan Mathison Turing, a British mathematician and computer pioneer. Turing made key contributions to the development of electronic computers, including his work helping to build the first British electronic digital computer. In 1950, he proposed a test for determining if machines might be said to “think.” This test, now called the Turing test, is still central to discussions of artificial intelligence.

The first Turing Award was given to the American computer scientist Alan J. Perlis in 1966 for his role in developing influential computer-programming techniques. Since then, an award has been given every year. As of 2014, the award includes a $1 million cash prize. Bengio, Hinton, and LeCun will formally receive the A. M. Turing Award at ACM’s annual awards banquet on June 15, 2019, in San Francisco, California.

Tags: a. m. turing, a.m. turing award, artificial intelligence, Association for Computing Machinery, deep learning revolution, Geoffrey Hinton, neural networks, Yann LeCun, Yoshua Bengio
Posted in Arts & Entertainment, Business & Industry, Current Events, Education, History, People, Science, Technology

BCI: Mind over Movement

Thursday, July 27th, 2017

In the 1940’s, people connected wires, punched cards, and flipped switches to give instructions to early computers. Over time, the command line interface (CLI) was developed, allowing users to instruct a computer simply by typing text commands. By the mid-1980’s, the CLI had begun to give way to the graphical user interface (GUI), a visually intuitive interface that is still in popular use today. Through GUI, users interact with windows and icons to command a computer. People around the world use GUI by pressing keys, clicking on computer mice, speaking words, and swiping screens.

A soldier tests brain-computer interface (BCI) technology for the U.S. Army. Credit: U.S. Army

Today, a new interface looms on the horizon: brain-computer interface (BCI) (sometimes called brain-machine interface, or BMI). BCI technology creates a pathway from the user’s brain to a computer or other device, allowing direct thought communication. Brainwaves are recorded through electrodes (strips of metal that conduct electricity) attached to a person’s scalp or implanted in the brain. BCI technology has been in development since the end of the 1960’s, when the idea of using only one’s mind to control a device was barely more than fantasy. Progress was slow, but today that fantasy is at last becoming reality.

BCI allows users to command devices without using their hands or voices. Some people who suffer from paralysis or other immobility-inducing conditions have already benefited from BCI technology. These individuals have gained the ability to move and control external devices using only their minds, which allows them a greater degree of independence. In February 2017, physiatrists (physical medicine and rehabilitation physicians) from the Hunter Holmes McGuire VA Medical Center in Richmond, Virginia, announced the development of a BCI device for controlling movement in a prosthetic knee. A simple prosthetic knee normally requires manual unlocking to bend, but this device allows the patient to bend the prosthesis using only the mind.

In March 2017, a team of biomedical researchers in Cleveland, Ohio, successfully restored limited arm movement to a quadriplegic man by implanting tiny electrodes directly onto his motor cortex, a region of the brain that controls muscle movement. These electrodes connected to a device affixed to the man’s arm. After some training and practice, he was able to move his shoulder, elbow, wrist, and fingers. BCI technology is also being developed to allow individuals with locked-in syndrome to communicate with the outside world. Locked-in syndrome is characterized by the complete paralysis of voluntary muscles, except in some cases for the muscles that control eye movement.

Medical BCI milestones caught the attention of people in California’s Silicon Valley, and leading technology companies are exploring BCI’s potential for commercial use. In April, Facebook executives announced a project to create a wearable BCI device that would allow the wearer to compose words directly from the brain. CEO Mark Zuckerberg expects BCI “typing” to be up to five times faster than manual finger typing.

In March 2017, South Africa-born entrepreneur Elon Musk announced the establishment of Neuralink, a new BCI development company. The initial goal of Neuralink is to improve the lives of immobile and brain-damaged patients through the development of brain implants. Musk added, however, that the ultimate goal of the company is for BCI to improve human cognition—a goal meant to keep humanity from becoming obsolete in the face of ever-advancing artificial intelligence.

Elon Musk founded Neuralink, a BCI development company, in March 2017. Credit: NASA

A related start-up company called Kernel (founded by U.S. entrepreneur Bryan Johnson) aims to create cutting-edge BCI devices. Kernel is currently focused on improving knowledge of the human brain in hopes of augmenting it in the coming decades. Kernel and Neuralink have the same end goal: to allow humanity to compete with advanced machines by merging biological and digital intelligence and application.

Some people question the necessity of advanced BCI technology, and others dismiss such neural ambitions as science fiction. But many people predict that—in the not-too-distant future—artificial intelligence will surpass that of humans, much as human intelligence surpassed that of other animals. Musk envisions a future in which people use BCI technology to connect quickly to databases, servers, and even to one another, somewhat leveling the competition with artificial super-intelligence.

Neuroscientists disagree on how soon people can expect to be able to type directly from their brains or use telepathic devices to communicate with one another. Many do agree, however, that it is no longer a matter of if, but when humanity’s next evolutionary stage will come to fruition. Years in the future, we may look back at 2017 as a turning point in the evolution of brain-computer interface.

Tags: artificial intelligence, bci, brain-computer interface, computers, technology
Posted in Business & Industry, Current Events, People, Technology

Scientists and Tech Leaders Warn of Artificial Intelligence Risks

Tuesday, January 13th, 2015

Will systems and robots featuring artificial intelligence (AI) help us to better live in our world or rule our world? Credit: © Sean Gallup, Getty Images/Thinkstock

On January 11, a number of prominent thinkers released an open letter urging society to consider the steadily increasing capabilities of artificial intelligence—and to control its potential risks. Artificial intelligence, or AI, is a broad field of computer research that aims to imitate the capacity human beings have for intuition, problem-solving, and learning. The letter’s signers include Elon Musk—the South African entrepreneur who developed PayPal, Tesla Motors, and SpaceX—and Stephen Hawking, the British theoretical physicist who modernized our understanding of black holes.

The threat that artificial intelligence could conquer and oppress humanity has long been a staple of science fiction. Early science-fiction writers even came up with ways to decrease such a threat. For example, in the 1942 short story “Runaround” by American author Isaac Asimov, robots in a future world are programmed to follow laws designed to prevent them from injuring human beings, even if ordered to do so by humans. AI systems today, however, do not resemble the robots from science fiction.

The open letter, along with its attached research priorities document, gives an update on the current state of AI research. It notes that advanced AI systems are already widely distributed and safely used in many forms of technology, including Internet search engines, speech-recognition programs, and automobile systems. The letter also describes the many potential benefits of more advanced AI, and it disputes the notion that truly powerful AI is impossible, citing similarly incorrect predictions once made about nuclear energy and interplanetary travel.

Imminent threats from real-world AI could involve physical attacks from such “killer robots” as autonomous armed drones (pilotless aircraft). But AI threats also include economic disruption from labor automation, wherein workers are replaced by robotic machines and are unable to find other work, and medical harm, as when doctors rely on flawed AI systems to diagnose and treat people with diseases. The letter urges researchers, lawmakers, and technologists to work together to consider how to prevent such systems from causing harm. For example, new laws might require autonomous AI systems to include a human decision-maker “in the loop.”

World Book articles and links:

  • Computer
  • Open Letter on Artificial Intelligence

Tags: artificial intelligence
Posted in Business & Industry, Current Events, Economics, Government & Politics, Science, Technology
