As NASA’s Double Asteroid Redirection Test (DART) marches on toward a collision with asteroid Dimorphos, researchers are not waiting for its results to save Earth from a potential doomsday event. Some even suggest that new research indicates that we already have the technology to prevent a “Don’t Look Up” type disaster.
Part of the appeal of Fridays is looking forward to what you’re going to do over the weekend (at least for those who typically have Saturdays and Sundays off). Maybe you start off thinking that you’ll take some time to relax, but after an hour or two on the couch or in bed (or wherever you go to unwind), a feeling of…
Guardians of the Galaxy is typically credited with Dave Bautista’s rise to cinematic stardom, but his short turn in Blade Runner 2049 is what drew attention to the actor’s dramatic chops. Thanks to 2049 (and its short film Nowhere to Run), Bautista’s since gone on to show up in pretty good genre work like Dune and …
Apple has introduced a new feature that could help declutter the App Store somewhat. Per support documentation spotted by MacRumors, the marketplace now supports unlisted apps that users can only access through a direct link. Should a developer feel their software isn’t suited for public use, they can make a request through Apple’s website to distribute it as an unlisted app.
If the company grants the request, the app won’t appear “in any App Store categories, recommendations, charts, search results or other listings,” according to Apple. Outside of a direct link, it’s possible to access unlisted apps through Apple’s Business and School Manager platforms.
The company suggests that the new distribution method is ideal for apps that were designed for specific organizations, special events, research studies and other similar use cases. It notes, however, that unlisted apps aren’t a replacement for its TestFlight process since it will decline software that’s in a pre-release or beta state.
Otherwise, Apple notes it will consider both new and existing apps. Once an app is approved, its status as an unlisted app will apply to any future versions of the software a developer may release. In the case of any existing apps, their App Store link will remain the same.
Blizzard recently hinted at a new survival game project, and its newly unionized developers are super excited about it. This may well be the most bizarre non-announcement of a new title in the history of gaming, but considering the parties involved it’s ripe for discussion. You see, the company has framed the “announcement” through the lens
NBC News analyzed data from 8,892 weather stations with records going back at least 30 years. Of those, 691 recorded their highest temperature ever in 2021.
And there’s more cause for concern:
Each January, the National Oceanic and Atmospheric Administration, NASA and the European Union Earth observation agency Copernicus publish reports on the previous year’s temperature data. Copernicus ranked 2021 as the fifth-hottest year since 1850, while NOAA and NASA ranked it as the sixth-hottest since 1880…
In 2021, as Europe recorded its hottest summer, June’s weather anomalies in North America were so significant that the continent recorded its hottest June in 171 years, according to the January Copernicus report. The record-breaking heat was even more notable, scientists say, given that 2021 was a La Niña year, in which climate patterns in the Pacific Ocean produce cooler temperatures across the globe.
An August 2021 United Nations Intergovernmental Panel on Climate Change report concluded that climate change caused by humans “is already affecting many weather and climate extremes in every region across the globe.” Friederike Otto [senior lecturer in climate science at the Grantham Institute for Climate Change and the Environment in London who helped write the report] said that last year’s weather events proved 2021 was “a year that made the evidence unavoidable.”

Scientists say damaging spring frosts — such as the one that destroyed winemakers’ crops in France last April — are an example of a weather event that is more likely in a warming world. Denis Lesgourgues, co-owner of Château Haut Selve, a vineyard in southwest France, lost 60 percent of his crop during last year’s spring freeze. Warmer winters have caused grapevine buds to grow earlier in the year, leaving them vulnerable to previously harmless early spring frosts. Lesgourgues said that now if the buds are out when the frosts hit, they die and are unable to grow grapes….
In other parts of the world, the increased heat can become a matter of life or death. In Portland, the June heat wave sent temperatures up to 116 degrees, shattering heat records by as much as 9 degrees Fahrenheit (5 degrees Celsius) and killing hundreds of people in the region.
It’s the weekend, which means it’s time for another Dealmaster. Our latest roundup of the best deals from around the web includes a number of sales for Lunar New Year across video game storefronts. Steam looks to have the most sweeping selection of deals for PC gamers, but the Epic Games Store is running its own sale as well, and some of the deals from those promotions are also available at competing stores like GOG and Humble. And on the console side, Microsoft has discounted several Xbox games for the occasion.
In general, these sales aren’t quite as extensive as the ones we saw around the holidays, but they still include several notable discounts on games we like. Past Ars game of the year winners Psychonauts 2, Hades, and Celeste are all available for less than their average going rates, as are several of the lesser-hyped gems we recommended during Steam’s summer sale last year. We’ve noted a few more highlights—including deals on Half-Life: Alyx, Untitled Goose Game, and Halo Infinite, among others—below.
End dates for these sales vary, but Steam says its promotion will end on February 3, while Epic says its sale will wrap up on February 10. If you shop through the latter, note that you can use the $10 coupon offer the company has rolled out for previous sales—but you’ll need to sign up for Epic’s emails and alerts program (or already be signed up) to access it. If you can live with that, you’ll get a coupon that’ll take an additional $10 off the deals already in place, provided your cart totals $14.99 or more. This means you could get, say, Hades for $6 instead of its current discounted price of $16. Unlike the Epic sale we saw last month, however, this coupon is one-time use only.
The new year is well underway, and we’ve already started testing out a wide variety of gadgets, devices and components. This week, Devindra Hardawar played with NVIDIA’s RTX 3050 and deemed it a great deal — if it stays at its original price. Steve Dent shot with Sony’s new A7 IV camera to test out its autofocus, video and image quality improvements, while Billy Steele spent time with both the Jabra Elite 4 Active earbuds and the Shure Aonic 40 over-ear headphones.
Though the RTX 3050 is supposed to be an affordable way to hit 1080p/60fps while gaming, the $250 GPU may wind up costing considerably more due to demand and chip shortages. Devindra Hardawar says the graphics card, which is the lowest priced NVIDIA GPU to also feature ray tracing, is a fantastic component with 2,560 CUDA cores, 8GB of GDDR6 RAM and a boost speed of 1,777 MHz.
Devindra says the RTX 3050 tackled everything he threw at it during testing, averaging 140fps in Hitman 3’s Dubai benchmark. He was impressed at how well the card handled demanding games like Control, where it reached 65fps on average in 1080p (without ray tracing). It also stayed surprisingly quiet and cool at 60 degrees Celsius. While Devindra says it’s best suited to 1080p gaming, he also says it’s an absolute steal – if the price stays low.
With a higher resolution 33-megapixel sensor, improved video capabilities, and an updated autofocus, Steve Dent found a lot to like about the Sony A7 IV mirrorless camera. He even called the hybrid shooter a near-perfect package, and he particularly liked its sturdy grip, precise buttons and dials and the fully articulated rear touch display. Sony’s well-organized menu system also made it easy for him to navigate through the controls.
The main drawback on this camera is a rolling shutter issue: Steve says that while shooting silently in electronic shutter mode, the camera needs to be kept steady and subject movement smooth, or artifacts like slanted lines will appear in the shots. However, he was quite impressed with the AI autofocus features, which made the A7 IV easy to use and the most reliable camera he’s tested. At $2,500 the A7 IV is more expensive at launch than previous models, but Steve says the improvements in image quality, video and color science make the A7 IV another winner in Sony’s camera lineup.
Jabra’s Elite 4 Active true wireless earbuds continue the company’s trend of offering small earbuds with a wide array of hands-free features. With an IP57 rating, they’re also more useful during workouts, and Billy Steele says their small size makes them more comfortable to wear as well. The new model includes features often seen in pricier models, like HearThrough, SideTone and Find My, most of which are adjustable in the companion mobile app.
Billy found the Elite 4 Active delivered good but not great sound quality – while they provided decent clarity and nice detail overall, they lacked a wider soundstage and depth. However, he said the call quality was slightly better than most earbuds thanks to the reduced background noise. Battery tests showed that the Elite 4 Active buds lasted a little over seven hours — enough to get through most of a work day — and the $120 price is competitive.
Billy Steele was pleased to see that Shure didn’t make too many compromises when it came to the Aonic 40 over-ear headphones. Made from aluminum alloy and glass-filled nylon, the cans are easy to fold flat for traveling and have physical buttons for on-board controls, but Billy says he found them a bit uncomfortable to wear for long periods of time. He was more impressed by the Shure app, which provides a robust equalizer, plenty of adjustable settings and the ability to make your own presets.
Billy says that while the Aonic 40 offers punchy bass and clarity across volume levels, the soundstage isn’t wide open and songs lack the immersive depth often heard with other headphones. The noise cancellation and ambient sound modes were only decent, but the call quality was above average. The $249 headphones particularly excelled during battery testing, where they lasted over 30 hours. Overall, Billy says the Aonic 40s are a relatively affordable option but lack polish on the finer details.
Even if you know nothing else about baseball, you probably are aware that Babe Ruth is one of the game’s most famous players. Sure, if you grew up in the 1990s, then there’s a chance you were introduced to The Babe—aka the Sultan of Swat, the Titan of Terror, the Colossus of Clout, the King of Crash, the Great Bambino…
Slashdot reader sciencehabit quotes Science magazine: Imagine using any object around you—a frying pan, a glass paperweight—as the central processor in a neural network, a type of artificial intelligence that loosely mimics the brain to perform complex tasks. That’s the promise of new research that, in theory, could be used to recognize images or speech faster and more efficiently than computer programs that rely on silicon microchips.
To demonstrate the concept, the researchers built neural networks in three types of physical systems, which each contained up to five processing layers. In each layer of a mechanical system, they used a speaker to vibrate a small metal plate and recorded its output using a microphone. In an optical system, they passed light through crystals. And in an analog-electronic system, they ran current through tiny circuits.
In each case, the researchers encoded input data, such as unlabeled images, in sound, light, or voltage. For each processing layer, they also encoded numerical parameters telling the physical system how to manipulate the data. To train the system, they adjusted the parameters to reduce errors between the system’s predicted image labels and the actual labels. In one task, they trained the systems, which they call physical neural networks (PNNs), to recognize handwritten digits. In another, the PNNs recognized seven vowel sounds. Accuracy on these tasks ranged from 87% to 97%, they report in this week’s issue of Nature.

In the future, researchers might tune a system not by digitally tweaking its input parameters, but by adjusting the physical objects—warping the metal plate, say. The team is most excited about PNNs’ potential as smart sensors that can perform computation on the fly. A microscope’s optics might help detect cancerous cells before the light even hits a digital sensor, or a smartphone’s microphone membrane might listen for wake words. These “are applications in which you really don’t think about them as performing a machine-learning computation,” they say, but instead as being “functional machines.”
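The training loop described above — encode the input physically, measure the system’s output, then nudge the layer parameters to reduce label error — can be sketched in a few lines. This is a minimal illustration, not the authors’ code: a fixed nonlinear function stands in for the vibrating plate or optical crystal, and the parameter update uses finite differences, mimicking how you might tune a physical system whose gradients you can only observe by perturbing it.

```python
import random

# Toy stand-in for a physical layer: in the paper this would be a vibrating
# metal plate or a crystal; here a fixed nonlinear function plays that role.
def physical_layer(inputs, params):
    return [max(0.0, x * p + 0.1 * x * x) for x, p in zip(inputs, params)]

def forward(sample, params):
    # One "processing layer"; the paper's systems stacked up to five.
    return sum(physical_layer(sample, params))  # scalar class score

def loss(data, params):
    # Squared error between the system's scores and the true labels.
    return sum((forward(x, params) - y) ** 2 for x, y in data)

def train(data, params, lr=0.05, eps=1e-3, steps=200):
    # Finite-difference training: perturb each parameter slightly,
    # observe how the (simulated) physical system's error responds,
    # and step the parameter downhill.
    for _ in range(steps):
        for i in range(len(params)):
            base = loss(data, params)
            params[i] += eps
            grad = (loss(data, params) - base) / eps
            params[i] -= eps + lr * grad  # undo perturbation, then descend
    return params

random.seed(0)
# Two toy "images" with labels 1.0 and 0.0 in place of digits or vowels.
data = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)]
params = [random.uniform(-0.5, 0.5) for _ in range(2)]
params = train(data, params)
```

After training, the simulated system maps the first pattern close to 1.0 and the second close to 0.0, the same measure-and-adjust loop the researchers ran against their speakers, crystals, and circuits.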
Netflix’s love affair with Masters of the Universe isn’t about to cool down any time soon. The streaming service is partnering with Mattel to develop a live-action Masters of the Universe movie — no, they weren’t put off by the 1987 flop. Production is expected to start this summer, with the Nee brothers (who directed the upcoming The Lost City) co-directing the title and writing it alongside Shang-Chi’s David Callaham.
The companies haven’t divulged much about the plot, but they’ve already chosen Kyle Allen (Balkan in West Side Story) as Prince Adam/He-Man. Not surprisingly, there are hints Adam will discover his power as He-Man and fight Skeletor to protect Eternia.
This isn’t a surprising move, given how lucrative MOTU has been for Netflix. Its She-Ra reboot ran for five seasons, and Kevin Smith’s Masters of the Universe: Revelation is starting its second season in March. There’s also a child-oriented CG animated series, He-Man and the Masters of the Universe. Between this and other ’80s flashbacks, Netflix clearly knows what makes its nostalgia-minded audience tick.
The Silent Era of cinema was perhaps its most equitable, with hearing and hearing-impaired viewers alike able to enjoy productions alongside one another, but with the advent of “talkies,” deaf and hard-of-hearing Americans found themselves largely excluded from the new dominant entertainment medium. It wouldn’t be until the second half of the 20th century that advances in technology enabled captioned content to be broadcast directly into homes around the country. In his latest book, Turn on the Words! Deaf Audiences, Captions, and the Long Struggle for Access, Harry G. Lang, professor emeritus at the National Technical Institute for the Deaf at Rochester Institute of Technology, documents the efforts of accessibility pioneers over the course of more than a century to bring closed captioning to the American people.
To the millions of deaf and hard of hearing people in the United States, television before captioning had been “nothing more than a series of meaningless pictures.” In 1979, Tom Harrington, a twenty-eight-year-old hard of hearing audiovisual librarian from Hyattsville, Maryland, explained that deaf and hard of hearing people “would like to watch the same stuff as everyone is watching, no matter how good or how lousy. In other words, to be treated like everyone else.”
On March 16, 1980, closed captioning officially began on ABC, NBC, and PBS. The first closed captioned television series included The ABC Sunday Night Movie, The Wonderful World of Disney, and Masterpiece Theatre. In addition, more than three decades after the movement to make movies accessible to deaf people began, ABC officially opened a new era by airing its first closed captioned TV movie, Force 10 from Navarone.
By the end of March 1980, sixteen captioned hours of programming were going out over the airwaves each week, and by the end of May, Sears had sold 18,000 of the decoding units within four months of offering them for sale. Sears gave NCI an $8 royalty for each decoding device sold. The funds were used to defray the costs of captioning. In addition to building up a supply of captioned TV programs during its first year of operation, so that a sufficient volume would be available for broadcast, NCI concentrated on training caption editors. A second production center was established in Los Angeles and a third in New York City.
John Koskinen, chairman of NCI’s board, reflected on the challenges the organization faced at this time. A much smaller market for the decoders was evident than that estimated through early surveys. As with the telephone modem that was simultaneously developing, the captioning decoders cost a significant sum for most deaf consumers in those days, and the expense of a decoder did not buy a lot because not all the captioned hours being broadcast were of interest to many people. Although the goal was to sell 100,000 decoders per year, NCI struggled to sell 10,000, and this presented a financial burden.
To help pay for the captioning costs, NCI also set up a “Caption Club” to raise money from organizations serving deaf people and from other private sources. By December 1983, $15,000 was taken in and used to pay for subtitles on programs that otherwise would not be captioned. By 1985, there were 3,500 members promoting the sales.
Interestingly, when sales suddenly went up one year, NCI investigated and found that the Korean owner of an electronics store in Los Angeles was selling decoders as a way to enhance English learning.
The next big breakthrough was the move toward the use of digital devices recently adopted by court reporters that, for NCI, allowed the captioning of live television. Having the ability to watch the evening news and sporting events with captions made the purchase of a decoder more attractive, as did the decline in its price over time.
When the American television network NBC showed the twelve hour series Shogun in 1980, thousands of deaf people were able to enjoy it. The $20 million series was closed captioned and 30,000 owners of the special decoder sets received the dialogue.
Jeffrey Krauss of the FCC admitted that deaf people had not had full access to television from the very beginning: “But by early 1980 it should be possible for the deaf and [hard of hearing] to enjoy many of the same programs we do via a new system called ‘closed captioning.’” Sigmond Epstein, a deaf printer from Annandale, Virginia, felt that “there is more than a 100 percent increase in understanding.” And Lynn Ballard, a twenty-five-year-old deaf student from Chatham, New Jersey, believed that closed captioning would “improve the English language skills and increase the vocabulary of deaf children.” Newspaper reports proliferated, describing the newfound joy among deaf people in gaining access to the common television. Educators recognized the technological advance as a huge leap forward. “I consider closed captioning the single most important breakthrough to give the deaf access to this vital medium,” said Edward C. Merrill Jr., president of Gallaudet College, adding presciently, “Its usage will expand beyond the hearing-impaired.” And an ex-cop cried when his deaf wife wept for joy at understanding Barney Miller. He wrote a letter to the TV networks, cosigned by their six small children, to tell of the new world of entertainment and learning now open to his wife.
3-2-1 Contact was among the first group of television programs, and the first children’s program, to be captioned in March 1980. This science education show produced by Children’s Television Workshop aired on PBS member stations for eight years. Later that same year, Sesame Street became the second children’s program to be captioned and became the longest running captioned children’s program. — “NCI Recap’d,” National Captioning Institute
The enthusiasm continued to spread swiftly among deaf people. Alan Hurwitz, then associate dean for Educational Support Services at NTID, and his family were all excited about the captioning of primetime television programs. Hurwitz, who would eventually be president of Gallaudet University, was, like everyone else at this time, hooked on the new closed captioning technology. One of his favorite programs in 1981 was Dynasty, which was shown weekly on Wednesday night at 9 p.m. He flew to Washington, DC, early one Wednesday morning to meet with congressional staff members in different offices all day long. Not having a videotape recorder, he made sure he had scheduled a flight back home in time to watch Dynasty. After the meetings he arrived at the airport on time only to find out that the plane was overbooked and he was bumped off and scheduled for a flight the next morning. He panicked and argued with the airline clerk that he had to be home that night, and stressed that he couldn’t miss the flight. He was put on a waiting list and there were several folks ahead of him. Then, when he learned that he would definitely miss the flight, he went back to the clerk and insisted that he get on the plane. He explained that he had no way to contact his wife and was concerned about his family. Finally, the clerk went inside the plane and asked if anyone would like to get off and get a reward for an additional flight at no cost. One passenger volunteered to get off and Hurwitz was allowed to take his seat. The plane left a bit late and arrived in Rochester barely in time for him to run to his car in the parking lot and drive home to watch Dynasty!
And even with the positive response from many consumers, it was reported in 1981 that the Sears TeleCaption decoders were not selling well. It was a catch-22 situation. “People hesitate to buy because more programs aren’t captioned; more programs aren’t captioned because not that large an audience has adapters.” Increasing one would clearly increase the other. The question was whether to wait for “the other” to happen. To do so would most likely endanger a considerable federal investment as well as the continued existence of the system. Some theorized that the major factors for the poor sale of decoders were the depressed state of the economy, the lack of a captioned prime-time national news program (which deaf and hard of hearing people cited as a top priority), insufficient numbers of closed captioned programs, and an unrealistic expectation by some purchasers that decoder prices would decrease in spite of the fact that the retailer markup was slightly above the actual production cost.
Captioning a TV Program: A Continuing Challenge
On average, it took twenty-five to forty hours to caption a one-hour program. First, the script was typed verbatim, including every utterance such as “uh,” stuttering, and so forth. Asterisks were inserted to indicate changes in speakers. Next, the time and place of the wording were checked in the program. The transcript was examined for accuracy, noting when the audio started and stopped, and then it was necessary to decide whether the captions should be placed on the left, right, or center of the screen. In 1981, NCI’s goal was to provide no more than 120 to 140 reading words per minute for adult programs and sixty to ninety for children’s programs.
“We have to give time for looking at the picture,” Linda Carson, manager of standards and training at NCI, explained. “A lot of TV audio goes up to 250 or 300 words per minute. That’s tough for caption writers. If the time lapse for a 15-word sentence is 4 ½ seconds, then the captioner checks the rate computation chart and finds out she’s got to do it in nine words.”
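Carson’s rate computation can be reproduced directly: the caption’s word budget is simply the target reading rate multiplied by the length of the audio window. The helper below is a hypothetical illustration of what NCI’s rate chart encoded, not its actual tooling.

```python
def caption_word_budget(seconds, target_wpm=120):
    """Maximum words a caption may contain so viewers can read it at the
    target rate. NCI's 1981 ceiling was 120-140 wpm for adult programs."""
    return int(seconds * target_wpm / 60)

# Carson's example: a 15-word sentence spoken over 4.5 seconds runs at
# 200 wpm, so at 120 wpm the captioner gets 120 * 4.5 / 60 = 9 words.
print(caption_word_budget(4.5))  # 9
```

The same function with `target_wpm=90` gives the upper bound NCI used for children’s programming.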
Carl Jensema, NCI’s director of research, who lost his hearing at the age of nine, explained that at the start of kindergarten, hearing children have about 5,000 words in their speaking vocabulary, whereas many deaf children are lucky to have fifty. Consequently, deaf children had very little vocabulary for the school to build on. Jensema believed that closed captioning might be the biggest breakthrough for deaf people since the hearing aid. He was certain that a high degree of exposure to spoken language through captioned television was the key to enhanced language skills in deaf people.
CBS Resists
Although ABC, PBS, and NBC were involved in collaborating with NCI to bring captions to deaf audiences, the system CBS supported, teletext, was developed in the United Kingdom and was at least three years away from implementation. “It seems to me that CBS, by not going along with the other networks, might be working in derogation of helping the deaf or the hearing-impaired to get this service at an earlier date—and I don’t like it,” FCC commissioner Joseph Fogarty told Gene Mater, assistant to the president of the CBS Broadcast Group. Despite the success of line 21 captioning, CBS’s Mater believed the teletext system was “so much better” and the existing system was “antiquated.” “I think what’s unfortunate is that the leadership of the hearing-impaired community has not seen fit to support teletext. Those people who have seen teletext recognize it as a communications revolution for the deaf.” In contrast, NCI’s Jeff Hutchins summarized that the World System Teletext presented various disadvantages. It could not provide real-time captioning, “at least not in the way we have seen it . . .” Also, it could not work with home videotape. He believed that even if World System Teletext were adopted by the networks and other program suppliers, the technology would not be an answer for the needs of the American Deaf community. He also explained that “too many services now enjoyed by decoder owners would be lost.”
CBS even petitioned the FCC in July 1980 for a national teletext broadcasting standard. Following this, the Los Angeles CBS affiliate announced plans to test teletext in April 1981. “CBS was so opposed to line 21 that even when advertisers captioned their commercials at no charge to CBS,” Karen Peltz Strauss wrote, “the network allegedly promised to strip the captions off before airing the ads.”
CBS continued its refusal to join the closed captioning program, largely because of its own research into the teletext system and because of the comparatively low number of adapters purchased. The NAD accused CBS of failing to cooperate with deaf television viewers by refusing to caption its TV programs.
The NAD planned nationwide protests shortly after this. Hundreds of captioning activists gathered at studios around the country. In Cedar Rapids, one young child carried a sign that read, “Please caption for my Mom and Dad.” Gertie Galloway was one of the disappointed deaf consumers. “CBS has not cooperated with the deaf community,” she stated. “We feel we have a right to access to TV programs.” She was one of an estimated 300 to 400 people carrying signs who marched in front of the CBS studio in Washington and asked supporters to refuse to watch CBS for the day. Similar demonstrations were held in New York, where 500 people picketed, and the association said that protests had been scheduled in the more than 200 communities where CBS had affiliates.
Harold Kinkade, the Iowa Association of the Deaf vice president, said, “I don’t think deaf people are going to give up on this one. We always fight for our rights to be equal with the people with hearing.”
The drama increased in August 1982 when it was announced that NBC was dropping captions due to decreased demand. It was two years after NBC had become a charter subscriber. John Ball, president of NCI, said, “There is no question that this hurts. This was a major revenue source for NCI. I think the next six months or so are going to be crucial for us.”
Captioning advocates included representatives from NTID, the National Fraternal Society of the Deaf, Gallaudet, and NAD. Karen Peltz Strauss tells the story of Phil Bravin, chair of a newly established NAD TV Access Committee, who represented the Deaf community in a meeting with NBC executives. Although the NBC meeting was successful, CBS was still resisting and Bravin persisted. As Strauss summarized, “After one particularly frustrating three-hour meeting with the CBS President of Affiliate Relations Tony Malara, Bravin left, promising to ‘see you on the streets of America.’”
In 1984, CBS finally gave in, and the network dual encoded its television programs with both teletext and line 21 captions. The issue with NBC was also resolved, and by 1987 the network was paying a third of the cost of the prime-time closed captioning. The rest was covered by such sources as independent producers and NCI, with funds from the US Department of Education used for captioning on CBS and ABC as well.
In his book Closed Captioning: Subtitling, Stenography, and the Digital Convergence of Text with Television, Gregory J. Downey summarized that because the film industry was unwilling to perform same-language subtitling for its domestic audience, the focus of deaf and hard of hearing persons’ “educational and activist efforts toward media justice through subtitling in the 1970s and 1980s had decisively moved away from the high culture of film and instead toward the mass market of television.”
Meanwhile, teachers and media specialists in schools for deaf children across the United States were reporting that their students voluntarily watched captioned TV shows recorded on videocassettes over and over again. These youngsters were engaged in reading, with its many dimensions and functions. In the opinion of some educators, television was indeed helping children learn to read.
People at NCI looked forward to spin-offs from their efforts. They liked to point out that experiments on behalf of deaf people produced the telephone and that the search for a military code that could be read in the dark led to braille. Closed captioning should be no different in that regard. The technology also showed promise for instructing hearing children in language skills. Fairfax County public schools in Virginia authorized a pilot project to study the effectiveness of captioned television as a source of reading material. The study explored the use of closed captioned television in elementary classrooms, evaluated teacher and student acceptance of captioning as an aid to teaching reading, and served as a guide to possible future expansion of activities in this area. Instead of considering television part of the problem in children’s declining reading and comprehension skills, Fairfax County wanted to make it part of the solution. Promising results were found in this study as well as in other NCI-funded studies with hearing children, and when NCI’s John Ball submitted his budget request to Congress for fiscal year 1987, he cited “at least 1,500,000 learning disabled children” as a potential audience for captioning and the market for decoder purchases.
In a personal tribute to Carl Jensema, Jeff Hutchins wrote that the only aspect of NCI that really made it an “institute” was the work Carl did to research many different aspects of captioning, including its readability and efficacy among consumers. His work led to a revision of techniques, which made captioning more effective. Once Carl left NCI and the research department was shut down, NCI was not really an “institute” any longer. John Ball also believed in the importance of Jensema’s research at NCI. His studies clearly demonstrated the impact of captioning on NCI’s important audience.
Real-Time Captioning
As early as 1978, the captioning program began to fund developmental work in real-time captioning with the objective of making it possible to caption live programs, such as news, sports, the Academy Awards, and space shuttle launches. This developmental work, however, was not what produced the system that was finally used. The Central Intelligence Agency (CIA) had been exploring a system that would allow the spoken word to appear as printed text. As it turned out, a private concern, Stenocomp, grew out of the CIA project and marketed computer translations to court reporters. The Stenocomp system relied on a mainframe computer and was thus too cumbersome. When Stenocomp went out of business, however, a new firm, Translation Systems, Inc. (TSI) of Rockville, Maryland, took its place. Advances in computer technology made it possible to install the Stenocomp software on a minicomputer, which in turn allowed NCI to begin real-time captioning using a modified stenotype machine linked to a computer via a cable.
On December 20, 1982, the Ninety-Seventh Congress passed a joint resolution authorizing President Ronald Reagan to proclaim December as “National Closed-Captioned Television Month.” The proclamation was in recognition of the NCI service that made television programs meaningful and understandable for deaf and hard of hearing people in the United States.
By 1982, NCI was applying real-time captioning to a variety of televised events, including newscasts, sports events, and other live broadcasts, bringing deaf households into national conversations. The information, with correct punctuation, was brought to viewers through the work of stenographers trained as captioners typing at speeds of up to 250 words per minute. Real-time captioning was used in the Supreme Court to allow a deaf attorney, Michael Chatoff, to understand the justices and other attorneys.
For many years, however, such fidelity was not achieved on television, and real-time captioning had its problems. In real-time captioning, an individual typed the message into an electric stenotype machine, similar to those used in courtrooms, rendering the message partly in shorthand. A computer translated the words into captions, which were then projected on the screen. Because “this captioning occurred ‘live’ and relies on a vocabulary stored in the software of the computer, misspellings and errors could and did occur during transcriptions.”
Over the years, many have worked toward error reduction in real-time captioning. As the Hearing Loss Association of America has summarized, “Although real-time captioning strives to reach 98 percent accuracy, the audience will see errors. The caption writer may mishear a word, hear an unfamiliar word, or have an error in the software dictionary. In addition, transmission problems can create technical errors that are not under the control of the caption writer.”
At times, captioners work in teams, similar to some sign language interpreters, and provide quick corrections. This was the approach the pioneer Martin Block used during the Academy Awards in April 1982. Block typed the captions while a team of assistants provided him with correct spellings of the award nominees.
There has also been a growing body of educational research supporting the benefits of captions. As one example, E. Ross Stuckless referred to the concept of real-time caption technology in the early 1980s as the “computerized near-instant conversion of spoken English into readable print.” He also described the possibility of using real-time captioning in the classroom. Michael S. Stinson, another former colleague of mine and also a deaf research faculty member at NTID at RIT, was involved with Stuckless in the first implementation and evaluation of real-time captioning as an access service in the classroom. Stinson subsequently obtained numerous grants to develop C-Print access through real-time captioning at NTID, where hundreds of deaf and hard of hearing students have benefited in this postsecondary program. C-Print also was found successful in K–12 programs.
Communication Access Real-Time Translation (CART) is another service provided in a variety of educational environments, including small groups, conventions, and remote transmissions to thousands of participants viewing through streaming text. Displays include computers, projection screens, monitors, or mobile devices, or the text may be included on the same screen as a PowerPoint presentation.
Special approaches have been used in educational environments. For example, at NTID, where C-Print was developed by Stinson, the scripts of the classroom presentations and communication between professors and students are printed out, and errors are corrected and given to the students to study.
In October 1984, ABC’s World News This Morning became the first daytime television program to be broadcast to viewers with decoders through real-time captioning technology. Within a few weeks, ABC’s Good Morning America was broadcast with captions as well. “This is a major milestone in the evolution of the closed-captioned television service,” John E. D. Ball declared, describing it as a “valued medium” for deaf and hard of hearing viewers. Don Thieme, a spokesman for NCI, explained that the Department of Education had provided The Caption Center with a $5.3 million contract. These two programs joined ABC’s evening news program World News Tonight and the magazine show 20/20 as the only regularly scheduled news and public affairs programming available to deaf viewers. The captioned news programs would be phased in gradually during the summer and early fall. Real-time captioning was also provided for the presidential political debates around this time. More than sixty-five home video movies had also been captioned for deaf people. This was an important step toward providing more access to entertainment movies for deaf consumers.
The first time the Super Bowl was aired with closed captions was on January 20, 1985. In September 1985, ABC’s Monday Night Football became the first sports series to include real-time captioning of commentary. ABC, its affiliates, the US Department of Education, advertisers, corporations, program producers, and NCI’s Caption Club helped to fund this program. Using stenotype machines, speed typists in Falls Church, Virginia, listened to the telecast and produced the captions at about 250 words per minute, and the words appeared on the screen about four seconds later. Each word was not typed out letter by letter. Instead, the captioner stroked the words out phonetically in a type of shorthand, and a computer translated the strokes back into the printed word. These words were sent through phone lines to the ABC control room in New York City, where they were added to the network signal and transmitted across the country. Darlene Leasure, who was responsible for football, described one of the challenges she encountered: “When I was programming my computer at the beginning of the season, I found thirteen Darrels with seven different spellings in the NFL. It’s tough keeping all those Darrels straight.”
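The pipeline described above reduces, at its core, to a dictionary lookup: phonetic steno strokes come in, English words go out, and anything missing from the stored vocabulary shows up on screen in raw phonetic form. The following is a minimal sketch in Python; the stroke spellings and the tiny dictionary are invented for illustration, and a working steno theory would contain tens of thousands of entries.

```python
# A toy steno-to-caption translator. The strokes below are hypothetical
# examples, not entries from any real stenography theory.
STENO_DICT = {
    "PHOPB/TKAEU": "Monday",
    "TPHAOEUT": "night",
    "TPAOT/PWAUL": "football",
}

def translate_strokes(strokes, dictionary=STENO_DICT):
    """Map phonetic steno strokes to English words.

    Unknown strokes fall through untranslated, which is how errors
    reached the screen whenever a name (one of those thirteen
    Darrels, say) was missing from the stored vocabulary.
    """
    return " ".join(dictionary.get(stroke, stroke) for stroke in strokes)

print(translate_strokes(["PHOPB/TKAEU", "TPHAOEUT", "TPAOT/PWAUL"]))
# -> Monday night football
```

This also makes clear why captioners like Leasure pre-programmed their computers before each season: every proper name had to be added to the dictionary in advance, or it would appear on air as an untranslated stroke.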
As TV shows with closed captions grew in popularity, deaf people were attracted away from the captioned film showings at social clubs or other such gatherings. The groups continued to hold their meetings, but for most gatherings the showing of captioned films gradually stopped. At the same time, telecommunications advances had brought telephone access to deaf people and there was less need for face-to-face “live” communication. Together, the visual telecommunications and captioned television technologies profoundly impacted the way deaf people interacted.
The internet’s favorite Simpsons scene, often referred to as “Steamed Hams,” has been recreated in hundreds of strange ways over the years. Now someone has turned the whole thing into a playable point-and-click adventure game, not unlike classics like Monkey Island.
A federal appeals court voted unanimously on Friday to uphold California’s SB-822 net neutrality law, reports The Verge. One year after the Federal Communications Commission repealed net neutrality rules that applied nationwide, the state passed its own set of laws. Those rules barred internet service providers from blocking or throttling select websites and services. However, California could not begin enforcing those laws due to two separate legal challenges.
The first came from the Department of Justice. Under former President Donald Trump, the agency sued the state, arguing its laws were pre-empted by the FCC’s repeal of the Obama-era Open Internet Order. In February 2021, the Justice Department dropped its complaint. Later that same month, a federal judge ruled in favor of the state in a separate lawsuit involving multiple telecom trade groups. This week’s ruling upholds that decision.
In its ruling, the Ninth Circuit Court of Appeals said the lower court “correctly denied” the preliminary injunction brought against California by the telecom industry. It said the FCC “no longer has the authority” to regulate internet services in the way that it did when it previously classified them as telecommunications services. “The agency, therefore, cannot preempt state action, like SB-822, that protects net neutrality,” the court said.
The four trade groups behind the original lawsuit – the American Cable Association, CTIA, the National Cable and Telecommunications Association and USTelecom – said they were “disappointed” by the decision and that they plan to review their options. “Once again, a piecemeal approach to this issue is untenable and Congress should codify national rules for an open Internet once and for all,” the groups told CNBC.
After months of stalemate at the FCC, federal action on net neutrality could come soon. Next week, the Senate Commerce Committee will decide whether to advance Gigi Sohn’s nomination to a full vote of the Senate. President Biden picked Sohn to fill the final empty commissioner seat on the FCC. Her confirmation would give Democrats a three to two edge on the FCC, allowing it to advance the president’s telecom-related policies.
Until now, no Valhall device (Mali-G57, Mali-G78) has run mainline Linux. Whilst this made driver development obviously difficult, there’s no better time to write drivers than before the devices even get into the hands of end-users. Here’s a tale from Alyssa Rosenzweig on how she wrote an open-source GPU driver – without the hardware.
The Flash has been through several speedsters (both fairly good and pretty bad), alien invasions, and universe-ending crises through its lifetime. Lest you think the current eighth season would be the last of the CW’s Scarlet Speedster, you are mistaken, because the show is coming back for season nine.
Last week America’s Internal Revenue Service announced that live-video-feed verification of taxpayers’ faces would be required by this summer to access its online tax services. But now the Washington Post reports that “complaints of confusing instructions and long wait times to complete the sign-up have caused an unknown number to abandon the process in frustration.”
“The $86 million ID.me contract with the IRS also has alarmed researchers and privacy advocates who say they worry about how Americans’ facial images and personal data will be safeguarded in the years to come.”
There is no federal law regulating how the data can be used or shared. While the IRS couldn’t say what percentage of taxpayers use the agency’s website, internal data show it is one of the federal government’s most-viewed websites, with more than 1.9 billion visits last year. The partnership with ID.me has drawn anger from some members of Congress, including Sen. Ron Wyden (D-Ore.), who tweeted that he was “very disturbed” by the plan and would push the IRS for “greater transparency.” Rep. Ted Lieu (D-Calif.) called it “a very, very bad idea by the IRS” that would “further weaken Americans’ privacy.” The Senate Finance Committee is working to schedule briefings with the IRS and ID.me on the issue, a committee aide said…. “No one should be forced to submit to facial recognition as a condition of accessing essential government services,” Wyden said in a separate statement. “I’m continuing to seek more information about ID.me and other identity verification systems being used by federal agencies.”
A Treasury official said Friday that the department was “looking into” alternatives to ID.me, saying Treasury and the IRS always are interested in improving “taxpayers experience….”
About 70 million Americans who have filed for unemployment insurance, pandemic assistance grants, child tax credit payments or other services already have been scanned by the McLean, Va.-based company, which says its client list includes 540 companies; 30 states, including California, Florida, New York and Texas; and 10 federal agencies, including Social Security, Labor and Veterans Affairs…. Equifax, the credit-reporting company that previously confirmed taxpayers’ data for the IRS, had its $7 million contract suspended in 2017 after hackers exposed the personal information of 148 million people…
[ID.me] says 9 of 10 applicants can verify their identity through a self-service face scan in five minutes or less. Anyone who hits a snag is funneled into the backup video-chat verification process…But some who have tried to verify their identities through ID.me for other purposes have reported agonizing delays: cryptic glitches in Colorado, website errors in Arizona, five-hour waits in North Carolina, days-long waits in California and weeks-long benefit delays in New York. The security blogger Brian Krebs wrote last week that he faced a three-hour wait trying to confirm his IRS account, three months before the tax-filing deadline…. The company said it intends to expand its workforce beyond the 966 agents who now handle video-chat verification for the entire country. It has also opened hundreds of in-person identity-verification centers — replicating, in essence, what government offices have done for decades.
The article also points out that advertising is a key part of ID.me’s operation, with people signing up through its website asked if they want to subscribe to “offers and discounts” — though the company stresses people do have to opt in. In addition, the article adds, “If a person is using ID.me to confirm their identity with a government agency, the company will not use that verification information for ‘marketing or promotional purposes,’ the company’s privacy policy says.”
But a senior counsel at the Electronic Privacy Information Center complained to the Post that “We haven’t even gone the step of putting regulations in place and deciding if facial recognition should even be used like this. We’re just skipping right to the use of a technology that has clearly been shown to be dangerous and has issues with accuracy, disproportionate impact, privacy and civil liberties.”
A spokesperson for the U.S. Treasury Department also told Bloomberg News “that any taxpayer who does not want to use ID.me can opt against filing his or her taxes online.”
“We believe in the importance of protecting the privacy of taxpayers, while also ensuring criminals are not able to gain access to taxpayer accounts,” LaManna added, arguing that it’s been “impossible” for the IRS to develop its own cutting-edge identification program because of “the lack of funding for IRS modernization.”
Epic Games has some important allies in its bid to overturn a court ruling that cleared Apple of violating antitrust laws. CNET and FOSS Patents report Microsoft, the Electronic Frontier Foundation and the attorneys general of 35 states have filed briefs supporting Epic’s case with the US Court of Appeals for the Ninth Circuit. The states argued the district court mistakenly claimed the first section of the Sherman Act (a cornerstone of US antitrust law) didn’t apply to unilateral contracts like the terms Apple set for developers. The court also didn’t properly weigh the damage of Apple’s claimed anti-competitive behavior versus the benefits, according to the brief.
Microsoft, meanwhile, noted that it still had reason to be concerned about Apple’s “extraordinary gatekeeper power” despite its size, citing its own interest in maintaining competition and innovation. This included allegedly anti-competitive behavior beyond the rules affecting Epic. Apple’s effective ban on cloud gaming services in the App Store is believed to hurt Xbox Game Pass Ultimate, as an example. Microsoft also disputed the district court’s view that Apple’s in-app payment requirement wasn’t an anti-competitive effort to tie products together.
The EFF, for its part, echoed the states’ concerns about weighing harmful effects while offering parallels to Microsoft’s interpretation of tying. The foundation also said the district court made errors when it presumed customers were fully aware of Apple’s policies when they joined the company’s platform.
Apple remained confident in its chances. In a statement to CNET, the company said it was “optimistic” the district court’s ruling would be upheld and maintained its view that it was providing a “safe and trusted” App Store offering a “great opportunity” for creators. Epic has declined to comment.
Briefs like these won’t guarantee success for Epic — the appeals court isn’t obliged to consider them. This is a strong showing of support, however, and it won’t be surprising if Microsoft, EFF and the states influence the decision. If Epic wins its appeal and doesn’t face further challenges, Apple may have to further reform the App Store.