Apple Intelligence launches today with the release of iOS 18.1
With today’s release of iOS 18.1 (and its brethren platforms) comes the official launch of Apple Intelligence. It’s a momentous occasion for Apple and the industry as a whole, a day with real “today’s the first day of the rest of your life” vibes for everyone—Apple and its users alike.
Predictably lost amidst the handwringing over Apple’s perceived tardiness to the artificial intelligence arena, and over how prepared John Giannandrea and crew are, is the fact that the ramifications of Apple Intelligence’s arrival—imperfect and staggered though it is—on the disability community are immense and cannot be overstated. Equally lost amongst the media and armchair industry watchers is how one can draw a straight line from each facet of Apple Intelligence to some segment of the disability community. Type to Siri not only maintains decorum, it obviously helps those in the Deaf and hard-of-hearing community, as well as those with certain speech impairments. Likewise, the writing tools help those with certain cognitive conditions in addition to reducing motor fatigue and the like. These aren’t trivial or “niche” use cases; Apple Intelligence is billed as “AI for the rest of us,” after all.
Of course, Apple is merely one player amongst many in the AI game. The reason to focus on Apple is obvious: it does accessibility well. Whatever foibles Apple Intelligence has in acumen, it’s integrated deeply within the systems on which it runs, which means Apple Intelligence is accessible out of the box. Disabled people want to, and do, use AI too; Apple Intelligence gives them a baseline for inclusion. Like hearing aids in AirPods, this amounts to much more than a mere implementation detail.
Some weeks after Apple unveiled Apple Intelligence at WWDC back in June, I sat down virtually with Ariana Aboulafia of the Center for Democracy and Technology (CDT) and Bonnielin Swenor of the Johns Hopkins Disability Health Research Center to discuss the confluence of disability, bias, and AI. What both women said then resonates now in the context of Apple Intelligence’s first wave unleashing unto the world.
Aboulafia, who leads disability rights at the CDT, explained the Center is, in a nutshell, “a rights-based tech policy organization [that] focuses on technologies and technology policies that affect civil rights and civil liberties in the digital age.” As to her work in particular, Aboulafia told me her job involves “providing a disability rights and disability justice lens to all the work that CDT does,” which of course includes technology. Her role touches things such as education, employment, voting rights, and more—including artificial intelligence. Of particular import to the work Aboulafia and team do daily is what she characterized as “how algorithmic bias in various contexts impacts people with disabilities.”
In late July, Aboulafia, alongside Miranda Bogen and Bonnielin Swenor, co-bylined a report which examined how to reduce bias in technology. As the trio of women write in the report’s introduction, when disabled people interact with technology, “there is a risk that they will face discriminatory impacts in several important and high-stakes contexts, like employment, benefits, and healthcare.” For instance, many automated systems used in gauging employment and/or governmental benefits oftentimes wield the sword known as algorithms to deprive disabled people of opportunity. The net result, according to the CDT’s report, is this algorithmic blade work “impacts the ability of those individuals to live independently.” As to Apple Intelligence, there is absolutely bias amongst the media and others in failing to sufficiently recognize—even at a bare-bones level—that however “late” Apple is to this AI shindig, the company is showing up stocked with plenty of useful party favors for legions of disabled people everywhere.
Swenor, founder and director of the aforementioned Disability Health Research Center, told me that while she and Aboulafia have had discussions aplenty on disability bias sheerly due to the nature of each other’s work, the topic of bias in AI, while “certainly elevating and important,” hasn’t been requisitely inclusive of disabled people. This lack of inclusion only stokes more fear in Swenor and Aboulafia because, in Swenor’s words, it will “deepen the inequities that people with disabilities face.” Conversations are constantly being had around bias, and the concerns are valid for people of all groups. According to Swenor, there’s more of a sense of urgency to advocate for people with disabilities partly because there’s such limited data on “potential bias in algorithms and AI approaches towards people with disabilities.” Suffice it to say, Swenor reiterated what’s written in the CDT’s report by telling me “that can have all sorts of implications.” In essence, the CDT’s report is a flashing neon sign not to exclude the disability community when developing new tech.
“[We wanted to] do whatever we can to make sure people with disabilities are part of these conversations [on AI and bias]. It’s a critical part of these conversations [and] advancing equity in these spaces,” Swenor said of the CDT report’s raison d’être. “Something in my Center we talk a lot about is how data oppression is real. What I mean is, the fact we don’t have data on people with disabilities means not using that data is really only deepening the inequities and the oppression [faced by] people with disabilities. It’s not happenstance—it’s reflective of societal views of disability that we’re not paying attention to this issue.”
It’s oppression—and it’s privilege to not need to think about it.
As for Apple Intelligence, being inattentive to disability vis-a-vis accessibility to instead push a juicy, fatalistic narrative about Apple playing catchup isn’t happenstance either. It’s a choice that, to borrow Swenor’s sentiments, is “reflective of societal views of disability.”
Aboulafia wholeheartedly agreed with Swenor’s comments, saying in regards to algorithmic harm towards disabled people that, in addressing it, it’s instructive to “look at traditional strategies disability rights and disability justice advocates have used to combat risks of harm and discrimination in the past.” She added she considers the CDT’s report a guide to “how to inclusively design algorithmic systems” while aggressively pushing the notion that “you can design systems that work for people with disabilities but also benefit folks without disabilities.”
“If we’re thinking about using inclusive design and how to inclusively design algorithmic systems, the way to do that is to ensure that data that’s used to design those algorithmic systems is more representative of people with disabilities from the beginning,” Aboulafia said.
Both women expressed the idea that there’s much more work to be done in equalizing things, especially in terms of accumulating data. Swenor said it’s an arduous, complicated process—one that undoubtedly will include members of the disability community. She noted there have been proposals made by the Census Bureau to “change the way they measure disability” with its surveys and other questions. For Swenor, the bottom line is “we need to invest in ways to improve those measures [and] better represent the breadth and depth of the disability population, including the different ways that people identify.” She went on to say disability is a “large umbrella” underneath which there are “undergirding ongoing conversations happening right now” spanning multiple fronts.
The sobering truth, Swenor told me, is the primary reason disability data is hard to come by is dollar signs. The so-called “disability data infrastructure” hasn’t improved largely because “this has not been an area of investment by federal research dollars [and] philanthropic dollars.” Money makes the world go ‘round, with Swenor telling me there isn’t enough of it to spur advancement in this key area. “[Funding] has been deprioritized, as it has for so many other areas of disability work and research,” she said. “We’re bearing the brunt of that right now.”
When asked about feedback, Swenor said it’s been encouraging. Many disability-oriented organizations and other community partners have been outspoken about the need to tackle disability bias in AI. She called those calls “motivating” and “full of potential.” For her part, Aboulafia said it’s important to “raise awareness of disability-specific issues.” Generally, people who work in (or cover, for that matter) tech are simply ignorant of these types of problems—especially as it relates to gathering data. One of the goals of the CDT’s report, she said, was to “provide recommendations, but also provide some insight as to what’s going on.”
Aboulafia’s point about awareness is well-taken. With a $3 trillion market cap, Apple is the furthest thing from a philanthropic outfit there could possibly be. If you want to get down to brass tacks, Apple Intelligence mainly exists, like the App Store, et al, to drive sales of Apple products because the value proposition is thereby increased.
There’s much one can criticize Apple for—even in accessibility, I’ve been known to knock the company from its industry-leading pedestal on numerous occasions—but as I’ve written many times, accessibility is neither part of “the bloody ROI” nor an empty bromide at Apple. Every executive there, from Tim Cook to Phil Schiller to Greg Joswiak to Alan Dye and others, has told me so to my face. The salient point is the work they do to empower disabled people, myself included, is deeply profound. Apple Intelligence will be no different in the years to come, which is why it’s so frustrating to see ostensibly smart people peddle snake oil over how Apple is behind in artificial intelligence and, as a result, there isn’t much value in this initial batch of fancy new features.
Looking towards the future, Aboulafia and Swenor both pledged their commitment to keep pushing forward in their respective work. Swenor said the work in AI nowadays really is about “training the next generation” and added the disability community must be part of that process. The reason disability bias has been absent from conversations on bias is precisely because it “hasn’t been on anyone’s radar”—it hasn’t been taught in any curriculum, hasn’t been part of the data on which LLMs are trained, and so on. Swenor believes the work she’s done puts her and Aboulafia “on the leading edge” in the type of work they so tirelessly do.
“There will be some discomfort... there will be a need to hold space to allow people to have conversations,” Swenor said of assuming the pole position. “Ariana and CDT are doing such a great job of leading that, but that’s what it’s going to take in the future to drive this change.”
Aboulafia cited the disability community’s motto of “nothing about us without us.” Disabled people surely will have a say in the AI matter.
“It’s really important to center people with disabilities in the innovation and deployment of technologies, but also of technology policies and understanding the ways those are different,” she said. “It’s ensuring that people with disabilities are centered in those conversations in both contexts. That’s equally important. I think awareness of the ways in which technologies impact people with disabilities is growing; hopefully we are contributing to that raising of awareness. As awareness grows, we’ll be able to see more solutions towards more equitable technologies.”
However Apple Intelligence ends up evolving, let its run-up offer two lessons. One, newsrooms desperately need to fortify their tech desks with regular accessibility coverage posthaste. Reporters needn’t be accessibility aesthetes to suggest Apple Intelligence can be good. Two, this story hammers home how disability awareness, in artificial intelligence or otherwise, isn’t for the weak—it’s downright Sisyphean.
Steven is a freelance tech journalist covering accessibility and assistive technologies, and is based in San Francisco. His work has
...