Costco now supports Apple Pay across all of its US stores

Apple has landed a big new partner for Apple Pay in the U.S. after Costco began accepting the mobile payment service across its 750 stores. The retailer plans to add support at its gas stations, but that rollout isn’t yet complete.

The rollout — first reported by MacRumors — follows limited trials at selected Costco outlets, including a warehouse near its corporate headquarters in Washington.

This new partnership comes hot on the heels of Apple landing similar deals with CVS and 7-Eleven. The deal with CVS is particularly notable since the retailer had held off on supporting the Apple service, to the point that it even developed its own barcode-based alternative. Apple also secured a deal this summer to add Apple Pay support to eBay, which gives it more breadth among online retailers, too.

The service is operational in 30 international markets and, in the U.S., it is tipped to account for half of all OEM-operated contactless payments by 2020, according to a recent analyst report.

The market for such services — which includes Samsung Pay, Google Pay and others — is tipped to reach 450 million consumers. Apple, though, is already seeing the benefits. Apple Pay is part of the company’s ‘services’ division, which recorded revenue of $9.6 billion in the last quarter, up 31 percent year-over-year.

Apple has removed Infowars podcasts from iTunes

Apple has followed the lead of Google and Facebook after it removed Infowars, the conspiracy theorist organization helmed by Alex Jones, from iTunes and its Podcasts app.

Unlike Google and Facebook, which removed four Infowars videos on the basis that the content violated their policies, Apple’s action is wider-reaching. The company has withdrawn all episodes of five of Infowars’ six podcasts from its directory of content, leaving just one: a show called ‘Real News With David Knight.’

The removals were first spotted on Twitter. Later, Apple confirmed it took action on account of the use of hate speech, which violates its content guidelines.

“Apple does not tolerate hate speech, and we have clear guidelines that creators and developers must follow to ensure we provide a safe environment for all of our users. Podcasts that violate these guidelines are removed from our directory making them no longer searchable or available for download or streaming. We believe in representing a wide range of views, so long as people are respectful to those with differing opinions,” a spokesperson told TechCrunch.

Apple’s action comes after fellow streaming services Spotify and Stitcher removed Infowars on account of its use of hate speech.

Jones has used Infowars, and by association the platforms of these media companies, to broadcast a range of conspiracy theories, including claims that 9/11 was an inside job and alternative theories about the San Bernardino shootings. In the case of another U.S. mass shooting, Sandy Hook, Jones and Infowars’ peddling of false information and hoax theories was so severe that some of the families of the deceased, who have been harassed online and faced death threats, have been forced to move multiple times. A group of those families is now suing Jones for defamation.

iFixit finds dust covers in latest MacBook Pro keyboard

Apple released a refreshed MacBook Pro this week, and top among the new features is a tweaked keyboard. Apple says it’s quieter than the last version and, in our tests, we agree. But iFixit found something else: thin, silicone barriers that could improve the keyboard’s reliability.

This is big news. Users have long reported that the butterfly-switch keyboards found in MacBook Pros are less reliable than past models. There are countless reports of dust, lint and crumbs causing keys to stick or fail. Personally, I have not had any issues, but many at TechCrunch have. To date, Apple has yet to issue a recall for the keyboard.

iFixit found a thin layer of rubberized material covering the new butterfly mechanism. The repair outlet also points to an Apple patent for this exact technology that’s designed to “prevent and/or alleviate contaminant ingress.”

According to Apple, the changes to the keyboard were designed to address the loud clickity-clack, not the keyboard’s tendency to get mucked up by dust. And that makes sense, too. If Apple said “We fixed the keyboards,” it would be admitting something was wrong with the keyboards. Instead, Apple said “We made the keyboards quieter,” admitting the past keyboards were loud, not faulty.

We just got our review unit and will report back on the keyboard’s reliability after a day or two at the beach. Because science.

InVision mobile app updates include studio features and desktop to mobile mirroring

InVision, the software-as-a-service challenger to Adobe’s design dominance, has just released a new version of its mobile app for iOS and is beta-testing new features for Android users as it tries to bring additional functionality to designers on the go.

The new app tools feature “studio mirroring” for reviewing new designs directly on mobile devices, so designers can see design changes made on the desktop appear on mobile in real time.

The mirroring feature works by scanning a QR code with a mobile device, which lets users view design changes and test user experiences immediately.

InVision is also bringing its Freehand support — which allows for collaborative commenting on design prototypes — to tablets so teams can comment on the fly, the company said.

The tools will give InVision another arrow in its quiver as it tries to take on other design platforms (notably the 800-pound gorilla known as Adobe) and are a useful addition to a service that’s trying to woo the notoriously fickle design community with an entire toolkit.

As we wrote in May when the company launched its app store:

While collaboration is the bread and butter of InVision’s business, and the only revenue stream for the company, CEO and founder Clark Valberg feels that it isn’t enough to be complementary to the current design tool ecosystem. Which is why InVision launched Studio in late 2017, hoping to take on Adobe and Sketch head-on with its own design tool.

Studio differentiates itself by focusing on the designer’s real-life workflow, which often involves mocking up designs in one app, pulling assets from another, working on animations and transitions in another, and then stitching the whole thing together to share for collaboration across InVision Cloud. Studio aims to bring all those various services into a single product, and a critical piece of that mission is building out an app store and asset store with the services too sticky for InVision to rebuild from scratch, such as Slack or Atlassian.

Apple’s Shortcuts will flip the switch on Siri’s potential

At WWDC, Apple pitched Shortcuts as a way to “take advantage of the power of apps” and “expose quick actions to Siri.” These will be suggested by the OS, can be given unique voice commands, and will even be customizable with a dedicated Shortcuts app.

But since this new feature won’t let Siri interpret everything, many have been lamenting that Siri didn’t get much better — and is still lacking compared to Google Assistant or Amazon Echo.

But to ignore Shortcuts would be missing out on the bigger picture. Apple’s strengths have always been the device ecosystem and the apps that run on them.

With Shortcuts, both play a major role in how Siri will prove to be a truly useful assistant and not just a digital voice to talk to.

Your Apple devices just got better

For many, voice assistants are a nice-to-have, but not a need-to-have.

It’s undeniably convenient to get facts by speaking to the air, turning on the lights without lifting a finger, or triggering a timer or text message – but so far, studies have shown people don’t use much more than these on a regular basis.

People don’t often do more than that because the assistants aren’t really ready for complex tasks yet, and when your assistant is limited to tasks inside your home or commands spoken into your phone, the drawbacks prevent you from going deep.

If you prefer Alexa, you get more devices, better reliability, and a breadth of skills, but there’s not a great phone or tablet experience you can use alongside your Echo. If you prefer to have Google’s Assistant everywhere, you must be all in on the Android and Home ecosystem to get the full experience too.

Plus, with either option, there are privacy concerns baked into how both work on a fundamental level – over the web.

In Apple’s ecosystem, you have Siri on iPhone, iPad, Apple Watch, AirPods, HomePod, CarPlay, and any Mac. Add in Shortcuts on each of those devices (except the Mac, which still has Automator) and suddenly you have a plethora of places to execute all of your commands entirely by voice.

Each accessory that Apple users own will get upgraded, giving Siri new ways to fulfill the 10 billion and counting requests people make each month (according to Craig Federighi’s statement on-stage at WWDC).

But even more important than all the places where you can use your assistant is how – with Shortcuts, Siri gets even better with each new app that people download. There’s the other key difference: the App Store.

Actions are the most important part of your apps

iOS has always had a vibrant community of developers who create powerful, top-notch applications that push the system to its limits and take advantage of the ever-increasing power these mobile devices have.

Shortcuts opens up those capabilities to Siri – every action you take in an app can be shared out with Siri, letting people interact right there inline or using only their voice, with the app running everything smoothly in the background.
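To make that concrete, here is a minimal sketch of how an app might donate one of its actions to Siri via the NSUserActivity route Apple described at WWDC. The activity type, strings and function name are hypothetical placeholders, not anything from a shipping app:

```swift
import UIKit
import Intents   // adds the Siri-related NSUserActivity properties (iOS 12+)

// A minimal sketch of donating an in-app action so iOS can suggest it and the
// user can attach a voice phrase. A real app would also list the activity type
// under NSUserActivityTypes in its Info.plist.
func donateOrderCoffeeActivity(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.order-usual")
    activity.title = "Order my usual latte"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true              // allow Siri suggestions / Shortcuts
    activity.suggestedInvocationPhrase = "Coffee time"   // hint shown when the user records a phrase
    viewController.userActivity = activity
    activity.becomeCurrent()                             // donate: "the user just did this here"
}
```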

Plus, the functional approach that Apple is taking with Siri creates new opportunities for developers to provide utility to people instead of requiring their attention. The suggestions feature of Shortcuts rewards “acceleration,” surfacing more often the apps that save people the most time and get the most use.

This opens the door to more specialized types of apps that don’t necessarily have to grow a huge audience and serve them ads – if you can make something that helps people, Shortcuts can help them use your app more than ever before (and without as much effort). Developers can make a great experience for when people visit the app, but also focus on actually doing something useful too.

This isn’t a virtual assistant that lives in the cloud, but a digital helper that can pair up with the apps uniquely taking advantage of Apple’s hardware and software capabilities to truly improve your use of the device.

In the most groan-inducing way possible, “there’s an app for that” is back and more important than ever. Not only are apps the centerpiece of the Siri experience, but it’s their capabilities that extend Siri’s – the better the apps you have, the better Siri can be.

Control is at your fingertips

Importantly, Siri gets all of this Shortcuts power while keeping the control in each person’s hands.

All of the information provided to the system is securely passed along by individual apps – if something doesn’t look right, you can just delete the corresponding app and the information is gone.

Siri will make recommendations based on activities deemed relevant by the apps themselves as well, so over-active suggestions shouldn’t be common (unless you’re way too active in some apps, in which case they added Screen Time for you too).

Each voice command is custom per user as well, so people can ignore their apps’ suggestions and set up phrases to their own liking. This means nothing is already “taken” because somebody signed up for the skill first (unless you’ve already used it yourself, of course).
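That recording flow is something apps can offer directly. A minimal sketch, assuming the donated activity from the earlier example and a hypothetical presenting view controller (a production version would also set the sheet’s delegate and dismiss it in the delegate callbacks):

```swift
import UIKit
import IntentsUI

// Present the system "Add to Siri" sheet so the user can record whatever
// trigger phrase they like for this shortcut.
func presentAddToSiri(for activity: NSUserActivity, from presenter: UIViewController) {
    let shortcut = INShortcut(userActivity: activity)
    let addVoiceShortcut = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    presenter.present(addVoiceShortcut, animated: true)
}
```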

Also, Shortcuts don’t require the web to work – the voice triggers might not work without a connection, but the suggestions and the Shortcuts app give you a place to use your assistant voicelessly. And importantly, Shortcuts can use the full power of the web when they need to.

This user-centric approach paired with the technical aspects of how Shortcuts works gives Apple’s assistant a leg up for any consumers who find privacy important. Essentially, Apple devices are only listening for “Hey Siri”, then the available Siri domains + your own custom trigger phrases.

Without exposing your information to the world or teaching a robot to understand everything, Apple gave Siri a slew of capabilities that in many ways can’t be matched. With Shortcuts, it’s the apps, the operating system, and the variety of hardware that will make Siri uniquely qualified come this fall.

Plus, the Shortcuts app will provide a deeper experience for those who want to chain together actions and customize their own shortcuts.

There’s lots more under the hood to experiment with, but this will allow anyone to tweak & prod their Siri commands until they have a small army of custom assistant tasks at the ready.

Hey Siri, let’s get started

Siri doesn’t know all, can’t perform every task you bestow upon it, and won’t make somewhat uncanny phone calls on your behalf.

But instead of spending time conversing with a somewhat faked “artificial intelligence”, Shortcuts will help people use Siri as an actual digital assistant – a computer to help them get things done better than they might’ve otherwise.

With Siri’s new skills extending to each of your Apple products (except for Apple TV and the Mac, but maybe one day?), every new device you get and every new app you download can reveal another way to take advantage of what this technology can offer.

This broadening of Siri may take some time to get used to – it will be about finding the right place for it in your life.

As you go about your apps, you’ll start seeing and using suggestions. You’ll set up a few voice commands, then you’ll do something like kick off a truly useful shortcut from your Apple Watch without your phone connected and you’ll realize the potential.

This is a real digital assistant, your apps know how to work with it, and it’s already on many of your Apple devices. Now, it’s time to actually make use of it.

Apple is rebuilding Maps from the ground up

I’m not sure if you’re aware, but the launch of Apple Maps went poorly. After a rough first impression, an apology from the CEO, several years of patching holes with data partnerships and some glimmers of light with long-awaited transit directions and improvements in business, parking and place data, Apple Maps is still not where it needs to be to be considered a world-class service.

Maps needs fixing.

Apple, it turns out, is aware of this, so it’s re-building the maps part of Maps.

It’s doing this by using first-party data gathered by iPhones with a privacy-first methodology and its own fleet of cars packed with sensors and cameras. The new product will launch in San Francisco and the Bay Area with the next iOS 12 beta and will cover Northern California by fall.

Every version of iOS will get the updated maps eventually, and they will be more responsive to changes in roadways and construction, more visually rich depending on the specific context they’re viewed in and feature more detailed ground cover, foliage, pools, pedestrian pathways and more.

This is nothing less than a full reset of Maps, and it’s been four years in the making; that’s when Apple began developing its new data-gathering systems. Eventually, Apple will no longer rely on third-party data to provide the basis for its maps, which has been one of its major pitfalls from the beginning.

“Since we introduced this six years ago — we won’t rehash all the issues we’ve had when we introduced it — we’ve done a huge investment in getting the map up to par,” says Apple SVP Eddy Cue, who now owns Maps, in an interview last week. “When we launched, a lot of it was all about directions and getting to a certain place. Finding the place and getting directions to that place. We’ve done a huge investment of making millions of changes, adding millions of locations, updating the map and changing the map more frequently. All of those things over the past six years.”

But, Cue says, Apple has room to improve on the quality of Maps, something that most users would agree on, even with recent advancements.

“We wanted to take this to the next level,” says Cue. “We have been working on trying to create what we hope is going to be the best map app in the world, taking it to the next step. That is building all of our own map data from the ground up.”

In addition to Cue, I spoke to Apple VP Patrice Gautier and more than a dozen Apple Maps team members at its mapping headquarters in California this week about its efforts to re-build Maps, and to do it in a way that aligned with Apple’s very public stance on user privacy.

If, like me, you’re wondering whether Apple thought of building its own maps from scratch before it launched Maps, the answer is yes. At the time, there was a choice to be made about whether or not it wanted to be in the business of maps at all. Given that the future of mobile devices was becoming very clear, it knew that mapping would be at the core of nearly every aspect of its devices, from photos to directions to location services provided to apps. Decision made, Apple plowed ahead, building a product that relied on a patchwork of data from partners like TomTom, OpenStreetMap and other geo data brokers. The result was underwhelming.

Almost immediately after Apple launched Maps, it realized that it was going to need help and it signed on a bunch of additional data providers to fill the gaps in location, base map, point-of-interest and business data.

It wasn’t enough.

“We decided to do this just over four years ago. We said, ‘Where do we want to take Maps? What are the things that we want to do in Maps?’ We realized that, given what we wanted to do and where we wanted to take it, we needed to do this ourselves,” says Cue.

Because Maps are so core to so many functions, success wasn’t tied to just one function. Maps needed to be great at transit, driving and walking — but also as a utility used by apps for location services and other functions.

Cue says that Apple needed to own all of the data that goes into making a map, and to control it from a quality as well as a privacy perspective.

There’s also the matter of corrections, updates and changes entering a long loop of submission to validation to update when you’re dealing with external partners. The Maps team would have to be able to correct roads, pathways and other updating features in days or less, not months. Not to mention the potential competitive advantages it could gain from building and updating traffic data from hundreds of millions of iPhones, rather than relying on partner data.

Cue points to the proliferation of devices running iOS, now over a billion, as a deciding factor to shift its process.

“We felt like because the shift to devices had happened — building a map today in the way that we were traditionally doing it, the way that it was being done — we could improve things significantly, and improve them in different ways,” he says. “One is more accuracy. Two is being able to update the map faster based on the data and the things that we’re seeing, as opposed to driving again or getting the information where the customer’s proactively telling us. What if we could actually see it before all of those things?”

I query him on the rapidity of Maps updates, and whether this new map philosophy means faster changes for users.

“The truth is that Maps needs to be [updated more], and even are today,” says Cue. “We’ll be doing this even more with our new maps, [with] the ability to change the map in real time and often. We do that every day today. This is expanding us to allow us to do it across everything in the map. Today, there’s certain things that take longer to change.

“For example, a road network is something that takes a much longer time to change currently. In the new map infrastructure, we can change that relatively quickly. If a new road opens up, immediately we can see that and make that change very, very quickly around it. It’s much, much more rapid to do changes in the new map environment.”

So a new effort was created to begin generating its own base maps, the very lowest building block of any really good mapping system. After that, Apple would begin layering on living location data, high-resolution satellite imagery and brand new intensely high-resolution image data gathered from its ground cars until it had what it felt was a “best in class” mapping product.

There is only really one big company on earth that owns an entire map stack from the ground up: Google.

Apple knew it needed to be the other one. Enter the vans.

Apple vans spotted

Though the overall project started earlier, the first glimpse most folks had of Apple’s renewed efforts to build the best Maps product was the vans that started appearing on the roads in 2015 with “Apple Maps” signs on the side. Capped with sensors and cameras, these vans popped up in various cities and sparked rampant discussion and speculation.

The new Apple Maps will be the first time the data collected by these vans is actually used to construct and inform its maps. This is their coming out party.

Some people have commented that Apple’s rigs look more robust than the simple GPS + Camera arrangements on other mapping vehicles — going so far as to say they look more along the lines of something that could be used in autonomous vehicle training.

Apple isn’t commenting on autonomous vehicles, but there’s a reason the arrays look more advanced: they are.

Earlier this week I took a ride in one of the vans as it ran a sample route to gather the kind of data that would go into building the new maps. Here’s what’s inside.

In addition to a beefed-up GPS rig on the roof, four LiDAR arrays mounted at the corners and eight cameras shooting overlapping high-resolution images, there’s also the standard physical measuring tool attached to a rear wheel that allows for precise tracking of distance and image capture. In the rear there is a surprising lack of bulky equipment. Instead, it’s a straightforward Mac Pro bolted to the floor, attached to an array of solid state drives for storage. A single USB cable routes up to the dashboard where the actual mapping-capture software runs on an iPad.

While mapping, a driver…drives, while an operator takes care of the route, ensuring that a coverage area that has been assigned is fully driven, as well as monitoring image capture. Each drive captures thousands of images as well as a full point cloud (a 3D map of space defined by dots that represent surfaces) and GPS data. I later got to view the raw data presented in 3D and it absolutely looks like the quality of data you would need to begin training autonomous vehicles.

More on why Apple needs this level of data detail later.

When the images and data are captured, they are encrypted on the fly and recorded onto the SSDs. Once full, the SSDs are pulled out, replaced and packed into a case, which is delivered to Apple’s data center, where a suite of software scrubs private information like faces and license plates from the images. From the moment of capture to the moment they’re sanitized, they are encrypted, with one key in the van and the other key in the data center. Technicians and software that are part of its mapping efforts down the pipeline from there never see unsanitized data.
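Apple hasn’t described the exact scheme, but the “one key in the van, the other key in the data center” arrangement maps onto ordinary hybrid public-key encryption: the van only ever holds the data center’s public key, so nothing it records can be read in the field. A minimal illustrative sketch in Swift using CryptoKit, with all names and data hypothetical:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch: the data center publishes a public key; the van seals
// each captured frame so only the data center's private key can open it.
func sealCapturedFrame(_ frame: Data,
                       for dataCenterPublicKey: Curve25519.KeyAgreement.PublicKey)
    throws -> (ciphertext: Data, ephemeralPublicKey: Data) {
    // Ephemeral key pair generated in the van for this frame.
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let sharedSecret = try ephemeral.sharedSecretFromKeyAgreement(with: dataCenterPublicKey)
    // Derive a symmetric key from the agreement and seal the frame with AES-GCM.
    let key = sharedSecret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                   salt: Data(),
                                                   sharedInfo: Data(),
                                                   outputByteCount: 32)
    let sealed = try AES.GCM.seal(frame, using: key)
    // Ship the ciphertext plus the ephemeral public key; the van never holds
    // the data center's private key, so it cannot decrypt what it recorded.
    return (sealed.combined!, ephemeral.publicKey.rawRepresentation)  // combined is non-nil for the standard nonce
}
```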

This is just one element of Apple’s focus on the privacy of the data it is utilizing in New Maps.

Probe data and privacy

In every conversation I have with members of the team throughout the day, privacy is brought up and emphasized. This is obviously by design, as Apple wants to impress upon me as a journalist that it’s taking this very seriously indeed, but it doesn’t change the fact that it’s evidently built in from the ground up, and I could not find a false note in any of the technical claims or the conversations I had.

Indeed, from the data security folks to the people whose job it is to actually make the maps work well, the constant refrain is that Apple does not feel that it is being held back in any way by not hoovering every piece of customer-rich data it can, storing and parsing it.

The consistent message is that the team feels it can deliver a high-quality navigation, location and mapping product without the directly personal data used by other platforms.

“We specifically don’t collect data, even from point A to point B,” notes Cue. “We collect data — when we do it — in an anonymous fashion, in subsections of the whole, so we couldn’t even say that there is a person that went from point A to point B. We’re collecting the segments of it. As you can imagine, that’s always been a key part of doing this. Honestly, we don’t think it buys us anything [to collect more]. We’re not losing any features or capabilities by doing this.”

The segments that he is referring to are sliced out of any given person’s navigation session. Neither the beginning nor the end of any trip is ever transmitted to Apple. Rotating identifiers, not personal information, are assigned to any data or requests sent to Apple, and it augments the “ground truth” data provided by its own mapping vehicles with this “probe data” sent back from iPhones.

Because only random segments of any person’s drive are ever sent, and that data is completely anonymized, there is no way to tie any trip to a single individual. The local system signs the IDs and only it knows to whom that ID refers. Apple is working very hard here to not know anything about its users. This kind of privacy can’t be added on at the end; it has to be woven in at the ground level.

Because Apple’s business model does not rely on serving you, say, an ad for a Chevron on your route, it doesn’t even need to tie advertising identifiers to users.

Any personalization or Siri requests are all handled on-board by the iOS device’s processor. So if you get a drive notification that tells you it’s time to leave for your commute, that’s learned, remembered and delivered locally, not from Apple’s servers.

That’s not new, but it’s important to note given the new thing to take away here: Apple is flipping on the power of having millions of iPhones passively and actively improving their mapping data in real time.

In short: Traffic, real-time road conditions, road systems, new construction and changes in pedestrian walkways are about to get a lot better in Apple Maps.

The secret sauce here is what Apple calls probe data: essentially, little slices of vector data representing direction and speed, transmitted back to Apple completely anonymized, with no way to tie them to a specific user or even to any given trip. It’s reaching in and sipping a tiny amount of data from millions of users instead, giving it a holistic, real-time picture without compromising user privacy.
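Apple hasn’t published the format, but the trimmed-and-rotated segments Cue describes are easy to picture. A toy sketch of that slicing, with every structure, name and threshold invented for the example:

```swift
import Foundation

// Hypothetical shape of an anonymized probe segment: heading and speed for a
// short slice of road, tagged with a rotating identifier rather than a user ID.
struct ProbeSegment {
    let segmentID: UUID            // rotating identifier, new for each slice
    let heading: Double            // degrees
    let speedMetersPerSecond: Double
}

// Drop the beginning and end of a trip, then slice the middle into short runs,
// each with its own identifier so no two slices can be linked into one journey.
func anonymize(trip: [(heading: Double, speed: Double)],
               trimCount: Int = 10,
               sliceLength: Int = 20) -> [[ProbeSegment]] {
    guard trip.count > 2 * trimCount else { return [] }
    let middle = trip.dropFirst(trimCount).dropLast(trimCount)
    var slices: [[ProbeSegment]] = []
    var current: [ProbeSegment] = []
    var sliceID = UUID()
    for sample in middle {
        current.append(ProbeSegment(segmentID: sliceID,
                                    heading: sample.heading,
                                    speedMetersPerSecond: sample.speed))
        if current.count == sliceLength {
            slices.append(current)
            current = []
            sliceID = UUID()       // rotate the identifier for the next slice
        }
    }
    if !current.isEmpty { slices.append(current) }
    return slices
}
```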

If you’re driving, walking or cycling, your iPhone can already tell this. Now if it knows you’re driving, it also can send relevant traffic and routing data in these anonymous slivers to improve the entire service. This only happens if your Maps app has been active, say you check the map, look for directions, etc. If you’re actively using your GPS for walking or driving, then the updates are more precise and can help with walking improvements like charting new pedestrian paths through parks — building out the map’s overall quality.

All of this, of course, is governed by whether you opted into location services, and can be toggled off using the Maps location toggle in the Privacy section of Settings.

Apple says that this will have a near-zero effect on battery life or data usage, because you’re already using the Maps features when any probe data is shared, and it’s a fraction of the power being drawn by those activities.

From the point cloud on up

But maps cannot live on ground truth and mobile data alone. Apple is also gathering new high-resolution satellite data to combine with its ground truth data for a solid base map. It’s then layering that imagery on top to better determine foliage, pathways, sports facilities and building shapes.

After the downstream data has been cleaned up of license plates and faces, it gets run through a bunch of computer vision programming to pull out addresses, street signs and other points of interest. These are cross-referenced with publicly available data, like addresses held by the city and new construction of neighborhoods or roadways that comes from city planning departments.

But one of the special-sauce bits that Apple is adding to the mix of mapping tools is a full-on point cloud that maps the world around the mapping van in 3D. This gives Apple all kinds of opportunities to better understand which items are street signs (retro-reflective rectangular object about 15 feet off the ground? Probably a street sign), stop signs or speed limit signs.
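That parenthetical is essentially a classification rule over clustered point-cloud objects. Purely as an illustration of the idea (not Apple’s pipeline), a toy heuristic might look like this, with every field and threshold invented for the example:

```swift
// Hypothetical, simplified representation of a clustered point-cloud object.
struct PointCloudObject {
    let widthMeters: Double
    let heightMeters: Double
    let elevationMeters: Double   // height of the object's center above the road surface
    let reflectivity: Double      // 0...1, LiDAR return intensity
}

enum RoadFeature {
    case streetSign, unknown
}

// Toy heuristic in the spirit of the article's example: a retro-reflective,
// roughly rectangular object a few meters above the ground is probably a sign.
func classify(_ object: PointCloudObject) -> RoadFeature {
    let roughlyRectangular = (0.3...2.0).contains(object.widthMeters) &&
                             (0.3...1.5).contains(object.heightMeters)
    let elevated = (2.0...6.0).contains(object.elevationMeters)
    let retroReflective = object.reflectivity > 0.8
    return (roughlyRectangular && elevated && retroReflective) ? .streetSign : .unknown
}
```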

It seems like it also could enable positioning of navigation arrows in 3D space for AR navigation, but Apple declined to comment on “any future plans” for such things.

Apple also uses semantic segmentation and Deep Lambertian Networks to analyze the point cloud coupled with the image data captured by the car and from high-resolution satellites in sync. This allows 3D identification of objects, signs, lanes of traffic and buildings and separation into categories that can be highlighted for easy discovery.

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher-resolution and easier to see, visually. And it’s synchronized with the “panoramic” images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to “see” through brush or tree cover that would normally obscure roads, buildings and addresses.

This is hugely important when it comes to the next step in Apple’s battle for supremely accurate and useful Maps: human editors.

Apple has had a team of tool builders working specifically on a toolkit that can be used by human editors to vet and parse data, street by street. The editor’s suite includes tools that allow human editors to assign specific geometries to flyover buildings (think Salesforce tower’s unique ridged dome) that allow them to be instantly recognizable. It lets editors look at real images of street signs shot by the car right next to 3D reconstructions of the scene and computer vision detection of the same signs, instantly recognizing them as accurate or not.

Another tool corrects addresses, letting an editor determine whether an address is misplaced and quickly shift it to the center of the right building. It also allows for access points to be set, making Apple Maps smarter about the “last 50 feet” of your journey. You’ve made it to the building, but what street is the entrance actually on? And how do you get into the driveway? With a couple of clicks, an editor can make that permanently visible.

“When we take you to a business and that business exists, we think the precision of where we’re taking you to, from being in the right building,” says Cue. “When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address name is a certain street, but really, the entrance in the building is on another street. They’ve done that because they want the better street name. Those are the kinds of things that our new Maps really is going to shine on. We’re going to make sure that we’re taking you to exactly the right place, not a place that might be really close by.”

Water, swimming pools (new to Maps entirely), sporting areas and vegetation are now more prominent and fleshed out thanks to new computer vision and satellite imagery applications. So Apple had to build editing tools for those, as well.

Many hundreds of editors will be using these tools, in addition to the thousands of employees Apple already has working on maps, but the tools had to be built first, now that Apple is no longer relying on third parties to vet and correct issues.

And the team also had to build computer vision and machine learning tools that allow it to determine whether there are issues to be found at all.

Anonymous probe data from iPhones, visualized, looks like thousands of dots, ebbing and flowing across a web of streets and walkways, like a luminescent web of color. At first, chaos. Then, patterns emerge. A street opens for business, and nearby vessels pump orange blood into the new artery. A flag is triggered and an editor looks to see if a new road needs a name assigned.

A new intersection is added to the web and an editor is flagged to make sure that the left turn lanes connect correctly across the overlapping layers of directional traffic. This has the added benefit of massively improved lane guidance in the new Apple Maps.

Apple is counting on this combination of human and AI flagging to allow editors to first craft base maps and then also maintain them as the ever-changing biomass wreaks havoc on roadways, addresses and the occasional park.

Here there be Helvetica

Apple’s new Maps, like many other digital maps, display vastly differently depending on scale. If you’re zoomed out, you get less detail. If you zoom in, you get more. But Apple has a team of cartographers on staff that work on more cultural, regional and artistic levels to ensure that its Maps are readable, recognizable and useful.

These teams have goals that are at once concrete and a bit out there — in the best traditions of Apple pursuits that intersect the technical with the artistic.

The maps need to be usable, but they also need to fulfill cognitive goals on cultural levels that go beyond what any given user might know they need. For instance, in the U.S., it is very common to have maps that have a relatively low level of detail even at a medium zoom. In Japan, however, the maps are absolutely packed with details at the same zoom, because that increased information density is what is expected by users.

This is the department of details. They’ve reconstructed replicas of hundreds of actual road signs to make sure that the shield on your navigation screen matches the one you’re seeing on the highway road sign. When it comes to public transport, Apple licensed all of the typefaces that you see on your favorite subway systems, like Helvetica for NYC. And the line numbers are in the exact same order that you’re going to see them on the platform signs.

It’s all about reducing the cognitive load that it takes to translate the physical world you have to navigate into the digital world represented by Maps.

Bottom line

The new version of Apple Maps will be in preview next week with just the Bay Area of California going live. It will be stitched seamlessly into the “current” version of Maps, but the difference in quality level should be immediately visible based on what I’ve seen so far.

Expect better road networks, more pedestrian information, sports areas like baseball diamonds and basketball courts, more land cover (including grass and trees) represented on the map, and buildings, building shapes and sizes that are more accurate. It’s a map that feels more like the real world you’re actually traveling through.

Search is also being revamped to make sure that you get more relevant results (on the correct continents) than ever before. Navigation, especially pedestrian guidance, also gets a big boost. Parking areas and building details to get you the last few feet to your destination are included, as well.

What you won’t see, for now, is a full visual redesign.

“You’re not going to see huge design changes on the maps,” says Cue. “We don’t want to combine those two things at the same time because it would cause a lot of confusion.”

Apple Maps is getting the long-awaited attention it really deserves. By taking ownership of the project fully, Apple is committing itself to actually creating the map that users expected of it from the beginning. It’s been a lingering shadow on iPhones, especially, where alternatives like Google Maps have offered more robust feature sets that are so easy to compare against the native app but impossible to access at the deep system level.

The argument has been made ad nauseam, but it’s worth saying again that if Apple thinks that mapping is important enough to own, it should own it. And that’s what it’s trying to do now.

“We don’t think there’s anybody doing this level of work that we’re doing,” adds Cue. “We haven’t announced this. We haven’t told anybody about this. It’s one of those things that we’ve been able to keep pretty much a secret. Nobody really knows about it. We’re excited to get it out there. Over the next year, we’ll be rolling it out, section by section in the U.S.”

For Apple, this year’s Global Accessibility Awareness Day is all about education

Following Apple’s education event in Chicago in March, I wrote about what the company’s announcements might mean for accessibility. After sitting in the audience covering the event, the big takeaway I had was Apple could “make serious inroads in furthering special education as well.” As I wrote, despite how well-designed the Classroom and Schoolwork apps seemingly are, Apple should do more to tailor their new tools to better serve students and educators in special education settings. After all, accessibility and special education are inextricably tied.

It turns out, Apple has, unsurprisingly, considered this.

“In many ways, education and accessibility beautifully overlap,” Sarah Herrlinger, Apple’s Senior Director of Global Accessibility Policy and Initiatives, said to me. “For us, the concept of differentiated learning and how the accessibility tools that we build in [to the products] help make that [learning] possible is really important to us.”

Apple’s philosophy toward accessibility and education isn’t about purposely targeting esoteric use cases such as IEP prep or specialized teaching methodologies.

In fact, Apple says there are many apps on the iOS App Store which do just that. The company instead believes special education students and teachers themselves should take the tools as they are and discover creative uses for them. Apple encourages those in schools to take the all-new, low-cost iPad and the new software and make them into the tools they need to teach and learn. It’s a sentiment that hearkens back to how Steve Jobs pitched the original iPad: it’s a slab of metal and glass that can be whatever you wish it to be.

In other words, it’s Apple’s customers who put the ‘I’ in iPad.

In hindsight, Apple’s viewpoint for how they support special education makes total sense if you understand their ethos. Tim Cook often talks about building products that enrich people’s lives — in an education and accessibility context, this sentiment often becomes a literal truism. For many disabled people, iOS and the iPad are the conduit through which they access the world.

Apple ultimately owns the iPad and the message around it, but in actuality it’s the users who really transform it and give it its identity. This is ultimately what makes the tablet exceptional for learning. The device’s design is so inherently accessible that anyone, regardless of ability, can pick it up and go wild.


Apple’s education team is special

At the March event, one of the onstage presenters was Kathleen Richardson, who works at Apple on its ConnectED program. She is one of many on the company’s education team, a group tasked with working with schools and districts to evangelize and integrate Apple products into their curricula.

I spoke with Meg Wilson, a former special education teacher who now works on education efforts inside Apple. A former Apple Distinguished Educator, Wilson is the resident “special education guru” who provides insight into how special education programs generally run. With that knowledge, she provides guidance on how Apple products can augment the process of individualizing and differentiating educational plans for special ed students.

A focus of our discussion was the Schoolwork app and how it could be used to suit the needs of teachers and support staff. One example Wilson cited was that of a speech therapy session, where a speech pathologist could use Schoolwork not necessarily for handouts, but for monitoring students’ progress toward IEP goals. Instead of the app showing a worksheet for the student to complete, it could show a data-tracking document for the therapist, who is recording info during lessons. “What we need in special ed is data — we need data,” Wilson said. She added Schoolwork can be used to “actually see the progress” students are making right from an iPad without mountains of paper. A key element to this, according to Wilson, is Schoolwork’s ability to modernize and streamline sharing. It makes conferring with other members of the IEP team a more continuous, dynamic endeavor. Rather than everyone convening once a year for an annual review of students’ progress, Wilson said, Schoolwork allows for “an amazing opportunity for collaboration amongst service providers.”

Wilson also emphasized the overarching theme of personalizing the iPad to suit the needs of teacher and student. “When you are creative with technology, you change people’s lives,” she said.

To her, the iPad and, especially, the new software scale really well for different learners and different environments. For special educators, for instance, Wilson said it’s easy to add one’s entire caseload to Schoolwork and have progress reports at the ready anytime. Likewise, the ability in Classroom to “lock” an entire class (or a single student) into an activity on an iPad, which takes its cues from iOS’s Guided Access feature, helps teachers ensure students stay engaged and on task during class. And the intuitive nature of the iPad means students can instantly share their work with teachers.

But it isn’t only Apple who is changing education. Wilson made the case repeatedly that third-party developers are also making Apple’s solutions for education more compelling. She stressed there are many apps on the App Store that can help in special education settings (IEP prep, communication boards, etc.), and that Apple hears from developers who want to learn about accessibility and, crucially, how to make their apps accessible to all by supporting the discrete Accessibility features. Wilson shared an anecdote of an eye-opening experience for one developer, who expressed the idea of supporting accessibility “didn’t even occur to him,” but doing so made his app better.
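On iOS, supporting those accessibility features often starts with something as small as labeling a custom control so VoiceOver and Switch Control can describe and operate it. A minimal sketch, with the control, image name and strings all hypothetical:

```swift
import UIKit

// Label a custom control so VoiceOver can announce it and Switch Control can
// activate it. Standard UIKit controls get much of this for free; fully custom
// views are where explicit labels matter most.
let playButton = UIButton(type: .custom)
playButton.setImage(UIImage(named: "waveform-icon"), for: .normal)
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play recording"
playButton.accessibilityHint = "Plays back the student's latest recording"
playButton.accessibilityTraits = .button
```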

One “big idea” that struck me from meeting with Wilson was how diverse Apple’s workforce truly is. Wilson is a former special education teacher. Apple’s health and fitness team reportedly is made up of such medical professionals as doctors and nurses. Apple’s education team is no different, as my conversation with Wilson attested. It’s notable how Apple brings together so many, from all walks of life, to help inform as they build these products. It really does intersect liberal arts with technology.

Apple makes learning code accessible to all

In early March, Lori Hawkins at the Austin American-Statesman reported on how Apple has made its Everyone Can Code program accessible to all. Hawkins wrote that representatives from Apple visited Austin’s Texas School for the Blind and Visually Impaired to teach students to fly drones with code written in the Swift Playgrounds app. As you’d expect, Swift Playgrounds is fully compatible with VoiceOver and even Switch Control. “When we said everyone should be able to code, we really meant everyone,” Herrlinger told the Statesman. “Hopefully these kids will leave this session and continue coding for a long time. Maybe it can inspire where their careers can go.” Herrlinger also appeared on a panel at the SXSW festival, where she and others discussed coding and accessibility pertaining to Everyone Can Code.

For Global Accessibility Awareness Day this year, Apple has announced that a slew of special education schools are adopting Everyone Can Code into their curricula. In a press release, the company says they “collaborated with engineers, educators, and programmers from various accessibility communities to make Everyone Can Code as accessible as possible.” They also note there are “additional tools and resources” which should aid non-visual learners to better understand coding environments.

In addition to the Texas School for the Blind and Visually Impaired in Austin, Apple says there are seven other institutions across the country that are implementing the Everyone Can Code curriculum. Among them are two Bay Area schools: the Northern California campuses of the California School for the Blind and the California School for the Deaf, both located in Fremont.

At a special kick-off event at CSD, students were visited by Apple employees — which included CEO Tim Cook — who came to the school to officially announce CSB and CSD’s participation in the Everyone Can Code program.

Students arrived at the school’s media lab for what they believed to be simply another day of coding. In reality, they were in for a surprise as Tim Cook made his appearance. Members of Apple’s Accessibility team walked students through controlling drones and robots in Swift Playgrounds on an iPad. Cook — along with deaf activist and actor Nyle DiMarco — toured the room to visit with students and have them show off their work.

In an address to students, Cook said, “We are so happy to be here to kick off the Everyone Can Code curriculum with you. We believe accessibility is a fundamental human right and coding is part of that.”

In an interview Cook told me, “Accessibility has been a priority at Apple for a long time.” He continued: “We believe in focusing on ability rather than disability. We believe coding is a language — a language that should be accessible to everyone.” When I asked about any accessibility features he personally uses, Cook said due to hearing issues he likes to use closed-captioning whenever possible. And because he wears glasses, he likes to enlarge text on all of his devices, particularly the iPhone.

Accessibility-related Apple retail events

As in prior years, Apple is spending the month of May promoting accessibility and Global Accessibility Awareness Day by hosting numerous accessibility-centric events at its retail stores across the globe. (These are done throughout the year too.) These include workshops on the accessibility features across all Apple’s platforms, as well as talks and more. Apple says they have held “over 10,000 accessibility sessions” since 2017.

Today, on Global Accessibility Awareness Day 2018, Apple is holding accessibility-related events at several campuses worldwide, including its corporate headquarters in Cupertino, as well as at its satellite campuses in Austin, Cork and London.

What Apple’s education announcements mean for accessibility

From an accessibility news standpoint, this week’s Apple event in Chicago was antithetical to the October 2016 event. At the latter event, Apple began the presentation with a bang — showing the actual video being edited using Switch Control in Final Cut. Tim Cook came out afterwards to talk some about Apple’s commitment to serving the disabled community before unveiling the then-new accessibility page on the company’s website.

By contrast, the education-themed event in Chicago this week went by with barely a mention of accessibility. The only specific call-out came during Greg Joswiak’s time on stage talking about iPad, when he said “accessibility features make iPad a learning tool for everyone.”

That doesn’t mean, however, accessibility has no relevance to what was announced.

I was in the audience at Lane Tech College Prep on Tuesday covering the event. As a former special educator — and special education student — I watched with keen interest as Apple told their story around education. While Apple is targeting the mainstream, I came away with strong impressions on how Apple can make serious inroads in furthering special education as well.

It’s Called ‘Special’ for a Reason

Apple is obviously—rightfully—building their educational strategy towards mainstream students in mainstream classes. It’s a classic top-down approach: Teachers assign students work via handouts, for such activities as writing essays or completing science projects. This is the entire reason for Apple’s Classroom and Schoolwork apps. However well-designed, they lack an element.

Where they fall short is that there is nothing afforded, at least in specific terms, to teachers and students in special education settings. Apple’s strategy here is defined, again, by the classic teacher-student relationship, without any regard for other models. I’m not leveling a criticism at the company; this is the reality.

At many levels, special education classrooms do not function in a way that’s conducive to Apple’s vision for learning at this time. In the moderate-to-severe early childhood (Pre-K) classrooms I worked in for close to a decade, the structure was such that most, if not all, activities were augmented by a heavy dose of adult support. Furthermore, most of our students were pulled out of class at certain times for additional services such as speech services and physical/occupational therapy sessions.

In short, there were no lectures or essay prompts anywhere.

This is where accessibility comes in. There is enormous potential for Apple to dig deeper and expand the toolset they offer to educators and students. To accommodate special education is, in my view, akin to accommodating disabled users by offering accessibility features on each of Apple’s software platforms.

Special education is special for a reason. It involves ways of teaching and learning that are unique, and the people who work and learn in these environments deserve the same consideration.

Accessibility is Apple’s Secret Weapon

Leading up to the event, there was much talk in the Apple community of writers and podcasters that Google is eating Apple’s lunch in the schools market because Chromebooks are dirt cheap for districts and most everyone relies on Google Docs.

I’m not interested in the particulars of this argument. What I am interested in, however, is simply pointing out that despite the perception Apple products are too expensive and less capable, they are better in one meaningful sense: accessibility.

Consider the Chromebook versus the iPad. At many levels of special education, an iPad is far superior to a Chromebook. The tablet’s multi-touch user interface is far more intuitive, and more importantly, iOS is built with accessibility in mind. From VoiceOver to Dynamic Type to Switch Control and more, an iPad (or an iPod Touch, for that matter) can provide a far more accessible and enriching learning experience for many students with disabilities than a Chromebook. And lest we forget the App Store effect: there are many outstanding apps geared toward special ed.

This is a crucial point that many technology pundits who lament Apple’s position in the education market always seem to miss.

Making Special Educators More Special

One area where Apple can greatly improve the lives of teachers is by broadening the Schoolwork app such that it makes IEP prep easier and, playing to Apple’s core strength, more modern. Historically, and even today, IEPs are planned and written using stacks of paperwork. Goals, assessments and consent forms are handwritten (sometimes typed) and stapled together. And because the IEP is a binding legal document, teachers must ensure there are proper signatures on every page, or else be dinged for being out of compliance with protocols. In sum: the IEP is the bane of every special educator’s existence because it takes so much time.

To this end, Apple could do special education teachers a grand service by adding a module of sorts to its Schoolwork app that would allow them to more easily create and track a student’s IEP. There could be charts for tracking goal progress, as well as ways to collate and distribute documents amongst the IEP team (SLPs, OT/PT, etc) and of course parents. Teachers could even send an email to parents with any consent forms attached and encourage them to sign with Apple Pencil on their iPad, if they have one.

At the very least, it would make IEP prep infinitely more efficient, and perhaps alleviate some of the stress at the actual meetings. Digitizing the process would be game-changing, I think.

Bottom Line

The ideas I’ve outlined here are well within Apple’s wheelhouse. They would likely need to collaborate with special educators and districts on things like IEP forms and policies, but it is certainly within their power to do so. They can do this if they want.

To reiterate an earlier point, special education deserves just as much thoughtful consideration and innovation as the education industry at large. Given Apple’s unwavering support of accessibility, this is an area in which they can surely improve.

Apple’s “Pencil” now works with its iWork toolkit

Apple is bringing its pencil to the masses.

The pencil tool will now work across Apple’s suite of iWork apps — including the popular Pages (document creation), Numbers (its spreadsheet app) and Keynote (for presentations) — on the low-cost iPad that Apple first brought to market last year.

At an event today in Chicago, Apple announced its latest iPad in a bid to challenge the dominant player in education technology — Google (a subsidiary of Alphabet).

In addition, the company said that Logitech is introducing a $49 stylus called the “Crayon,” which cuts the cost of pencil hardware roughly in half compared with the $99 Apple Pencil.


France takes legal action against Apple and Google for their app stores

France’s Economy Minister Bruno Le Maire criticized Apple and Google for the way they run the App Store and Play Store. According to him, Google and Apple have too much power over app developers. Le Maire will ask a court to look into it and fine the tech giants if necessary.

“When developers want to develop their apps and sell them through Google or Apple, those companies set the prices, Google and Apple get back some data, Google and Apple can unilaterally modify contracts with developers,” Le Maire told RTL. “All of this is unacceptable, this isn’t the economy that we want.”

This isn’t the first time French officials have paid attention to the App Store and Play Store. Last month, ARCEP president Sébastien Soriano shared a report that said that net neutrality should go beyond carriers and internet service providers. The ARCEP thinks that big tech companies also have a responsibility when it comes to the neutrality of the internet.

“This report has listed for the first time ever all the limitations you face as a smartphone user,” Soriano told me at the time. “By users, we mean both consumers and developers who submit apps in the stores.”

It’s clear that developers have no choice but to comply with App Store and Play Store rules. They have no choice but to pay Apple and Google 30 percent of their sales (or 15 percent for subscriptions).

If Apple or Google remove an app from their respective store, developers can’t take legal action because they signed a contract with them. But at the same time, they don’t have a say when it comes to negotiating the terms of those contracts.

“I’m going to take legal action against Google and Apple with the Paris Commercial Court for abusive business practices,” Le Maire said. Google and Apple shouldn’t really be worried as he expects a fine of a few million euros. But it’s an interesting case anyway.

As for the upcoming European tax plans for big tech companies, Le Maire said they will become effective by the end of 2018. Earlier this month, he said that we would hear more about this in the coming weeks.