Who would’ve thought that an app that flags down roving towncars would blow up as dramatically as it has? Apparently every venture capitalist worth their salt. Uber revealed earlier this afternoon that it just raised a staggering $1.2 billion in…
Remember how your parents and grade school teachers kept chiding you to make sure you only colored within the lines? And you wondered why it was so important? Well, here’s your answer. Peter Deligdisch’s Between The Lines is a coloring book for experts that will put all of your colored pencil and crayon skills to the ultimate test.
WWII saw the development of some zany designs for weapons, such as when the U.S. developed pigeon-guided missiles and (literal) bat bombs (the latter of which were a little too effective, accidentally destroying the testing base when they escaped), or when the Soviets trained exploding anti-tank dogs. Not to be left out of the fun, the Japanese developed their own oddball weapon. Starting in November of 1944, Japan launched over 9,000 devices they called "Fu-Gos" aimed at the United States and Canada. Fu-Gos were hydrogen balloons equipped with incendiary devices that, in theory, would be transported over the Pacific Ocean via the jet stream to devastate the landscape, perhaps starting massive fires in farm fields and forests across North America.
The always cool Yeti Dynamics has released another brilliant scientific simulation: What would Earth’s sky look like if Saturn left its orbit and raced through the inner solar system on its way to the Sun, with our planet in its path? I just love these impossible but visually arresting scenarios.
Almost 8 years after it was introduced, you still can’t attach files to emails in iOS. While iOS 6 introduced a method to attach photos and videos to in-progress emails, it suffers from poor discoverability and only works with content from the Photos app. If you want to attach any other file to an email, it’s a usability disaster. That’s because iOS has an ImagePicker but no broader DocumentsPicker, so it can’t handle a wider range of attachments. So why is that such a bad thing?
Adding photo or video attachments to in-progress emails is currently done via the same pop-up menu originally introduced in iOS 3 for cut, copy, and paste. You tap the screen to get the pop-up, tap a tiny, obscure arrow button to get more options, and then tap to add the attachment.
An easy-to-find, easy-to-use attachment button would be simpler. It’s the solution employed by third-party apps like Gmail. Maybe a paperclip is obscure as well, but it’s discoverable.
Emailing, and including files as attachments in email, is a common task and something that takes only a few seconds on the Mac with OS X. Trying to attach a non-photo or video file on an iPhone or iPad with iOS takes an annoying amount of time and causes an unreasonable amount of frustration. Here’s some blog-theater by way of example:
“Hey, Rene, can you email me the dates for that trip?”
“Sure, Kevin.” I grab my iPhone, open the [Mail](http://www.imore.com/mail "Apple Mail app for iOS and OS X") app, add Kevin as the recipient, add the subject “Trip”, paste in the dates, then–
“Could you also attach that outline for discussion topics?”
My only option now is to copy the contents of the email, trash it, go to the app I wrote the topics in, find the file, tap share, tap email, add Kevin again, add the subject again, paste in the dates again–
Crap. I deleted the app I wrote the topics in. A hotter, newer app came out and I started using that instead, and even though both use iCloud, neither has any idea the other exists so… I re-download the old app and pray the data is either still there, or magically comes back from the cloud.
“And those two PDF files about that thing?”
Double crap. Both those PDF files are in different PDF apps, one in a simple reader, the other one in an app that supports annotation. Now I have to send the discussion topics from one app, and each of the PDF files from their apps. That’s three separate emails, and nothing approaching a thread.
Oh, wait. I have copies in Dropbox. I can share from Dropbox… Only no, each file is in a different directory and I can only share from one directory at a time! I’m back to three separate emails again!
Now Kevin is laughing his ass off at me and asking me to tell him again how the iPhone is easy to use, and I want to punch things.
And the reason for all this goes back to the violation of a cardinal principle of design: the attempt to keep things simple can ultimately cause ridiculous levels of complexity.
Apple already uses a blue + button to add contacts. They already use a gray camera icon in Messages. Something like that could work for attachments as well. Especially if, as I’ve been asking for the last 4 years, iOS added a DocumentsPicker controller that works the same way the current ImagePicker controller does.
A DocumentsPicker controller for iOS would remove unnecessary cognitive load from users and solve a wide swath of current usability problems with iOS, including email attachments. Any file could be attached to any in-progress email, without the need for a Share Sheet or for the user to remember which app owns which file. And it would do so in a way that’s consistent with how iOS already works, increasing simplicity at the same time.
And it would make iOS more powerful without alienating the mainstream user base Apple is so good at embracing.
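To make the proposal concrete, here is a minimal Swift sketch of what such a controller could look like, mirroring the delegate pattern UIImagePickerController already uses. Every name here (Document, DocumentsPicker, DocumentsPickerDelegate, MailCompose) is hypothetical, not real UIKit API, and UIKit itself is stubbed out so the sketch is self-contained:

```swift
import Foundation

// Hypothetical DocumentsPicker API sketch. A real version would present a
// system-wide file browser; here the selection is simulated so the pattern
// is runnable anywhere.

struct Document {
    let name: String
    let data: Data
}

protocol DocumentsPickerDelegate: AnyObject {
    func documentsPicker(_ picker: DocumentsPicker, didPick documents: [Document])
    func documentsPickerDidCancel(_ picker: DocumentsPicker)
}

final class DocumentsPicker {
    weak var delegate: DocumentsPickerDelegate?
    var allowsMultipleSelection = true   // one email, many files, no app-hopping

    // A real implementation would present UI here; for the sketch we
    // simulate the user's selection.
    func present(simulatedSelection: [Document]) {
        if simulatedSelection.isEmpty {
            delegate?.documentsPickerDidCancel(self)
        } else {
            delegate?.documentsPicker(self, didPick: simulatedSelection)
        }
    }
}

// A mail compose screen attaches whatever the picker hands back, regardless
// of which app "owns" each file.
final class MailCompose: DocumentsPickerDelegate {
    private(set) var attachments: [Document] = []

    func documentsPicker(_ picker: DocumentsPicker, didPick documents: [Document]) {
        attachments.append(contentsOf: documents)
    }
    func documentsPickerDidCancel(_ picker: DocumentsPicker) {}
}
```

The point of the delegate shape is that Mail never needs to know where a file lives; the outline, both PDFs, and anything else could land in one email in one pass.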
Note: This post was originally written prior to the announcement of iOS 7 but has been updated for iOS 8.
We’ve already taken a look at the virtual balconies Royal Caribbean is introducing on one of its upcoming mega ships, but that’s apparently just the tip of the iceberg when it comes to over-the-top cruise amenities. On its Quantum of the Seas, set to launch in 2015, vacationers will find an observation pod that hangs from a 300-foot crane over the ship.
To celebrate Earth Day, Apple is transforming its retail stores with a hint of green. As seen in photos obtained by AppleInsider, the leaf on the Apple logo has been turned green, while store employees have all been given special green t-shirts to wear for the day.
Apple is making news this week with its new push on environmental strategy and an expanded device recycling program. While stores in Europe are just waking up, those in the U.S. won’t be opening for some time with their temporary new look. If you’re swinging by your local store today, keep an eye out.
This guy discovered that one measure of Happy is six seconds, exactly like a Vine clip. He then used Vine to record himself and his friends playing different instruments and singing, creating this perfect symphony using multiple iPhones.
Depending on who you are and how you feel, iOS 7 either took the training wheels off to fully embrace digital design, or it removed so much interface as to crush usability. Both are true. iOS is used by a wide range of people, from digital and mobile immigrants — those who grew up reading newspapers or using PCs — to digital and mobile natives — those who were born to the iPhone and iPad. One single, static default choice can never properly meet the needs of everyone across that range. But what if iOS 8 could make affordance and accessibility dynamic?
Affordance — the characteristics of an interface element that help hint at the actions it can perform — is essential in human-machine interaction. One of the design principles for iOS 7 was deference, and in its service a lot of interface chrome — the bars, borders, and other structural or decorative elements — was stripped away in favor of making content more prominent. Most famously, the shapes around buttons were removed, leaving only the naked text or glyphs/icons behind. Where previously the touch target — the area that can trigger an action — of a button was visually apparent, now only its center point remains.
It was cleaner but it offered far less affordance. Instead of seeing something that looked like a button, you had to know or figure out it was a button. So, responding to the complaints, Apple added an Accessibility toggle to restore button shapes to some iOS navigation elements.
But what if toggles weren’t necessary, or at least were a secondary, manual option? What if iOS could determine when people were struggling to use an interface element, bump up affordance and accessibility automatically, and then ease back as people got more comfortable?
For example, if someone taps near a button over and over again in a short period of time, iOS could realize they’re trying to hit it and are missing, and automatically turn on button shape hinting — perhaps fading it in — and even increase the tap target size temporarily so the next touch triggers it even if it’s still a tiny bit off.
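That near-miss heuristic is simple enough to sketch in code. The following is a self-contained Swift illustration, not anything Apple actually ships; all the thresholds (20pt near-miss radius, 3 misses in 3 seconds, 10pt of extra slop) are illustrative guesses, and Rect is a minimal stand-in for CGRect so the sketch runs without UIKit:

```swift
import Foundation

// Minimal stand-in for CGRect.
struct Rect {
    var x, y, w, h: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px <= x + w && py >= y && py <= y + h
    }
    func insetBy(_ d: Double) -> Rect {
        Rect(x: x - d, y: y - d, w: w + 2 * d, h: h + 2 * d)
    }
}

final class AdaptiveTapTarget {
    var bounds: Rect
    private(set) var slop = 0.0            // extra hit area, in points
    private(set) var shapeHintVisible = false
    private var recentMisses: [Date] = []

    init(bounds: Rect) { self.bounds = bounds }

    /// Returns true if the tap should trigger the button.
    func handleTap(x: Double, y: Double, at time: Date = Date()) -> Bool {
        if bounds.insetBy(slop).contains(x, y) { return true }
        // A "near miss": within 20pt of the target but outside it.
        if bounds.insetBy(20).contains(x, y) {
            recentMisses = recentMisses.filter { time.timeIntervalSince($0) < 3 } + [time]
            if recentMisses.count >= 3 {   // 3 near misses in 3 seconds
                slop = 10                  // temporarily enlarge the tap target
                shapeHintVisible = true    // fade in the button shape
            }
        }
        return false
    }
}
```

In a real implementation the slop would live in something like a `point(inside:with:)` override, and would presumably decay back to zero once taps start landing cleanly again.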
Likewise, iOS assumes taps higher up on the screen are coming in at a greater angle, which can sometimes frustrate people — and robots — holding their iPhone in less common ways and tapping from less common angles.
Apple’s multitouch display technology, however, has the ability to detect capacitance some distance from the screen and use that information to figure out things like which finger on a hand is likely the one doing the tapping. Based on that electrical guesswork, perhaps the tap’s trajectory could also be estimated, and if people are tapping frequently and missing slightly, perhaps the tap target could again be dynamically adjusted to better suit their angle.
As gestures become increasingly common, gesture collisions become increasingly common as well. Swipe to favorite, swipe to reply, even horizontal scrolling are all great unless, while simply trying to go down a list or a page, the angle you’re using keeps misfiring those horizontal gestures. It’d be great if iOS could detect when those gestures are aborted or reversed repeatedly and temporarily increase the angle or emphasis needed to trigger them.
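The aborted-gesture idea can be sketched the same way. This is a speculative Swift illustration of the logic, not real UIGestureRecognizer behavior; the numbers (30pt starting threshold, three recent aborts, a 1.5x bump capped at 80pt) are assumptions made up for the example:

```swift
import Foundation

// Sketch: when horizontal swipes keep getting started and then reversed or
// abandoned, demand a more deliberate swipe before the action fires.
final class SwipeRecognizer {
    private(set) var requiredDistance = 30.0   // points of travel before the swipe fires
    private var recentAborts = 0

    /// Call when a horizontal gesture ends. Returns true if it should
    /// trigger the swipe action (e.g. swipe-to-reply).
    func gestureEnded(horizontalTravel dx: Double, wasReversed: Bool) -> Bool {
        if wasReversed || abs(dx) < requiredDistance {
            recentAborts += 1
            if recentAborts >= 3 {
                // User keeps misfiring: raise the bar, but cap it so the
                // gesture never becomes impossible.
                requiredDistance = min(requiredDistance * 1.5, 80)
            }
            return false
        }
        recentAborts = 0   // a clean swipe suggests the threshold is fine
        return true
    }
}
```

A production version would presumably also relax the threshold again over time, and weight the angle of travel as well as the distance.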
Thanks to their sensors, iOS devices know their position in space; perhaps angles could be dynamically adjusted to account for how you’re holding your device. Standing, walking, lying down, lounging, straight up, slightly or steeply angled: compensating could be tricky, but it could also be beneficial.
Hey, in some wacky future world, perhaps an iPhone or iPad could use frequent pinch-to-zoom, proximity, or even PrimeSense-detected squinting to dynamically increase text size if people are having trouble reading.
And perhaps many of these methods, and others still, could be brought to bear to create a next-generation keyboard for iOS 8 that’s as far ahead of current keyboard technologies as Apple’s original iPhone keyboard was of its virtual competition in 2007.
We talked a little bit about dynamic affordance on a recent Iterate podcast, and some really, really smart designers poked some really, really smart holes in the idea and the realities of its implementation, but I still can’t help longing for some form of smarts to enter the iOS interface.
We’re in the midst of a contextual awakening, and as much as that can make everything around us and our devices better, here’s hoping it can make what’s on our devices, and how we use them, better as well.
Are there any accessibility or interface issues you’d like to see Apple automate in iOS 8, or would you rather keep all of that completely under your own, manual control?