Four things Apple needs to fix


1– Add iCloud versions/Time Machine or some other way of backup. Once people migrate their data to iCloud, traditional backup methods become impossible, or at least highly impractical; add to that the ability to delete files from any device, and the lack of backup starts to show all the signs of a usability nightmare.

I do not expect the majority of people to bump into this until they are deep into relying on iCloud for storage in the years to come, but they eventually will, and Apple should have something ready as soon as possible.

2– Fix iTunes (including the App Store, iBooks store etc.) discovery and search; it is pretty much broken at the moment, as plain text searches yield hundreds of results, irrelevantly sorted by download counts.

Having so many featured and suggested categories is a weak solution, and it makes no sense, since it copies the model of the entertainment business, which works totally differently from that of software; it is no use having a billion apps in the store when 0.1% of the apps get 99% of the users (an illustrative estimate, not factually accurate).

It looks like they just copied their music store model piecemeal without giving it a second thought, and it should be pretty clear by now that a custom approach is needed for the app stores. A good model to emulate would be Netflix, and the way it works to highlight movies that are new or relevant to you, not just push the already hyped and popular blockbusters down your throat.

3– Fix the Apple TV by adding a “passive” mode to every stream, e.g. shuffle for Music, play top Podcasts, play my Vimeo feed. This is analogous to legacy TV behavior, where we quickly select a passive stream (a TV channel) with minimal effort; it has to be available in every content module on the Apple TV, accessible with preferably a single user interface interaction or a dedicated remote button.

Preferably it should also be smart and pick content the user is statistically likely to enjoy, not just push the most popular content (see #2 above).

4– Fix OS X for the post-PC era by removing the features that were added specifically for the kind of audience that has migrated to iOS devices.
Consciously and explicitly switch to a paradigm where the users of OS X are considered professional, tech-savvy users, as opposed to the iOS users, and start designing software accordingly, specifically adding more finesse to OS X and making it tend to the professional and technologically savvy.

We have seen signs of this with the revival of Dock support for symlinks, Finder folder merge etc., but this should be made into an explicit direction for OS X, in order to focus it on the post-PC users who are not merely using it for web browsing and content consumption.

With Mountain Lion they pretty much achieved the needed synergy between their two OSes; from now on it is time to play on each individual one's strengths.

basics of iOS and OS X APIs


The structure of the OS X and iOS native* APIs is very straightforward and minimalistic; typically one does not need to know about anything that goes below the Foundation API. However, taking a look at the headers can help in understanding it better and in clearing up common confusions, such as the difference between Foundation and CoreFoundation, or what exactly constitutes CocoaTouch and Cocoa, since the latter is an explicit framework while the former is just a naming convention.

All the APIs are present as binary frameworks under /System/Library/Frameworks (with resources but without headers) and under your Xcode toolchain SDK (with headers but without resources). The objc binary is at /usr/lib/libobjc.A.dylib, while the headers are under /usr/include/objc/ and in your Xcode toolchain.

I
Now let’s dig right into it. At the lowest level there are CoreFoundation and objc; they are independent of each other:

<objc/objc.h>
objc_class,objc_object,objc_selector etc

<objc/runtime.h>
objc_getClass,objc_getProtocol,class_conformsToProtocol etc

<objc/message.h>
super_class,objc_super,objc_msgSend,objc_msgSendSuper etc

(you can include all the above with #import <objc/objc-class.h>)
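
As a quick taste of what lives at this level, here is a minimal sketch that pokes at the runtime directly, in plain C (it assumes Foundation is linked in, so that NSString is actually registered with the runtime):

#include <objc/runtime.h>
#include <stdio.h>

int main(void) {
    // look up a class and a protocol purely through the runtime
    Class stringClass = objc_getClass("NSString");
    Protocol *copying = objc_getProtocol("NSCopying");
    if (stringClass && copying) {
        printf("NSString conforms to NSCopying: %d\n",
               class_conformsToProtocol(stringClass, copying));
    }
    return 0;
}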

<CoreFoundation/CoreFoundation.h>
CFString,CFNumber,CFArray,CFRunLoop,CFStream etc (this is just C, there is no Objective-C syntax or anything at this level)

II
On top of these, and including (relying on) both, is Foundation; as the name implies, you typically never use any APIs below Foundation directly.

<Foundation/Foundation.h>
NSString, NSNumber, NSArray, NSRunLoop, NSStream etc, much of which is toll-free bridged to CoreFoundation
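
Toll-free bridging in practice just means a cast; here is a minimal sketch (pre-ARC, to match the rest of this post) of the same string flowing between Foundation and CoreFoundation:

#import <Foundation/Foundation.h>

int main(void) {
    NSString *ns = @"hello";
    // no conversion needed, an NSString is usable wherever a CFStringRef is expected
    CFStringRef cf = (CFStringRef)ns;
    CFIndex length = CFStringGetLength(cf);
    // and back again, a CF-created string is usable as an NSString
    NSString *copy = (NSString *)CFStringCreateCopy(NULL, cf);
    NSLog(@"%@ has length %ld", copy, (long)length);
    [copy release]; // we own the copy that CFStringCreateCopy created
    return 0;
}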

III
On top of Foundation there is AppKit for OS X or UIKit for iOS
(on OS X you typically include Foundation, AppKit and CoreData with #import <Cocoa/Cocoa.h>; there is no corresponding CocoaTouch umbrella framework on iOS)

<AppKit/AppKit.h>
NSView,NSButton,NSColor,NSEvent etc

<UIKit/UIKit.h>
UIView,UIButton,UIColor,UIEvent etc

This is pretty much all there is; from here on there are multiple optional frameworks you can use for specific cases, but the basics are just in the headers above. Most APIs are Cocoa, but there is still a big chunk of C APIs, especially on the OS X side.

It’s worth noting that there are two types of frameworks: private and public. The private ones are not safe to use and not allowed in the Mac App Store; the public ones are safe to use as long as they have headers in the SDK (they could be present in /System/Library/Frameworks but not in the SDK). Such a disparity is typically a rare occurrence nowadays; it was more common prior to 10.6.

One more note: public frameworks with headers might not have all their methods/classes documented. Using them should nevertheless be pretty safe, but the lack of documentation is an indication that they are more likely to change or go away than the documented ones.

*OS X and iOS also have the low-level BSD APIs (found in /usr/include), most of which are cross-platform and outside the scope of this post.

extended attributes, spotlight and xcode screenshots

If you should find yourself wondering, as I did, how exactly Xcode knows which device took what screenshot among the ones it manages, the answer is simple: it just saves extended attributes on the files, with the device id under com.apple.DTDeviceKit.screenshot.device_id, e.g.:

[valexa@VAiMac:~] $ xattr -l "/Volumes/Storage/Screenshots/Screenshot 2010.07.13 01.43.57.png"
com.apple.DTDeviceKit.screenshot.device_id: 5a14571ebe34512345b7345e13454a

Finder being Finder has no way whatsoever to display or search for extended attributes; however, some useful Spotlight metadata is saved (the Spotlight metadata itself used to be saved as extended com.apple.metadata attributes, and xattr is still the only way to edit it):

[valexa@VAiMac:~] $ mdls "/Volumes/Storage/Screenshots/Screenshot 2010.07.13 01.43.57.png"
….
kMDItemPixelHeight = 1024
kMDItemPixelWidth = 768
….

This can in fact be searched with Finder, even if it is not readily apparent; you have to add a specific Raw Query for it to understand the raw queries that you would have given to mdfind, e.g.:

[valexa@VAiMac:~] $ mdfind -onlyin /Volumes/Backup "kMDItemIsScreenCapture == 1"
/Volumes/Backup/10.8/Screen Shot 2012-03-03 at 12.00.30 AM.png

This searches for screenshots taken on your Mac (you can search for specific types, for example whole-screen ones with kMDItemScreenCaptureType == "display", screenshots taken of specific windows with kMDItemScreenCaptureType == "window", or "selection" etc.)

I had to do this because my screenshots folder contains both the Xcode ones and my Mac screenshots; my specific goal was to figure out why some iOS screenshots there no longer showed under their corresponding devices. It turns out that I had edited some with Photoshop, and it replaced the extended attributes.

Editing those attributes with Finder and AppleScript, while possible, is extremely convoluted and employs shell calls anyway, so we just head back to Terminal with the newfound knowledge of what screenshots we have.

Now, if you only have one device for each of the screen resolutions available in iOS, you are in luck; to print the extended attributes for iPhone, iPhone Retina, iPad and iPad Retina respectively, you can do:

mdfind -onlyin /Volumes/Storage/Screenshots/ "kMDItemPixelWidth == 480 || kMDItemPixelHeight == 480" -0 | xargs -0 xattr -l
mdfind -onlyin /Volumes/Storage/Screenshots/ "kMDItemPixelWidth == 960 || kMDItemPixelHeight == 960" -0 | xargs -0 xattr -l
mdfind -onlyin /Volumes/Storage/Screenshots/ "kMDItemPixelWidth == 768 || kMDItemPixelHeight == 768" -0 | xargs -0 xattr -l
mdfind -onlyin /Volumes/Storage/Screenshots/ "kMDItemPixelWidth == 1536 || kMDItemPixelHeight == 1536" -0 | xargs -0 xattr -l

Now that you have seen the device ids of the screenshots with proper attributes, you can go ahead and set the proper id on all screenshots for a screen type, e.g.:

mdfind -onlyin /Volumes/Storage/Screenshots/ "kMDItemPixelWidth == 480 || kMDItemPixelHeight == 480" -0 | xargs -0 xattr -w com.apple.DTDeviceKit.screenshot.device_id '5a14571ebe34512345b7345e13454a'

Xcode will immediately catch on to the change and credit the screenshot properly to its source device.
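
If you would rather do this from code than from the shell, the same attribute can be read and written with the BSD xattr calls; here is a minimal sketch (the path and device id below are placeholders, substitute your own):

#include <stdio.h>
#include <string.h>
#include <sys/xattr.h>

int main(void) {
    const char *path = "/Volumes/Storage/Screenshots/shot.png"; // placeholder
    const char *name = "com.apple.DTDeviceKit.screenshot.device_id";
    const char *devid = "5a14571ebe34512345b7345e13454a";       // placeholder

    // write the attribute (position 0, no options)
    if (setxattr(path, name, devid, strlen(devid), 0, 0) != 0) {
        perror("setxattr");
        return 1;
    }
    // read it back to verify
    char buf[128] = {0};
    ssize_t len = getxattr(path, name, buf, sizeof(buf) - 1, 0, 0);
    if (len > 0) printf("%s: %s\n", name, buf);
    return 0;
}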

refresher on resource forks

Resource forks are a strange beast; while Apple started moving away from them (around 10.4) and migrated to HFS attributes, software like Adobe’s Photoshop still saves file previews as resource forks (this can be disabled in Preferences > File Handling). Here is a refresher on how to view, find and delete resource forks.

You can see if a file has a resource fork in a number of ways (they all involve the Terminal):

1- see if the file has a "com.apple.ResourceFork" extended attribute

xattr /Volumes/Volumename/Dirname/filename.extension

2- look up the attributes of the resource fork directly

ls -l@ /Volumes/Volumename/Dirname/filename.extension

ls -ila /Volumes/Volumename/Dirname/filename.extension/..namedfork/rsrc

File system operations can be performed on a resource fork just like on any other file, so you can copy or delete them; to get a path to the resource fork you append /..namedfork/rsrc to the full path of the file in question. For example, to copy and then remove the fork:

cp /Volumes/Volumename/Dirname/filename.extension/..namedfork/rsrc ~/Desktop/thefork.rsrc

rm /Volumes/Volumename/Dirname/filename.extension/..namedfork/rsrc
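
The same trick works from code; here is a minimal sketch (the path is a placeholder, use a file that actually has a fork) that opens the fork with plain stdio and prints its size:

#include <stdio.h>

int main(void) {
    // placeholder path, append /..namedfork/rsrc to any file that has a fork
    const char *fork = "/Volumes/Volumename/Dirname/filename.extension/..namedfork/rsrc";
    FILE *f = fopen(fork, "rb");
    if (f == NULL) { perror("fopen"); return 1; }
    fseek(f, 0, SEEK_END);
    printf("resource fork is %ld bytes\n", ftell(f));
    fclose(f);
    return 0;
}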

You can also delete the fork by removing the extended attribute:

xattr -d com.apple.ResourceFork /Volumes/Volumename/Dirname/filename.extension

Or print a hex dump of the actual data in the fork:

xattr -l com.apple.ResourceFork /Volumes/Volumename/Dirname/filename.extension

You can find all the files that have resource forks with this Terminal command:

find / -type f -exec test -s {}/..namedfork/rsrc \; -print

Now, if you were to combine the last two, you could delete all resource forks (in a given type of file under a certain path), e.g.:

find /Volumes/Volumename/Dirname -type f -name "*.extension" -exec test -s {}/..namedfork/rsrc \; -print0 | xargs -0 xattr -d com.apple.ResourceFork


NOTES:
Messing around with or deleting resource forks should be pretty safe nowadays; they were the mechanism used in Snow Leopard for storing HFS-compressed files, but this has been removed altogether in Lion.

The actual file as we know it is referred to as the data fork in this context, and there used to be a way to get to it with /..namedfork/data, but that does not appear to work anymore; if anyone can clarify, please comment.

If you want to really dig into the gory details of your filesystem, or are perhaps in the unfortunate predicament of having to recover lost data, I strongly recommend fileXray by Amit Singh, writer of the Mac OS X Internals book.

Sandboxing woes

It’s a brave new sandboxing world, they say, and that brings about many implications, good and bad; to a security professional, asking the user for permission to read every single file might be pure heaven, to a UX professional it might be hell.

Either way, consider this scenario: you have an application that needs to know some operating system setting, some configuration context. Apple can never provide exhaustive APIs for all scenarios, and you will inevitably have to read or write files the user does not directly need to interact with.

Before sandboxing you could just do this transparently, and that is all good, unless an attacker takes over your application and leverages it to wreak havoc; that is what sandboxing prevents, but it also prevents legitimate scenarios, and until Apple adds a way to specify in the entitlements a list of files that the application transparently needs to access, the only way is to ask the user for explicit permission.

Here is a way :

-(void)punchHoleInSandboxForFile:(NSString*)file
{
    //bail on 10.6 or earlier, the sandbox only exists on 10.7+
    if (floor(NSAppKitVersionNumber) <= 1038) return;
    //only needed if we do not already have permission to read the file
    if ([[NSFileManager defaultManager] isReadableFileAtPath:file] == YES) return;
    //make sure we have an expanded path
    file = [file stringByResolvingSymlinksInPath];
    NSString *message = [NSString stringWithFormat:@"Sandbox requires user permission to read %@",[file lastPathComponent]];

    NSOpenPanel *openDlg = [NSOpenPanel openPanel];
    [openDlg setPrompt:@"Allow in Sandbox"];
    [openDlg setTitle:message];
    [openDlg setShowsHiddenFiles:NO];
    [openDlg setTreatsFilePackagesAsDirectories:YES];
    [openDlg setDirectoryURL:[NSURL fileURLWithPath:file]];
    [openDlg setCanChooseFiles:YES];
    [openDlg setCanChooseDirectories:NO];
    [openDlg setAllowsMultipleSelection:NO];
    if ([openDlg runModal] == NSOKButton){
        NSURL *selection = [[openDlg URLs] objectAtIndex:0];
        if ([[[selection path] stringByResolvingSymlinksInPath] isEqualToString:file]) {
            return;
        }else{
            [[NSAlert alertWithMessageText:@"Wrong file was selected." defaultButton:@"Try Again" alternateButton:nil otherButton:nil informativeTextWithFormat:@"%@",message] runModal];
            [self punchHoleInSandboxForFile:file];
        }
    }else{
        [[NSAlert alertWithMessageText:@"Was denied access to required files." defaultButton:@"Carry On" alternateButton:nil otherButton:nil informativeTextWithFormat:@"This software cannot provide its full functionality without access to certain files."] runModal];
    }
}

You need to add a call to punchHoleInSandboxForFile: before every file access call, e.g.:

[self punchHoleInSandboxForFile:@"/etc/hostconfig"];
NSString *stuff = [[NSString alloc] initWithContentsOfFile:@"/etc/hostconfig"];

This nags the user once for each file; once the hole has been punched for that file, it persists for the lifetime of the process. It presents a file dialog with the file in question already selected (however that does not seem to be consistent; sometimes selecting the file manually will be required).

Here’s hoping Apple adds something along the lines of granting specific files permissions in the entitlements sooner rather than later; until then, feel free to use this and suggest any better alternatives you can find.


Off by 7%

I noticed something quite strange today: almost a year after Apple launched their Magic Trackpad, the marketing materials on the website do not depict the actual device correctly in terms of proportions.

Don’t get me wrong, a 7% width difference between the promotional images and the actual device is a far cry from deceptive marketing, and had it been any* other company than Apple such a slip-up would not even be worth mentioning; but I find it quite notable given how notorious Apple is for its attention to detail in both products and marketing, going as far as depicting the exact time that Steve Jobs introduced the iPhone on all iPhone marketing materials.

Here is how the images on apple.com/magictrackpad/ compare to an image of an actual Magic Trackpad; all the images show it 7% wider. And it is not that they do not have images of hands on properly sized Magic Trackpads: the boxes for it, from both the initial US version and the current EU version, show accurately sized devices.

*Take ViewSonic, for example, which depicted OS X running on their ViewPad tablet's promotional materials.

discrete graphics and you

In 2010 Apple introduced a feature in MacBooks, “Automatic graphics switching”, with the aim of extending battery life; it switches between using either the integrated Intel GPU inside the CPU or the standalone discrete AMD/Nvidia GPU.

The exact conditions that make it switch to one or the other, however, were never clearly stated by Apple, and there has been much debate, confusion and speculation on the subject.

Upon testing, I found the answer to be pretty straightforward: if an application ever loads OpenGL.framework, the machine switches to the standalone GPU (the easiest way to determine this is to inspect the open files area of the program with Activity Monitor and look for OpenGL.framework). I have so far been unable to find any other framework besides it that triggers the switch.
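
From the Terminal, one quick way to check the same thing (a sketch; replace 1234 with your application's pid from Activity Monitor or ps) is to list the files the running process has open:

lsof -p 1234 | grep OpenGL.framework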

Determining what can make it load that framework is not that straightforward, however; remember that it does not have to be linked in order to be loaded, any API that relies on it can trigger its loading, and the likely culprit in most cases will be Core Animation, which uses an OpenGL backing.

So, to make sure you do not trigger the standalone GPU, you need to make sure your code does not rely on any Core Animation APIs, or any that otherwise use OpenGL; also remember that all the “Effects” in Interface Builder require a Core Animation layer, which will trigger the loading of OpenGL.framework.

EDIT: As of Lion there is a property you can control this behavior with, the NSSupportsAutomaticGraphicsSwitching key, as detailed in the “Allowing OpenGL applications to utilize the integrated GPU” QA.
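
For reference, a minimal sketch of what that looks like in an application's Info.plist (assuming Lion and a MacBook with switchable graphics):

<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>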


iOS and vectorial artwork

Any conscientious iOS developer wants his artwork to look as good as possible, and with the Retina display, for example, we were told to upgrade our artwork to higher resolutions. That involved alternative @2x versions of each file, which gets daunting fast if you have a lot of artwork files, especially if a lot of your artwork is vectorial and you could just use it directly.

A popular vectorial format is PDF, and Apple has implemented a lot of the resolution-independent artwork in their OS X apps as PDF. Unfortunately there is no straightforward way to do the same on iOS; the iOS SDK does not yet have the PDFKit.framework that exists on OS X (Interface Builder does still accept PDF files as images, but they will not show on iOS), so what is there to do?

Well, the only way to do it currently is to set your artwork images in code, after rendering the PDF file into a UIImage; there are a couple of approaches to this, and I am going to show you the one I use:

#include <dlfcn.h>

-(UIImage *)UIImageFromPDF:(NSString*)fileName size:(CGSize)size{
    CFURLRef pdfURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(), (CFStringRef)fileName, NULL, NULL);
    if (pdfURL) {
        CGPDFDocumentRef pdf = CGPDFDocumentCreateWithURL(pdfURL);
        CFRelease(pdfURL);
        //create the context with scale factor 0.0 to get the main screen's scale on iOS 4+
        if (dlsym(RTLD_DEFAULT,"UIGraphicsBeginImageContextWithOptions") == NULL) {
            UIGraphicsBeginImageContext(size);
        }else {
            UIGraphicsBeginImageContextWithOptions(size,NO,0.0);
        }
        CGContextRef context = UIGraphicsGetCurrentContext();
        //translate the content
        CGContextTranslateCTM(context, 0.0, size.height);
        CGContextScaleCTM(context, 1.0, -1.0);
        CGContextSaveGState(context);
        //scale to our desired size
        CGPDFPageRef page = CGPDFDocumentGetPage(pdf, 1);
        CGAffineTransform pdfTransform = CGPDFPageGetDrawingTransform(page,kCGPDFCropBox,CGRectMake(0,0,size.width,size.height),0,true);
        CGContextConcatCTM(context, pdfTransform);
        CGContextDrawPDFPage(context, page);
        CGContextRestoreGState(context);
        //return autoreleased UIImage
        UIImage *ret = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        CGPDFDocumentRelease(pdf);
        return ret;
    }else {
        NSLog(@"Could not load %@",fileName);
    }
    return nil;
}

Typically you would use it like this: [someButton setImage:[self UIImageFromPDF:@"search.pdf" size:CGSizeMake(20,20)] forState:UIControlStateNormal];

The best thing about this is that it makes your artwork compatible not only with the current form factors and scale factors, but most likely with future ones as well.

This is compatible with iOS 3.0+ and has been tested on iPhone and iPad; it is compatible with any resolution scaling (the Retina display has 2.0, for example). I have also attached 4 PDF icons to get you started (created with Photoshop, exported with no layers or color profiles).

If for some reason you do not want to #include dlfcn.h and use dlsym(), you can weak link UIKit and just check whether UIGraphicsBeginImageContextWithOptions is NULL, or obviously, if you are not supporting anything before iOS 4, remove the conditional altogether.
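
That weak-linking alternative would look something like this sketch, dropped in place of the dlsym() check above:

// with UIKit weak linked, unavailable symbols evaluate to NULL at runtime
if (UIGraphicsBeginImageContextWithOptions != NULL) {
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
} else {
    UIGraphicsBeginImageContext(size);
}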

NOTE: technically, on iOS 4 you can currently exploit a bug and load a PDF file with [UIImage imageNamed:@"filename"] by stripping the extension; however, the default size of the artwork in the PDF has to be exactly the size of the UIImage you want, and if it gets scaled, significant pixelation occurs, basically making this shortcut unfeasible; plus, it could stop working altogether at any time.

iphone signal testing with code

Closest cell tower 1 km away, iPhone 3G, Wi-Fi mode, iOS 4.0, Orange network.

With the device held gently by the edges:

signal avg 90 (outdoor, line of sight to cell tower, same altitude)
signal avg 80 (indoor, line of sight to cell tower, same altitude)
signal avg 60 (indoor, no line of sight to cell tower, same altitude)
signal avg 40 (outdoor, no line of sight to cell tower, ground level)

With the device cradled in the hand there is an average decrease in signal of about 20, in all cases and in whatever finger configuration.

Turning 3G on causes an average decrease in signal of 5 in all cases.

The connection strength indicator shows 5 bars in all cases.

Absolute minimum experienced: 19; absolute maximum: 96.

The thing to take away from all this is that if you already had a very bad connection, under 20, and you subtract the hand attenuation of about 20, you are left with no signal; and it did not help that until iOS 4.0.1 a bad connection with a strength of 20 would have shown 4 or even 5 bars.

The application used for testing gets the signal strength from private calls to Apple’s CoreTelephony framework; in an effort to help the iPhone signal testing going on share a common base, I am providing the application to anyone. Please let me know what the results of your tests are, and feel free to contribute to the code on github.

The application is named VAFieldTest; its code is under an open source license and can be found at http://github.com/valexa/VAFieldTest. You need Xcode and an iPhone developer account to compile and run it.
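
For the curious, the private call involved looks roughly like the sketch below; the declaration is an assumption on my part, since CTGetSignalStrength is not in any public header, and being private it may change or vanish in any iOS update (and will not pass App Store review):

// private CoreTelephony call, declaration assumed, not a public API
// (link against CoreTelephony.framework)
extern int CTGetSignalStrength(void);

int strength = CTGetSignalStrength();
NSLog(@"signal strength: %d", strength);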

iOS 4.0.1 update
Same behavior as before, with the difference that now the connection bars properly reflect changes in the strength (I: 1-19, II: 20-29, III: 30-39, IIII: 40-49, IIIII: 50-99); previous versions just showed 5 bars for any signal strength above 25.

how to make your iOS device read stuff

Here is a handy little tip to make your device speak textual content with an almost acceptable computer voice:

Go to Settings > General > Accessibility > Triple-click Home and select “Toggle VoiceOver”, or “Ask” if you want to get a dialog asking whether to turn it On/Off (remember that tapping is done by double-tapping when VoiceOver is on).

Optionally, go into Settings > General > Accessibility > VoiceOver and set the “Speaking Rate” slider to around 20%; I found that speed to work best for me when reading papers.

Now go to your book or webpage or anything you want read, triple-click Home, and behold the machine starting to speak your currently selected element.

Remember to turn VoiceOver off when you are done, or else you will have to interact with the phone interface in a whole different way (if by any chance you lock the device, as I did, you can unlock it by tapping the swipe bar to select it, then tapping it again to unlock).

The major pitfalls, besides the surprisingly hard to perform Home triple-click, are that you have to select any text blocks you want read, and while this might be acceptable for PDF books, where you can select a whole page, in basically everything else you have to select individual paragraphs, even sentences; so do not expect an audiobook experience, and do not expect it to speak the contents of apps that do not have selectable text, like WSJ, NYT etc (BBC/Reuters/AP work).

It is a shame Apple did not make it possible for this to be implemented as a book/content speaker, with consistent behavior and without relying on VoiceOver tricks, as it will even support reading in a lot of languages other than English (for the full list see this for iPad and this for the other devices).

VoiceOver is only available on the iPhone 3GS and 4, the iPad, and the 3rd generation iPod touch.