Category: Uncategorized

  • Slow Decline Of The Mac Pro

    I wanted to write a bit more about the future of “pros” on the Mac, this time focusing on the Mac Pro.

    Pros are the most easily spooked, jittery segment of the computer market, and they have reason to be. When they buy equipment from a vendor, whether that is Apple or HP or Dell or whoever, they are spending a substantial amount of money and risking their business on a platform. Buying the wrong equipment or buying into the wrong strategy has serious consequences for the bottom line. If a business chooses wrong, it takes a serious amount of time and money to migrate users, equipment, and existing projects. If computers are slower, billable hours go up and rates become less competitive. I often see posts on Twitter complaining that people critical of Apple spend too much time focusing on specs, timely updates, or having the fastest available computers, but these are all crucial factors when evaluating pro hardware, and for good reason.

    Apple, for decades, has had a basic pact with pro users (although I’m starting to suspect Apple never knew it). Windows has always been the less risky platform, simply due to vendor choice. If you’re a business that buys all HP, but HP stops creating solutions that are right for your business, it’s very little trouble to migrate to Dell. If you run your business on the Mac, and especially if you run your business on Mac-only software like Final Cut Pro, it’s harder to transition off the platform, and Apple is a larger risk to your business. But pro users have been content with this risk as long as Apple continues to deliver hardware as fast as or faster than their competitors, and upgrades it every year. This basic pact has even helped resolve a lot of Apple’s secrecy issues. You don’t need to know Apple’s roadmap as long as you know that, whatever it is, it will show up next year, be faster, and be better. Apple still works this way on iOS. You could run trains on Apple’s typical iPhone and iPad update schedule, even with all the secrecy.

    I’ve heard the tower Mac Pro’s sales were quite good. I don’t know anything about 2013 Mac Pro sales, but I’d guess they aren’t nearly as strong.

    Before the 2013 Mac Pro, Apple hadn’t upgraded the Mac Pro in three years (and Apple’s neglect of Final Cut Pro 7 didn’t help). I worked with video pros at the time, and the panic was already setting in. A two year gap, like the one from 2006 to 2008, was digestible. But at three years you start to wonder if the Mac Pro is going to be updated at all. And if you don’t think the Mac Pro is going to be updated, for the good of your business, you’re going to start looking at the Adobe Suite and Windows workstations, and start that transition as early as possible. In that span of time, the uncertainty took Apple’s Final Cut Pro dominance and handed it to Adobe.

    When Apple released the 2013 Mac Pro, it never calmed the pro community. The 2013 Mac Pro was a risky proposition for businesses because it was slower than Windows hardware, which translates to dollars on the bottom line. A job that takes twice as long to render costs twice as much. That just continued to feed the narrative that investing in the Apple platform was risky. And now, three years later, Apple still hasn’t shipped an upgrade, continuing the tailspin in pros’ confidence in Apple. Mac Pro sales are likely down a bit because of the specs, but I think they are down as low as they are because Apple can’t demonstrate a commitment to their platform for professionals.

    I think the Mac Pro could sell a whole lot. People need workstations. But to revive sales of the Mac Pro, Apple needs to do a few basic things:

    • Release a 2018 Mac Pro. No, that’s not a typo. It’s not the next Mac Pro that will be important so much as the one that comes after it, and I hope that’s not discouraging, because I really think Apple could succeed with pros. I’ve already had people tell me they won’t buy the next Mac Pro because they are worried it will be the last one; they don’t want to be on a dying platform, and would rather move over now.
    • Say Apple is committed to the Mac Pro. Apple has been able to keep their roadmaps secret because their release schedule has been dependable. If the Mac Pro releases aren’t dependable, stop jerking people around. All Apple has to do to calm pro users right now is say that a new Mac Pro is coming, but that they haven’t been able to show it yet. Phil Schiller has come so close to saying this. If you can’t rebuild trust with actual releases, rebuild it through the press.
    • Specs? They’re honestly less important than rebuilding trust, but still important. Intel may have been standing still, but GPU vendors were not. The 2013 Mac Pro uses 2012 GPUs that were already dated when it shipped. AMD has floundered a bit, but Nvidia has released at least three solid updates since. For a pro business, that lost productivity is pretty hard to ignore.

  • On Managing Expectations (Macbook Pro follow up)

    One of the counterpoints to criticism of the Macbook Pro event is that expectations are too high. Users are expecting a laptop to be just as powerful as a desktop, and that’s unreasonable. Generally, I agree. The Macbook Pro hasn’t really been a good desktop replacement since about the Powerbook G3 era.

    But the problem is that Apple itself is marketing the Macbook Pro as a desktop replacement.

    I mentioned in the previous post that a lot of the angst from pro users probably would have been avoided if desktop Macs had been mentioned or updated. I still think that’s true. If you don’t think the Mac Pro is going to be updated, and the Macbook Pro is what Apple is pitching as a replacement, you’re going to compare it to desktop workstations. Even if you think the Mac Pro is going to be updated, Apple’s failure to mention it (or the iMac) implies that Apple is still misjudging the expectations of the pro community. When you’re a pro, you don’t like uncertainty around the tools you need to earn a living. Would you risk your business on a vendor that doesn’t have a clear plan for continuing to support your workflow?

    I think it’s fair to criticize Apple for not clearing up all this uncertainty around the different Mac lines during the event. After not getting any serious updates for three years, the 2013 Mac Pro was announced six months before it shipped. When I worked in IT, we were apprehensive about ordering PowerPC machines after the Intel transition was announced. Apple responded by letting us pre-order the original Macbooks before they were announced to the public. It’s easy to say that Apple operates in complete secrecy and we all just need to deal with it, but Apple selectively keeps secrets only when it benefits them. Even a “we’re working on it” for the Mac Pro would have gone a long way towards reassuring a community that depends on Apple’s roadmap for a living.

  • Mac Apple Event Thoughts

    I’m very supportive of going all in on Thunderbolt 3. Thunderbolt 3 is a huge advance, and I think it’s worth ditching all the legacy connectors. It will be a bumpy transition at first, but once it’s done having one universal connection will be worth it (although I’m not holding my breath for corporate projectors to start adopting USB-C or Thunderbolt 3.)

    AMD and Nvidia have been working hard on shrinking the size of their chips. AMD’s 400 series (known as Polaris 10 for mid-range desktops, Polaris 11 for laptops) and Nvidia’s 1000 series (known as Pascal) offer approximately double the performance per watt, splitting that improvement between higher performance and lower power draw.

    Apple appears to be offering the highest end Polaris 11 part available: the Radeon 460. This is a huge improvement over previous generations, where Apple tended to use only the middle of AMD’s mobile offerings. But while AMD has improved performance over their previous generation, they’ve failed to take the performance crown from Nvidia. Nvidia’s low end professional notebook GPU, the GTX 1060m, is still almost twice as fast as the Radeon 460.

    The issue with the new Macbook Pro is that it ignores everything professionals have been asking for, while adding things they didn’t ask for. Unnecessarily making the laptop thinner prevents Apple from using a mobile GPU like Nvidia’s 1080m, which offers nearly four times the performance of the Radeon 460. And as GPU advancements slow again and GPUs become more and more power hungry, the thinness of Apple’s design may also force them back to lower end mobile GPUs.

    Apple also ignored almost the full list of what pros were looking for in a new Macbook Pro: features like upgradable storage, higher resolution displays, more RAM, external graphics expansion… Apple is pushing this laptop as a professional 4k editing notebook, but hasn’t even equipped it with a 4k display. Whatever you think of Microsoft’s new Surface Studio, it’s at least trying to address that list of pro needs. It’s showing an awareness of what the market is asking for that Apple isn’t.

    A lot of pros still work in environments where they need the best possible workstations to work efficiently. Movies still don’t render instantly. VR and 3D graphics work is still very hardware bound. I even have Xcode projects that take a considerable amount of time to build on my Macbook Pro. The Mac used to be the best choice for these sorts of use cases. Apple provided the fastest hardware with the most reliable operating system, which made it an easy choice for environments where your computer’s efficiency directly made you more money. While macOS does maintain a slim reliability lead over Windows, Apple’s slower hardware is hurting the bottom line of these kinds of businesses. If a Macbook Pro takes twice as long as a competing Windows notebook to render your film, is it really worth staying on the platform? At a certain point, even if you love Apple, macOS, or the fancy new Touch Bar, you are losing money by staying with Apple.

    There is a giant unknown in all of this, and that is the Mac Pro. Competitively slow Macbook Pro performance was tolerated as long as Apple offered a fast desktop for performance oriented tasks. The classic Mac Pro was beloved because it fit perfectly into compute hungry workplaces. Apple literally took the best Intel had to offer, and the almost-best the GPU makers had to offer, threw them into a nice, flexible box, and sold them to pro users. It wasn’t complicated, but it didn’t need to be. The job of the Mac Pro was not to make a statement, but to burn through any creative task as fast as or faster than any other machine on the market.

    I don’t think the Mac Pro is dead (Macworld is claiming there will be a new Mac Pro in November). If the Mac Pro is updated, it will quiet some of the complaints creative pros have with Apple right now. But Apple has been ignoring the needs of Mac Pro users as well. Besides the lack of updates, the design of the 2013 Mac Pro missed the mark. It made dual GPUs standard, but sacrificed dual CPUs. The design is too small to fit any higher end GPUs, and can only fit one SSD. Apple made a large number of important sacrifices to achieve a design nobody asked for or needed.

    If Apple really wants the pro market back, they just need to keep it simple. Stuff the fastest possible components into well priced, reliable macOS boxes that help people get work done. They don’t need to be art pieces, and they don’t even need to be razor thin. Apple needs to build workhorses again. It may not be exciting, but pros don’t want excitement in their computer purchasing, they want reliability. And throwing the fastest components into a few computers every year is a cheap way to keep a reliable income stream coming from happy users.

    Bonus: Death of Apple Displays

    The new LG displays are nice. I’d buy one if I had a machine I could plug it into. But I’m a little mystified as to why Apple didn’t just take the extra step of slapping an Apple logo on the display and selling it as an Apple branded product. I’m sure that US based Mac Pro factory has some spare capacity to put together some Apple monitor cases.

    It’s more than just superficial. The monitor not being Apple branded means it is no longer Apple supported. When you buy an Apple branded monitor with a Mac, it’s covered under the same warranty as your Mac. If your Mac had three year AppleCare, your monitor was covered for three years too. And your monitor was serviced at the same local stores your Mac was serviced at. With an LG monitor, that peace of mind is now gone. I’ve had Apple monitors die and get repaired under a three year AppleCare plan. If I have an issue with an LG display, I don’t have a local store to get it serviced at. And what about out of warranty repairs? My cat chewed on the cables of my 27″ Cinema Display, and for a small fee the Apple Store replaced the built in cables. If I have any other accidental or out of warranty issues, will LG fix them for a fee?

    I don’t know how many monitors Apple sold. My hunch is it wasn’t as many as Dell or HP, but I also saw enough of them around I can’t imagine they didn’t sell at all. But having one vendor to deal with all your problems was always a great thing about buying Apple gear. Now Apple wants me to buy third party displays. If I’m looking at Dell or HP displays, I might also take a look at their computers too. They both offer on site service, their computers are faster than Apple’s, and I only have to work through one vendor. Sounds pretty compelling to me.

    It would be great if Apple could service the LG displays, cover them under AppleCare, or at least act as a front line for passing hardware issues along to LG. That would make their relationship feel a lot more partner-y and make me more comfortable with buying Apple.

  • iPad Pro Initial Thoughts

    I’ve been working on an app intended for use with the Apple Pencil, so I went to the store and picked up an iPad Pro this morning. (Sadly no Apple Pencil or Keyboard, both are deeply backlogged it seems.) At my desk I have a Mac Pro, I carry a Macbook Pro for working on the go, and I have an iPhone 6 Plus and iPad Air 2, so I’ve been thinking a lot about how the iPad Pro fits in with my workflow as it is now.

    I might publish more thoughts on it as I spend more time with the device, but I’ve already had a few reactions and thoughts on it, both good and bad.

    Good: The Hardware

    On the outside it looks a lot like a bigger iPad Air 2, which isn’t a bad thing. Apple has added speakers on the “top” near the lock button and the “bottom” near the Lightning port (or the left and right of the iPad if you hold it in landscape). There are two sets of holes on each edge for stereo sound in both orientations.

    The speakers sound very good for an iPad. The bass is audible, and the volume is much, much higher than my iPad Air 2. I’m not sure if it sounds as good as the built in audio on my Macbook Pro, but it’s at least pretty close. The built in output is not a replacement for a decent pair of speakers, but it sounds great for a portable device. My only complaint is that Apple is still opting for side mounted speakers instead of front facing speakers. I hold the device by the sides, and it’s very easy for my hands to cover the speakers and for the sound to become muddled.

    I’m writing this without an Apple Keyboard or Pencil, but I’ll say the typing experience is miserable without them. Worse than the iPad Air. The keyboard is simply too big on the larger screen. Typing two handed is bad enough, but with one hand it’s unbearable. I’m not sure this is going to be a good replacement for a laptop even with a physical keyboard, but if you’re going to be doing anything as basic as typing medium sized emails, do yourself a favor and get the hardware keyboard. Long term, I’d love to see Apple add handwriting recognition, even if it’s not super. They at least have a starting point with the Mac’s handwriting recognition that’s available for graphics tablets.

    The performance is good. I’ve seen the synthetic benchmarks that make the performance look very favorable compared to the iPad Air 2. But some of the numbers I’ve seen running my own apps indicate that application performance may actually be imperceptibly slower. The extra CPU gains Apple made with the A9X may be getting used up driving the larger display. The iPad Air 2 always felt like a snappy device, so if Apple is just able to deliver the same user experience on the iPad Pro, it’s not a significant issue. But if you’re coming from an iPad Air 2, I’m not convinced things are going to feel significantly faster.

    Speaking of the display size… It’s big. I told someone earlier it feels like I’ve been given a novelty giant iPad as a joke. Not in a bad way, I like the extra real estate, but it’s not an easy to carry device like the smaller iPads. Most of the time I use it I let it lay completely flat on a desk instead of holding it (which makes me regret not buying the Smart Case to prop it up with.)

    I’ve been tinkering with creative apps on it, and the extra screen size is great. As I mentioned, I don’t have my Apple Pencil yet, and I’m sure the hardware will feel even better once it arrives.

    The one thing I’d like to see on a future iPad Pro is support for Thunderbolt, and beyond that, support for pointing devices. One impressive thing about the Surface Pro is the transition it can make to a desktop PC when you plug it into a docking station. It would be nice to be able to plug an iPad Pro into a Thunderbolt display, and make use of a wired network, keyboard, mouse, and other accessories.

    Bad: The Software

    When I talked about the hardware, I mentioned a lot about how it just felt like a bigger iPad Air 2. This is a good thing. With the software, it’s pretty much the same thing: it feels like a bigger iPad Air 2. This is a bad thing.

    Originally I was on the fence about whether I should buy an iPad Pro or a Surface Pro. The attractive thing about the Surface was the lack of boundaries put in place by the software. Want to run real Photoshop with a bunch of windows? Go ahead! Mount network shares or plug in a USB printer? No problem! Run a DOS emulator to play a 20 year old game that happens to be touch friendly? Go for it!

    A lot of apps have been updated, but there are still some strange gaps. Garageband doesn’t seem to have been updated for the iPad Pro screen. Neither has the WWDC app. (One of my favorite third party games, Blizzard’s Hearthstone, doesn’t seem to have been either.) I was expecting a premier Apple application like Garageband to be updated before launch. Apps like Keynote have been updated, and they look great on the display. Apps that haven’t been updated simply appear stretched, and they look clearly pixelated next to modernized applications that look brilliant on the iPad Pro display.

    Some apps that have been updated have an annoying habit of leaving the additional space empty. Apps like Messages, Twitter and News all have the habit of dealing with the extra space by just leaving ridiculous margins around content. I’m hoping in time this issue gets fixed.

    The big problem with iOS on the iPad Pro is that it still struggles with the productivity basics. Multitasking was nice on the iPad Air 2, and it’s certainly better on the iPad Pro. But it can still only run two apps at the same time. Navigating between applications is slow and cumbersome. And worse yet, you can still only have one window per application open at a time. Want to compare two Excel spreadsheets side by side? Nope, out of luck.

    Initial setup was also not great, as I realized how fragmented applications have become. Panic and Adobe both have excellent apps on the iPad Pro, but both have their own sync services with their own separate logins, because Apple has placed restrictions on iCloud usage and doesn’t provide any sort of single sign on service to fill the gap. (And to be clear: I’m not blaming Adobe or Panic for a situation that is rooted in how Apple treats Mac applications.) I dug into the Adobe apps only to realize I didn’t have my stock artwork available. I couldn’t log in to my network share to copy the artwork down, and I couldn’t download a zip of it from the internet because there is no app to decompress the zip, and no filesystem to decompress it to. Adobe seems to have a way to load the artwork into their proprietary cloud, but I haven’t done so yet, and I shouldn’t have to set up a new proprietary cloud system for every application just to load some files in.

    The iPad Pro still shares the same basic problem as its older iPad Air 2 sibling: productivity on the device is killed by a thousand tiny paper cuts. I’m not trying to say you shouldn’t buy one, but you should expect the same productivity from it as you would from an iPad Air 2. The screen size can’t solve the productivity issues without the software.

    I’ll revisit this when the Apple Pencil arrives. I’ve heard really great things about it, and I’m sure for artists this will be a great supplemental device. But I don’t think anything about the iPad Pro makes it a dramatically better PC replacement than the iPad Air 2. If the iPad Air 2 has been a good PC replacement for you, the iPad Pro will continue to be one, just with a larger screen. Otherwise, Apple’s continued resistance to making iOS more serious for professional workflows will just slow you down compared to a Macbook.

    I don’t mean to be too down on the iPad Pro. I’ve mostly been talking about the iPad Pro as a PC replacement because Apple has been talking about the iPad Pro as a PC replacement. The hardware is great, and I can definitely see some sort of future here. I’m not totally convinced that a touch based tablet can take the place of a laptop with dedicated keyboard and trackpad (something Apple themselves have repeatedly said in response to other faux-laptop tablet combos like the Surface Pro), but for me it’s easy to see this as a good ultra portable device. And as a developer, I see all sorts of cool things I could do on a touch based device this large and this powerful. But as a user, the software still holds me back from getting things done as efficiently as I could on a laptop. I know my needs are greater than most PC users, but I’m just not convinced that the iPad Pro has changed the decision making process someone goes through for buying a tablet vs. buying a PC.

  • Swift Needs KVO/KVC

    I’m just finishing up my first App Store bound project written in Swift. It’s nothing hugely exciting, just a giant calculator sort of application. I chose Swift partly because its static typing really made me think about the data layer, and how data flows through the application. What I missed terribly was KVO/KVC, and I’m not alone; Brent Simmons has also mentioned this. As someone who’s used a lot of KVO and KVC over the years, I find they’ve helped me ship code a lot more quickly, and they’ve been among the most valuable features of the Mac frameworks. A lot of developers who are new to the platform aren’t aware of these constructs.

    The idea is something like this: we’ve done a really good job of optimizing the model layer of Model/View/Controller applications, and Swift has done an amazing job there. Static typing provides huge advantages in reliability and coherency. But the Obj-C philosophy is really about reusable components. In that philosophy, components written by one vendor need a way to seamlessly talk to another’s, and this is where Swift and static typing fall flat. A view from one vendor or component needs a way to render data from a model from another component. We find this even in the system, where a component like CoreData needs to be passed into a controller where it needs to be searched, or…

    Hold on. I can hear the Swift developers already. “We have protocols and extensions for that! I can make a component from one source talk to a component from another source. All I need to do is define a protocol in an extension and I can have my static typing and everything!”

    Ok. Let’s go down the rabbit hole.

    The Swift Protocol Route

    Let’s take a classic case, one that Apple actually shipped on the Mac in Mac OS X 10.4. I want a controller that, given an input array of objects, will filter the array based on a search term and output the result. The key here is that my search controller doesn’t know the input type beforehand (maybe it came from a different vendor) and my input types don’t know about the search controller. I want a re-usable search controller that I can use across all my projects with minimal integration effort, to save implementation time.

    Using protocols, you might define a new protocol called “Searchable”. You extend or modify your existing model objects to conform to the protocol. Under the “Searchable” protocol, objects would have to implement a function that receives a search term string and returns true or false based on whether the object thinks it matches the search term. Easy.
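
    Sketched out, the protocol route might look something like this (Swift 2-era syntax to match the rest of this post; Searchable and the method names are hypothetical, just for illustration):

    import Foundation

    protocol Searchable {
        func matchesSearchTerm(term: String) -> Bool
    }

    class Employee: Searchable {
        var firstName = ""
        var lastName = ""

        // Every conforming type re-implements the actual matching
        // logic for itself.
        func matchesSearchTerm(term: String) -> Bool {
            return firstName.localizedCaseInsensitiveContainsString(term) ||
                lastName.localizedCaseInsensitiveContainsString(term)
        }
    }

    class SearchController {
        var objects: [Searchable] = []

        // The controller can only ask "do you match?"; it has no
        // visibility into how the match is decided.
        func resultsForSearchTerm(term: String) -> [Searchable] {
            return objects.filter { $0.matchesSearchTerm(term) }
        }
    }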

    But there are a few problems with this approach. The controller has become blind to how the search is actually performed, which isn’t what I wanted at all. The idea was that the controller would perform the search logic for me so I didn’t have to continuously rewrite it, and now I’m rewriting it for every searchable object in my project. If I need the search to be customizable, where the user selects which fields to search, or options like case sensitivity or starts with/contains matching, those options now need to be passed down into each Searchable object, and then logic written in each object to deal with them. Reusable components were supposed to make my code easier to write, and this sounds worse, not better.

    Maybe I could try and flip this around. Instead of having extensions for my objects, I can have a search controller object that I subclass, and fill in with details about my objects. But I’d still have the same problem. I’m writing a lot of search logic all over again, when the point is I want to reuse my search logic between applications.

    (If you’ve used NSPredicate, you probably know where this is going.)

    Function Pointers

    Alright, so clearly we were trying to implement this in a naive way. We can do multiple searchable fields. When the search controller calls into our Searchable object, we’ll provide back a map of function pointers to searchable values, associated with a name for each field. This way all the logic stays in the controller. It just has to call the function pointers to get values, decide if the object meets the search criteria, and then either keep or discard it. Easy. And we are getting closer to a workable solution, but now we have a few new problems.
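
    Roughly sketched (searchableValues is a made-up name), the idea looks like this:

    import Foundation

    class Employee {
        var firstName = ""
        var lastName = ""

        // Hand the controller named accessors instead of search logic.
        func searchableValues() -> [String: () -> String] {
            return [
                "firstName": { self.firstName },
                "lastName": { self.lastName }
            ]
        }
    }

    // All of the matching logic now lives with the controller.
    func objectMatches(object: Employee, term: String) -> Bool {
        return object.searchableValues().values.contains { accessor in
            accessor().localizedCaseInsensitiveContainsString(term)
        }
    }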

    Live search throws a wrench into this whole scheme. Not only do we need a way to know if an object meets the search criteria, we also need a way of knowing when an object’s properties have changed in a way that could change its inclusion in our results. This is especially important if I have multiple views. Maybe I have a form and a graph open for my model objects in different windows. If I change an entry in the form, I’d want the graph to live update. And the form view and the graph view might have no knowledge of each other. So we need a way to call back to an interested observer of the object when a value changes. We could use a timer to check every second or so for changed values, but in some scenarios that could be a very expensive and needless operation. So while that would work, performance and battery life would significantly suffer. And it’s more code we don’t want to write.

    There’s also the issue of nested values. Maybe what I’m searching are objects that represent employees, but now I also want to search on the name of the department employees belong to. In my object graph, it’s very likely that departments will be another model object type that has a relationship with employee objects. So now I’m looking at returning function pointers not just to my employee objects, but to the department objects they belong to. And now I need to communicate changes not only in my object’s own values, but in its relationships to other objects.

    Also, there is the small issue of this approach not working with properties. As far as I know, you can’t create a function pointer to a property. So now I need to wrap all my properties in functions.

    This is getting complicated again. Once again I’m writing a lot of code, and not saving any time at all. There has got to be a better way.

    Key Value Coding

    Well fortunately after years of going through this same mess in other languages, Apple came up with Key Value Coding as a solution.

    Key Value Coding is extremely simple: it’s a protocol that allows any Obj-C object to be accessed like a dictionary. Its properties (or getter and setter functions) can be referred to by using their names as keys. All NSObject subclasses have the following functions:

    func valueForKey(_ key: String) -> AnyObject?
    func setValue(_ value: AnyObject?, forKey key: String)

    (Reference to the entire protocol, which contains some other interesting functions, is here.)

    Now my search controller is easy. I can simply tell the search controller all the possible searchable properties like so:

    class Employee: NSObject {
        // dynamic exposes these properties to the Obj-C runtime,
        // which is what lets KVC (and later KVO) find them by name.
        dynamic var lastName: String?
        dynamic var firstName: String?
        dynamic var title: String?
        dynamic var department: Department?
    }

    searchController.objects = SomeArrayOfEmployees
    searchController.searchKeys = ["firstName", "lastName", "title"]

    Now I can have a generalized search controller, one that I can share between projects or provide as a framework to other developers, that doesn’t have to know anything about the Employee object ahead of time. I can describe the shape of an object using its string key names. Under the hood, my search controller can call valueForKey, passing the keys as arguments, and the object will dynamically return the values of its properties.
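
    Internally, the controller’s filter can now be completely generic. Something like this sketch (assuming string-valued properties for simplicity):

    import Foundation

    func filteredObjects(objects: [NSObject], keys: [String], term: String) -> [NSObject] {
        return objects.filter { object in
            // Ask each object for its values by name; no knowledge
            // of the concrete type is needed.
            keys.contains { key in
                guard let value = object.valueForKey(key) as? String else { return false }
                return value.localizedCaseInsensitiveContainsString(term)
            }
        }
    }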

    Another great example of the advantages of keys is NSPredicate. NSPredicate lets you write a SQL-like query against your objects, which is harder to do without being able to refer to your object’s fields by name.
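
    For example, assuming the employees array from the snippet above:

    // BEGINSWITH[cd] means case and diacritic insensitive prefix matching.
    let predicate = NSPredicate(format: "lastName BEGINSWITH[cd] %@", "smi")
    let matches = (employees as NSArray).filteredArrayUsingPredicate(predicate)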

    There is a catch. If you’re a strong static typing proponent, you’ll notice that none of this is statically typed. I’m able to lie about what keys an object has, as there is no way to enforce that a key name I’m giving as a string actually exists on the object beforehand. I don’t even know what the return type will be; valueForKey returns AnyObject.

    Quite simply, I don’t think static typing helps this use case. I think it hurts it. I don’t see a way to make this concept workable without dropping static typing, and I think that’s ok. Dynamic typing came about because of scenarios like this. It’s ok to use dynamic typing where it works better. And all isn’t lost. When our search controller ingests this data, if it’s a Swift object, it will have to transition these objects back into a type checked environment. So even though static typing can’t cover this whole use case, it improves the reliability of using Key Value Coding by validating that the values for keys are actually the type we assumed they would be.

    Key Value Paths

    There are a few problems KVC hasn’t solved yet. One is the object graph problem that was talked about above. What if we want to search the name of an employee’s department? Fortunately KVC solves this for us! Keys don’t just have to be one level deep, they can be entire paths!

    The KVC protocol defines the following function:

    func valueForKeyPath(_ keyPath: String) -> AnyObject?

    where keyPath is a key path of the form relationship.property (with one or more relationships); for example “department.name” or “department.manager.lastName”.

    Hey look, that’s uhhhh, exactly our demo scenario.

    So now I can traverse several levels deep into my object graph. I can tell my search controller, after some modification, to use a key path of “department.name” on my employee objects.

    searchController.objects = SomeArrayOfEmployees
    searchController.searchKeyPaths = ["firstName", "lastName", "title", "department.name"]

    Now internally, instead of calling valueForKey, my search controller just needs to call valueForKeyPath. I can use single level deep paths with valueForKeyPath with no issue, so my existing keys will work.

    Notice that valueForKey and valueForKeyPath are functions called on your object. I’m not going to do a deep dive right now, but you could use them to implement fully dynamic values for your keys and key paths. Apple’s implementation inspects your object and looks for a property or function whose name matches the key, but there is no reason you can’t override the same function and do your own lookup on the fly. That’s useful if your object is abstracting JSON or perhaps a SQL row.
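
    Here’s a sketch of that kind of override, backing keys with a JSON dictionary (JSONBackedObject is a made-up name, not an Apple class):

    import Foundation

    class JSONBackedObject: NSObject {
        var json: [String: AnyObject] = [:]

        override func valueForKey(key: String) -> AnyObject? {
            // Answer from the JSON payload first, then fall back to
            // the normal property lookup.
            if let value = json[key] {
                return value
            }
            return super.valueForKey(key)
        }
    }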

    It’s also important that this works on any NSObject. I can insert placeholder NSDictionary objects for temporary data right alongside my actual employee objects, and the same search logic will work across them. As long as the object has lastName, firstName, title, and department values, the object type no longer matters.
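
    For example, NSDictionary already answers valueForKey with its stored values, so a plain dictionary can stand in for an Employee:

    let placeholder: NSDictionary = ["firstName": "Jane", "lastName": "Doe",
                                     "title": "Manager"]
    placeholder.valueForKey("lastName") // "Doe"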

    Key Value Observing

    Well, all that’s great, but we still have one more issue: we need to know when values change. Enter Key Value Observing. Key Value Observing is simple: any time a property is set, or a setter function is called, a notification is automatically dispatched to all interested objects. An object can register interest in changes to a key’s value with the following function:

    func addObserver(_ anObserver: NSObject,
          forKeyPath keyPath: String,
             options: NSKeyValueObservingOptions,
             context: UnsafeMutablePointer<Void>)

    (It’s worth checking out the other functions. They can give you finer control over sending change notifications. Also lookup the documentation for specifics on the change callback.)

    Notice that the function takes a key path. An employee’s department name will not only change if their department’s name changes, but also if their department relationship changes. This covers both cases by observing any change to any object within the “department.name” path.

    It’s also worth checking out the options. We can have the change callback provide both the new and old value, or even the inserted rows and removed rows of an array. Not only is this a great tool for observing changes in objects that our class doesn’t have deep knowledge of, but it’s just great in general. This sort of observing is really handy for controlling add/remove animations in collection views or table views.

    In our search controller, we just need to observe all the keys we are searching, in all the objects we are given, and then we can recalculate the search on an object by object basis. There are no timers running in the background; the recalculation fires directly from an object’s value being set.
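
    Roughly, the observing half of the search controller might look like this (recalculateSearchForObject is a hypothetical hook, and observer removal is omitted for brevity):

    import Foundation

    private var searchContext = 0

    class SearchController: NSObject {
        var objects: [NSObject] = []
        var searchKeyPaths: [String] = []

        func startObserving() {
            for object in objects {
                for keyPath in searchKeyPaths {
                    object.addObserver(self, forKeyPath: keyPath,
                                       options: [.New, .Old],
                                       context: &searchContext)
                }
            }
        }

        override func observeValueForKeyPath(keyPath: String?,
                                             ofObject object: AnyObject?,
                                             change: [String: AnyObject]?,
                                             context: UnsafeMutablePointer<Void>) {
            guard context == &searchContext else {
                super.observeValueForKeyPath(keyPath, ofObject: object,
                                             change: change, context: context)
                return
            }
            // Re-evaluate only the object that changed; nothing polls.
            recalculateSearchForObject(object)
        }

        func recalculateSearchForObject(object: AnyObject?) {
            // Re-run the match for this one object and update results.
        }
    }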

    So what’s the problem in Swift?

    I’ve mentioned one problem already: only classes that subclass NSObject can provide KVO/KVC support. Before Swift, that wasn’t a major problem. Now with Swift, we have non-NSObject classes, and non-class types. Structs can’t support KVO/KVC in any fashion.

    The properties and functions being observed also have to be dynamic. Again, not a problem in Obj-C, where all functions and properties are dynamic. But not only are Swift functions not dynamic by default, some Swift types aren’t supported by dynamic properties at all. Want to observe a Swift enum property? Can’t do that.
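
    For example, under the current rules:

    import Foundation

    enum Status {
        case Active, Inactive
    }

    class Employee: NSObject {
        // Fine: String bridges to Obj-C, so the property can be
        // dynamic and therefore observable.
        dynamic var title: String?

        // Won't compile: a Swift enum with no Obj-C representation
        // can't be marked dynamic, so it can't be observed.
        // dynamic var status: Status = .Active
    }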

    Even more worrisome, the open source Swift distribution could possibly not include any dynamic support, and KVO/KVC are defined as part of the Cocoa frameworks, which aren’t likely to be included with open source Swift. Any code that wants to target cross platform Swift might be forced to avoid KVO/KVC support. Ironically, just as we could be entering a golden age of framework availability with Swift, we might be discarding the technology which makes all those frameworks play cleanly with each other.

    So what would I like to see from Swift?

    • Include KVO/KVC functionality as part of core Swift: The current KVO/KVC are defined as part of Foundation. They don’t need to be moved, but Swift needs an equivalent that can bridge, and is cross platform.
    • Have more dynamic functionality on by default: Another issue is that dynamic functionality is currently opt in. This is for a good reason: things like method swizzling won’t work with Swift’s statically dispatched functions. But Apple could split the difference: allow statically dispatched functions (and properties) to at least be looked up dynamically. This would allow functionality like KVO and KVC to work without giving up direct function calls or opening back up to method swizzling.
    • Have the KVC/KVO replacement work with structs and Swift types: Simple. Enums in Swift are great. Now I just want to access them with KVC and observe them.

  • WWDC Quick Thoughts

    My initial takes on WWDC announcements… (more…)

  • My Wish For Swift At WWDC: C++ Support

    At work, we support a lot of platforms. We support iOS and Android, Windows, Linux, supermarket checkout scanners, Raspberry Pis, old Windows CE devices, and more. And all the devices run our same (large) core code, and all that code is written in C++. I’m not the biggest fan of C++. But there’s no doubt when we need to write something that works across a range of platforms, it’s a rich, commonly understood tool. It’s also been a massive blocker for Swift adoption for us.

    For our mobile customers, we do provide both Java and Obj-C APIs. They’re both just wrappers around our C++ core, and they do the conversion from the Obj-C or Java native formats into the raw buffers we need to handle in our C++ core. Whenever I look at doing a native Swift SDK in the future, I’m still stuck on the lack of native C++ support from Swift code. In order to provide a pure native Swift API, I’d have to wrap our ever growing C++ source base once in Obj-C, and then wrap it again in Swift. It just doesn’t make sense to wrap the same code twice over. (more…)

  • WWDC OS X Wish List

    As WWDC fast approaches, and rumors of the next OS X update focusing on polish persist, I thought I’d go over my wish list for what I’d like to see Apple address.

    Vulkan/Enhanced Graphics Support

    The graphics situation on OS X is bad. 3D graphics on OS X have been rocky from the beginning (the first ever public OpenGL demo on OS X, which sadly I cannot find video of), but Apple in the past was willing to participate in a benchmark war with DirectX. At WWDC 2006, I remember Apple was still rolling out new features to try and compete with DirectX’s performance. And with 10.6 Apple introduced OpenCL, which went on to become an industry standard for GPU based computation.

    But recently, while Linux and Windows have had continuous cycles of improvement for 3D graphics, OS X has been seeing improvements in fits and starts. Mavericks, which moved to OpenGL 4, looked like it might be a recommitment to improving 3D graphics on OS X. But Yosemite didn’t seem to bring any real improvement to OpenGL at a time when OpenGL on the platform is already seriously lagging.

    There is some speculation that Apple will bring Metal to Mac OS X, which from what I’ve heard seems like a strong possibility. Metal on Mac OS X might provide a direct route to high performance 3D and 2D graphics, but Apple might have a long road ahead in convincing developers to adopt Metal, and GPU manufacturers to write drivers for it. The OpenGL driver quality on OS X is already so hit and miss that it’s hard to believe Apple could get AMD and Nvidia to write good drivers for Apple’s proprietary Metal platform. I have no doubt that Apple would be able to get a few software vendors on stage to demo Metal support on the Mac (Epic seems like a good possibility), but there are several third party vendors (like Adobe or Valve) that I don’t see being eager to support another standard. I could see Adobe dragging their feet for a long time on supporting Metal, or possibly never supporting it at all.

    Recently the next version of OpenGL, called Vulkan, was announced, and I think this would be a great chance for Apple to really improve cross platform graphics on OS X. Vulkan already has strong support from Valve (which is huge for a developer that used to be strongly in the DirectX camp), and is more likely to draw support from Adobe. The word is that Apple is still on the fence about supporting Vulkan on the Mac. Vulkan supports the same capabilities as Metal, which puts it in an awkward position on OS X. But there will be a lot of smaller developers that won’t be able to put forth the effort to port to Metal, especially in the professional software community, so it would be foolish not to support Vulkan. Vulkan also promises a simplified driver architecture, which could be a boon for the traditionally complex development of OpenGL drivers on Mac OS X. I’d even suggest that maybe Apple should be spinning down development of Metal and really focusing on Vulkan instead, but I don’t think the politics at Apple would allow for that.

    At the very least, seeing support for OpenGL 4.5 in Mac OS 10.11 would be ideal, but it would be great to see Vulkan support. Metal would be an improvement as well, but I worry it would further alienate some developers of both professional software and games if only Metal were present as a modern API.

    A Modern Window Server

    (I don’t have visibility into the OS X window server, so this section is based on speculation on how Apple has implemented some parts of Yosemite. Please let me know if there are any corrections to be made.)

    OS X’s window server is another component that had a very impressive start, looked to have a very impressive future, and then suddenly seemed to have public facing development stop while competing platforms passed it by.

    The very first version of the Mac OS X window server did every bit of its drawing on the CPU. That made sense at the time. Great dedicated GPUs were still not common on the hardware OS X supported, and the required OpenGL support may not have been present either. But as a result, basic things like dragging windows across the screen were choppy on the first versions of Mac OS X.

    In 10.2, Apple added something called Quartz Extreme. While everything inside of a window was still drawn on the CPU under Quartz Extreme, the contents of the windows themselves were stored on the GPU, meaning operations like moving a window around the screen became very quick. There was a further project called Quartz 2D Extreme (later QuartzGL) which would have drawn the contents of the windows themselves with the GPU. This project was comparable to Windows Vista’s Desktop Window Manager, which promised similar features to unlock fancy effects such as windows with glass like transparency. While the new compositor was the source of a lot of Vista’s early performance issues, high requirements, and consumer confusion (Microsoft had two tiers of Vista certification for hardware, and one was not compatible with the new effects), Microsoft’s investment in a fully GPU accelerated window server eventually paid off with a fast and responsive user interface that is capable of rendering advanced effects with ease.

    Apple’s QuartzGL project eventually fizzled out due to performance issues (some interesting benchmarks are still posted here). With 10.11, it might be time for Apple to take a second look at extending QuartzGL further. Microsoft has pushed through their performance issues, and many of the visual effects Apple is trying to do in Yosemite could be sped up by GPU acceleration. The “glass” vibrancy effect OS X uses doesn’t seem to be GPU accelerated, which might be leading to some performance issues (I sometimes see vibrancy backed views “lag” behind their window being drawn, which leads me to believe the vibrancy effect isn’t drawn as part of window compositing on the GPU). GPUs are really good at these sorts of effects, and it would be great to have a window server that could do things like flag a portion of a window to the GPU as transparent, and then attach a shader to that portion of the window. As with Windows, not every element on the display would have to be drawn on the GPU. Certain views could opt in to GPU backed drawing (such as the vibrancy view), which would help the performance of views that could benefit, without hurting the performance of other views.

    My understanding is that QuartzGL is still present in some form in Yosemite, and that applications can opt in to it, but it would be great to see QuartzGL taken a few steps further to allow the advanced OS X vibrancy effects to be GPU backed.

    Networking and WebKit Enhancements

    Yosemite’s networking woes have been well documented at this point, but I’ve also noticed a decline in Safari’s quality. HTML5 video in web views in particular doesn’t perform right: the controls always seem to be in the wrong place, the video is scaled incorrectly, and sometimes it doesn’t play at all. I’m having to open Chrome more and more often these days, and I really wish I didn’t have to.

    Pro Hardware

    As a follow up to my post about the Mac Pro, I had one more thing that I wouldn’t mind seeing at WWDC, but it’s really a long shot. If Apple doesn’t really care to continue maintaining their pro hardware, they should think about licensing OS X. It doesn’t have to be a licensing free-for-all; maybe they could license only to certain HP product lines. And Apple hasn’t been against licensing in the past. Apple’s failures in markets like the server market weren’t necessarily because OS X made a poor choice for servers, but because the company didn’t seem willing to provide the support services required for that market. Apple’s pro hardware may be in a similar place.

    If Apple really comes out supporting pros over the next year in a big way, I’ll gladly eat my words. But I’m becoming more and more convinced that licensing OS X would be the best course of action: it would let Apple keep users who are happily buying Apple software (and who would be paying Apple licensing fees), rather than having them leave the platform entirely. If Apple isn’t willing to dirty themselves with towers that can be opened and customized, or 1U rack mount servers, they could let someone else serve those markets for them.

    End of Sandboxing For the Mac App Store

    The Mac App Store is slowly dying, and sandboxing is a big reason. Apple either needs to fix sandboxing, or remove it as a requirement. This is again a place where Windows 10 is pulling ahead of Apple.

    Real Exchange Support on OS X

    Microsoft Outlook, especially the new beta version, provides a good Exchange client on Mac OS X. But I’d love to see the built in support enhanced. Mail.app still doesn’t support push email. The Exchange client in iCal still sucks (better meeting availability assistant, please.)

    Conclusion

    A lot of the suggestions I’m making are not user facing, but are instead foundational. OS X hasn’t had many foundational enhancements in almost 10 years (not counting security oriented things like Gatekeeper). Microsoft has put a lot of work into their foundations, at a cost in time and money that Apple has so far been unwilling to commit. My worry is that if Apple doesn’t concentrate on the less glamorous aspects of OS X, like graphics performance, the user experience will continue to decline. Apple is simply stacking too many new features on an aging foundation. Beyond stabilizing the features they already have, Apple really needs to look at building a foundation for the next 10 years of the Mac.

  • Where is the no compromise Mac Pro?

    I’ve been looking at replacing my 2008 Mac Pro, and as much as I’d like to replace it with a new Mac Pro, the Mac Pro just looks so odd when you compare it to both Windows products and other Macs.

    The most visible missing feature that has been widely commented on is the lack of an Apple 5k display. I don’t really want to buy a Mac Pro at the present time because I’m pretty sure it’s going to get outdated by Thunderbolt 3, and I’m not sure what the 5k support picture will look like for the current Mac Pro in the future. (more…)

  • Pushing Updates and User Expectation

    In the debate over Apple quality, OS X Snow Leopard is usually cited as the high mark in recent OS X quality. Whether or not you believe that Snow Leopard had fewer bugs than any other release (I tend to), another thing that has changed since Snow Leopard is how Apple distributes Mac OS X. (more…)