Chuq Von Rospach is a Silicon Valley veteran working in Technical Community Management and an amateur photographer with a strong interest in birds, wildlife and landscapes. My goal is to explore the Western states and tell you the stories of the special places I've found. You can find out more on the About Page.
Category Archives: Computers and Technology
I’m thrilled to announce that I’ve launched a project I’ve been working on for the last couple of months.
For Your Consideration
Things to Ponder and Enjoy
For Your Consideration is my attempt to re-think how we interact with information on the Internet. The net has long encouraged habits I consider negative: the belief that you had to post frequently to keep your audience entertained has led to a mentality of often and fast instead of thoughtful and good. Financial models were built around generating lots of pageviews and throwing advertising in the reader's face, so revenue generation was in conflict with creating longer, better-researched, and more thoughtful content.
Curation has been a go-to word around the Internet for a couple of years, and I've long been fascinated with the idea, but far too often curation has been turned into "let me post links to stuff I found" without thought or context. In many cases, these attempts at curation have added to the noise rather than reduced it.
I've been experimenting in small ways with sharing things so as to create a high-signal, low-noise environment. None of the experiments I tried ever felt like the solution I was looking for, but they did help me understand what was possible and what might work.
For Your Consideration is my attempt at implementing what I think is a way to share in that high-signal environment. We'll find out together how well it works.
The Goal of For Your Consideration
My goal for For Your Consideration is to slow down, focus on good and interesting things, and give them context. Good food is better than fast food. Good links are better than fast links. If you want fast, there are plenty of places that'll give it to you. Good things will still be good long after the fast things are forgotten in favor of the next one.
I want For Your Consideration to be interesting but not controversial. It's going to have an opinion, but I want to avoid having an attitude. I want to find things you'll find interesting, and give them to you in a way that lets you look at them without being in a hurry to get to the next thing.
For Your Consideration is one posting per day, seven days a week. One item per day. This restriction gives me the time to do the research and put the item in context where necessary. It means you don't have to worry about being overloaded by so many things that you have to skim past them. I think it's important that both the publisher and the reader break the "gotta get through it all" mentality that's dominated the internet mindset for the last few years.
By restricting myself to one item a day, I have to make choices. That forces me to become a filter and choose the best content for the audience. By limiting it to one item a day, I allow you the time to be able to find out why I felt it was worth your time. I hope that by slowing down, we may see fewer things, but those things we do see will mean more and impact our lives more.
Most days will be one link to one thing. Occasionally I'll post a longer piece with multiple items on a single topic, or a review of something I think you'll be interested in. I don't promise to never be topical. I don't promise not to have an opinion. I do promise that the primary goal is to post things that are interesting, not to promote any ideology.
You can read more about how I plan to share content — and what content won’t be shared here — in the FYC Manifesto.
Getting involved with For Your Consideration
You can subscribe to For Your Consideration in various ways, and you should choose the one most convenient for you.
- Items posted to For Your Consideration will be posted to my twitter feed.
- For Your Consideration has an RSS feed you can use in any system that reads RSS, like Feedly or Prismatic.
- It is available via email if you subscribe to our mailing list (managed by Mailchimp)
- And you can always just wander by and read the web site at http://fyc.chuqui.com.
To help make For Your Consideration successful I’ll need a little help. Here’s how you can help:
Help me get the word out. Tell your friends about it. Encourage people to try it and follow FYC.
When you see interesting content on FYC, share it with your friends.
If you run across something you think should be published on FYC, submit it to me. You can do that by emailing the information to email@example.com, or by sending it along to the FYC Raw Feed on Twitter.
The For Your Consideration Raw Feed on Twitter
One of the goals of FYC is to limit the number of items published on the site, which forces me to make choices about which items are the most interesting. Some of you will probably want to at least skim all of the candidates, whether or not they make the cut for publication. For those who do, I've set up a special twitter feed at https://twitter.com/CollatingLife. That is, literally, the in-box for FYC: everything I find that I might decide to turn into an item on FYC gets posted there, so you can monitor the inbox as well. I seem to be posting 1 to 7 items a day on average, so even that feed isn't going to be overwhelming.
I think it would be an interesting experiment to turn the selection of items for FYC into a crowdsourced operation with commentary selected from the community at large, but that will have to wait for a future generation of the site.
Defining Success with For Your Consideration
One thing I think is crucial when you launch something like this is to have some way to judge whether it’s succeeding or not. If it isn’t, you need to either improve it or shut it down. If it is, you look for ways to invest in it and make it better. You can’t do that if you don’t know what success means.
To me, success is going to be defined not by how many people subscribe and follow the site, but by how often you feel the material is worth sharing with your friends and contacts. The better job I do at curation, the more often the content should be interesting enough to pass along. I'd love your feedback on what kind of content we should do more of and what kind is less interesting, but my primary way of determining that is going to be how often things get passed around and shared.
And the cost?
It's free. I plan on this always being a free service. There are a couple of Amazon affiliate ads on the site. I'd much appreciate it if, once in a while, you decide to buy something through them so Amazon pays me some affiliate fees, but only if you feel the service is worth it. The nice thing about the Amazon affiliate program is that it costs you nothing, so it's a tip jar that's free to you: 100% of what you pay is spent on the item you're getting, and I get a couple of percent from Amazon for brokering the sale. My goal is simple: I'd like this site to pay its bills. Anything beyond that is gravy.
The Amazon affiliate model is, to me, the least painful and least intrusive advertising model I can use. Nothing is ever going to pop up over the content, pop under the content, lock you out of the content, or annoy the crap out of you to donate before letting you see the content. I’d like to think that lack of annoyance would be worth an occasional dollar in the tip jar, and for now, Amazon is where the tip jar lives…
Welcome! I hope you find this site interesting. If you do, subscribe and tell your friends. If you don't… tell me, so I can improve it. Over the next few weeks I'll write some pieces on the research and thinking that went into the site and why I made some of the decisions I made. Hopefully you'll find that interesting as well.
I think computer users can be broken down into three camps:
- Computer users who haven’t had a hard disk fail and haven’t yet figured out they need to back up their systems.
- Computer users who have had a disk fail but still don’t back up their systems reliably (or at all), even though they know they should.
- Grouchy old computer geeks who yell at the first two groups because we’re the ones who get that call at 10PM because a disk failed and they need a file back because they’re on deadline and oh my god please help me I don’t have a backup what do I do?
I warn you up front, I am one of that last group. My goal is to convince you to start backing up your computer before it’s too late, because I want those late night on deadline oh my god I’m doomed please help me phone calls to stop. Even though I know it’ll never happen in my lifetime.
A hard drive is a spinning mechanical device with motors and magnets and bearings and a read-write head that flies a tiny fraction of a millimeter above the surface of the platter where the data is stored. It is inevitable that this device will fail. Not IF, but WHEN. Newer computers use SSDs, which are solid-state devices instead of spinning mechanical ones, but they, too, fail.
That's the reality: whatever you store your data on is going to fail some day. If you don't plan for that, bad things will happen. And when bad things happen, you call your geek friend late at night, blubbering and crying and asking for help. Neither of us wants that.
You can’t prevent the failure, but you can reduce the chances of it happening, and you can back up your data so that if a disk fails, it’s not a big deal, because that data also exists on another hard disk. Or two. Or three. The more the merrier.
This article will help you understand how to reduce the chance of that failure and to limit the pain and damage when it happens.
The Best Backup is Never Needing your Backup
The best and most reliable backup is never needing to recover data from your backup. You can never guarantee that a drive will never fail — but you can reduce the chances of it happening.
How? Simple: replace your drives before they fail. Backblaze is a company that backs up your data over the internet to their servers. They have lots of data on lots (and lots) of hard drives, and it's their job to make sure that data is never missing. They've got lots of experience with how and when hard drives fail, and they've been nice enough to publish the data. If you're interested in the details, read their study. The executive summary is that after a hard drive is three years old, the failure rate starts to rise rapidly. So the first thing you can do to reduce the chance of a hard drive failing on you is to retire it and replace it with a new one before it gets to be four years old.
I take this one step further: if you have a laptop that you carry around, that laptop tends to get bounced and jostled. Inside that laptop is a hard drive, which is also getting jostled and bounced around. My experience is that laptop hard drives have a tendency to die younger than hard drives in machines that don’t move around, so if you have a laptop, you really want to replace that hard drive earlier.
My hard drive policy is simple:
- Any hard drive I use as a working drive (attached to a computer and powered up for use on a daily basis) is replaced when it is between two and three years old.
- Any hard drive installed inside a laptop is replaced earlier: between 18 months and two years.
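To make the policy concrete, here's a minimal Python sketch of the rotation rules above. The role names and exact month thresholds are my own reading of the text (I picked the upper ends of the stated ranges), not a prescription:

```python
def drive_action(role, age_months):
    """Sketch of the rotation policy: working and laptop drives get
    demoted to backup duty at the end of their range; backup drives
    are retired around four years old."""
    if role == "backup":
        return "retire" if age_months >= 48 else "keep"
    # Laptop drives get bounced around, so they rotate out earlier.
    limit = 24 if role == "laptop" else 36
    return "demote to backup" if age_months >= limit else "keep"
```

The point of encoding it at all is that the decision is purely age-and-role driven: no guessing, no waiting for the drive to start clicking.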
That doesn't mean their useful life is over: the drives I use as my day-to-day drives get turned into backup drives (unless they're too small). They're used as backups until they're around four years old, and then they're retired.
Backup drives tend to be powered off a lot more, their usage is much lower, and you don’t put them under stress. That reduced stress means they’re less likely to fail. You use a drive hard when it’s new, give it a reduced role as it ages, and retire it before it hits that point in time where failure becomes likely.
If you do that, you will rarely have a drive fail on you. It costs a little money, but the cost of a new laptop drive these days is under $100, so it’s not that expensive. It’s a lot less expensive than the time and stress of recovering from a failure, that’s for sure.
A note on SSDs: As SSD (solid state drives, with no moving parts) mature, they’re rapidly replacing spinning drives for data storage. The failure tendencies of SSDs are a lot different than for hard disks, and it can be much different from one manufacturer to another. So what should you do about replacing aging SSDs? I don’t know yet. My current (tentative) plan is to let the SSD in my laptop go for three years and then replace and retire it rather than make it a backup drive, but that’s subject to change once I do more research. I still think the 3 years and out concept works for them, but I don’t think you need to be as aggressive moving them out of a high use mode.
A note on Hybrid Drives: Apple and some other companies are shipping computers with what they call a hybrid drive, which is both a hard disk and an SSD merged together. My view right now is that you treat them like hard drives and replace them like one, but I haven’t looked into the real-world failure tendencies of them yet.
Setting up backups
Even if you never have a hard drive fail, you still need backups. There are many ways for your data to disappear other than a drive failure: your house or office could burn down. Your computer could fail and scribble Shakespeare’s Sonnets all over your disks and data. You could be sitting in Starbucks and watch as someone grabs your laptop and runs out the door. You could drop your laptop (yes, I know, that never happens, right?). There are many bad things that can happen to your data.
The only way to protect yourself from these bad things is to keep multiple copies of your data. And since a house fire may destroy everything inside the house, not just your computer, you need to keep those copies in multiple places. This can turn into a hassle quickly, and one reality of backups is that the more hassle they are to do, the less likely you are to do them. So we need to keep creating and managing backups as simple as possible (but no simpler than is useful).
The basic goal of your backups is therefore to have at least three copies of your data, and have those copies exist in two independent locations.
My basic setup: back up data to a separate disk on a regular basis, and then swap that drive with one at an offsite location once a month. This gives you three copies of your data: on your computer, on your backup drive, and on your offsite drive. It minimizes cost, because you only need two backup drives that you swap. It limits the hassle factor, because as long as your backups run automatically, you only need to intervene once a month to swap drives and take the updated one off-site.
One of the tradeoffs: not all of your data will be in all three places; your newest data won't get offsite until you swap disks at the end of the month. Remember, though, that the offsite backup is there to recover from catastrophic disasters (house burned down! oops!); trading a little data loss in that situation for the reduced hassle of not constantly swapping that drive is a reasonable compromise. In reality, you are unlikely to ever need that catastrophic backup. But if you do, you'll be glad it's there.
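The three-copies goal is easy to sanity-check mechanically. Here's a small hypothetical Python sketch (the location names and file names are made up for illustration): given an inventory of which files each location holds, flag anything that falls short:

```python
from collections import Counter

def under_replicated(locations, minimum=3):
    """locations maps a location name ("laptop", "backup", "offsite")
    to the set of files it holds; return the files that exist in
    fewer than `minimum` locations."""
    counts = Counter()
    for files in locations.values():
        counts.update(files)
    return {name for name, n in counts.items() if n < minimum}
```

Anything this flags is exactly the data you'd lose if the wrong disk died at the wrong time.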
That said, it never hurts to have more copies of your data. You can do this in a number of ways. Using an online backup service is one: our friends at Backblaze, for instance, or Crashplan, and there are other companies doing this as well. The downside is that these services use your internet connection, and that connection can be slow; if you have a lot of data, it can take a long time to upload it to the remote backup server, and if your disk fails before the upload finishes, you're hosed. That's one reason I like to use these services as a supplemental backup and not a primary one.
Some ISPs put data caps on your internet connection. If yours does, doing an online backup could push you past the cap, and you may find your connection throttled to a really slow speed, or turned off completely. Before you go online, you need to understand how big the data set you want to back up is, how long it will take to upload, how long it might take to recover if you need to, and whether you have a data cap to worry about. I generally recommend that people consider using these online services to back up their important data, but not everything.
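That arithmetic is worth doing before you sign up. A rough sketch, assuming your ISP advertises upload speed in megabits per second and you measure data in decimal gigabytes:

```python
def upload_days(dataset_gb, upload_mbps):
    """Rough days to push a full backup through a home uplink,
    assuming the link runs flat out, 24 hours a day.
    1 GB = 8000 megabits in decimal units, as ISPs advertise."""
    megabits = dataset_gb * 8000
    return megabits / upload_mbps / 86400
```

For example, a 1.7-terabyte photo library over a 5 Mbps uplink is about a month of continuous uploading, which would also blow through most monthly data caps along the way.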
Another online option is services like Dropbox, Box, or Google Drive. These services turn a part of your hard drive into a virtual folder that gets copied onto their servers, and then copied down to any other computer you set up to share that virtual folder. This can be quite useful if you use multiple computers at different times, but it can also act as a kind of backup, because the data gets copied to multiple places. It's not something you should use as your primary backup, and like the other online backup services, slow network connections and data caps may limit its usefulness.
These are all ways to create multiple copies of your important data in relatively painless ways that you don’t need to spend time managing.
How to back up your data
This section assumes you’re using a Macintosh. If you don’t, there are other equivalent tools you can use to back up your computer, but I’m not the person to tell you which one to use.
Backing up a Macintosh can actually be very simple: use Time Machine. For a lot of people, this will work quite well, and it's free with all copies of Mac OS X. I use Time Machine for part of my backup system because I like its incremental backups: you can go back and find a file and its state at a given time.
Time Machine's big weakness is large data sets. Because it's doing incremental backups, it is going to want a backup drive larger than the amount of data you have created. I've found that it works best when the backup drive is at least 2X the data being backed up, and I prefer 3X. This means if you have, say, a 500GB boot drive in a laptop and a FireWire drive with 1.2 terabytes on it, your total data set is 1.7 terabytes. Time Machine is going to struggle keeping that backed up on a 2 terabyte drive, so you really need 3TB for your backup at a minimum. If you update large parts of your stored data, you can really give it indigestion (for instance: take 1000 photos in Adobe Lightroom, assign a new keyword to each, and make sure the updated metadata is flushed to the DNGs as embedded XMP. You just created a 60-70 gigabyte backup). The larger the data set, the larger the disk Time Machine needs to back it up and work efficiently, and as your data set continues to grow, this is going to be a challenge.
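The 2X-to-3X rule of thumb is simple enough to capture in a couple of lines; treat the numbers as my reading of the paragraph above (a personal rule of thumb), not an Apple recommendation:

```python
def tm_sizing(data_gb, churn_gb_per_snapshot, low=2, high=3):
    """Return (minimum, preferred) Time Machine drive sizes for a
    data set, per the 2X/3X rule of thumb, plus how many churn-heavy
    snapshots the preferred size leaves headroom for."""
    minimum, preferred = data_gb * low, data_gb * high
    # Headroom beyond one full copy, divided by the per-snapshot churn
    # (e.g. a big Lightroom metadata flush), gives a rough snapshot budget.
    slack_snapshots = (preferred - data_gb) // churn_gb_per_snapshot
    return minimum, preferred, slack_snapshots
```

Feed it the 1.7TB data set and the ~65GB Lightroom-rekeywording example and you can see why a drive at the small end of the range fills up fast.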
I am not a big fan of Time Machine for recovering a failed disk. I've done it; sometimes it works fine, and sometimes it has fought me and taken forever to get the data restored. Apple's done a lot of work improving Time Machine since the early days of Mac OS X, so a lot of my reservations no longer apply if you're running Snow Leopard or later, but I still prefer to have a way to recover an entire disk as well.
For that I use SuperDuper. This tool makes an exact clone of a disk, one that you can plug into a computer and use without any work; you can even boot the computer from it. I use it to make bootable copies of my computers' main drives, so if I lose one, I can clone a copy quickly, or just boot from the backup drive and get back to work. And it creates another copy of my data for me (never a bad thing).
Do you need this? How badly do you want to protect your data? How quickly do you want to recover from a drive failure and get back to work? How many hard drives are you willing to buy and manage? If your data is really worth the effort, it’s a good way to create a reliable and quick-to-recover copy of it — but it does entail more time, energy and money. Whether it’s worth it to you is a decision you’ll have to make. It’s worth it to me.
I am not a fan of Apple's Time Capsule for backups. It's very simple, but offsite backups are effectively impossible. Recovering a failed drive from it takes time, and it's hard (to impossible) to replace its internal drive as it ages. I want the ability to upgrade my WiFi router separately from my backup drives. And Time Capsule is not a good solution once the number of computers to be backed up reaches two or more. I do use one in one specific situation: my mother's house with my mother's Mac, where absolute simplicity is the prime directive. If your needs are simple and you're willing to forgo offsite copies of your backup, it'll do the job, but for most uses, I think it's not the right solution.
What if I have big data sets?
As your data set grows, it gets more complicated. As the number of computers you need to back up grows, it gets more complicated. As it gets more complicated you’ll need to spend more time (and money) making sure you have good reliable backups and that the backups work. If you’re a serious photographer or a videographer, you’ve probably stopped thinking about gigabytes and now think about terabytes.
You can keep plugging disks into your computer to store all of that data, but that’s expensive, unwieldy, and backing them up is a horror (so chances are, you’ll stop and pray nothing bad happens). That’s a disaster waiting to happen. So at some point, you need to start thinking about disk subsystems, or network-based disks, or some other setup designed to handle large sets of data.
I’ve recently hit that point, and my choice was to go to a NAS, or a Network Attached Storage device. I talk about that in some detail in Should you consider upgrading your home network to a NAS?
Is this an option you need to consider? Here are my general guidelines:
If you're managing a single computer, a NAS probably doesn't buy you much until your data set grows past 4 terabytes. At that point, you're talking about plugging in multiple drives and multiple backup drives, things start getting complex, and a NAS will make your life easier; you'll also end up buying less hardware over time. If you're someone who wanders the house or office with a laptop on wireless, a NAS starts making sense sooner, because your data can live on the network and you don't need to plug in as often to work on a project.
If you're in a multi-computer environment, managing your data and keeping your backups running reliably gets harder and harder. A NAS helps a lot with that, so you should consider it. I think a good general metric is this: when you hit 2-3 computers and the total data you have to manage hits around 4 terabytes, it becomes both a good time and cost effective to start considering a NAS. If your data requirements are small, you may not need one, but if you're a photographer or videographer, your data requirements aren't small any more.
Once you hit 5-6 computers in the installation, the advantages of centralized online backups to a NAS seem overwhelming. You're an idiot not to consider it. IMHO.
If you're in a single-computer setup, another option is a dedicated disk array that connects via Thunderbolt or FireWire, like the Drobo. I personally think the NAS is a better option, and most of the time it will be less expensive and more flexible, because I like the ability to connect to it over WiFi when I grab the laptop and wander around the house. The direct-connect systems like the Drobo, on the other hand, win on pure performance, so if you need absolute maximum performance, they're your better option.
My backup strategy
I'm going to close out by documenting my current backup strategy. Not everyone is going to want to implement all of this, but I want people to see what I do, understand why, and be able to adopt the pieces that make sense. My data situation is moderately large, and I expect that growth to accelerate. We're a three-computer family, two of us are photographers, and I'm starting to work with some video. My photo collection is well past 30,000 images, and my wife's is 20,000+. So we have a big hunk o' data.
I’ve just migrated to using a NAS, and I no longer have a second (or third, or fourth) drive attached to my computer. I have the boot drive on the laptop, which is a 500GB SSD and everything else lives on the NAS.
My wife keeps her data on a mirrored RAID drive (in part because she hasn't had time to sort out what should move to the NAS). All three computers are backed up to the NAS via Time Machine. The NAS has a backup capability of its own, so I back up all of its data onto two external drives, and it's those drives that get swapped offsite monthly.
Here's a diagram that shows everything involving data on the home network:
Here’s what’s going on:
- Each machine uses Time Machine to back itself up to the NAS. Each machine has its own partition with a quota set on it, because otherwise, Time Machine will grow the backup to infinite size. The quotas are around 3X the size of the backed up data.
- Each machine uses SuperDuper to update a disk image on the NAS, kept in the data volume as a sparse bundle. I can restore that onto a drive if I ever need to do a recovery.
- Each machine has access to a personal data volume and a shared data volume we both use. My iTunes library lives out there and is shared, and I keep a morgue: data I keep but could live without, so it doesn't need to be backed up (I currently do back it up, but as my data set grows, I'll stop).
- The NAS backs up to two disks. I don’t need two today, but this gives me breathing room so I don’t have to update this for a while. A second pair of disks lives offsite and is swapped by sneakernet monthly. (for what it’s worth, a full backup of the NAS currently takes about 3 days).
- I have two other disks hanging off my MacBook; these are my travel disks. One is a 500GB bus-powered drive (no need for external power); I use it to clone my laptop drive every night when I'm on the road, and at home I plug it in once every week or two and update the clone via SuperDuper: one more copy of that data hanging around. The other travel disk is a 500GB bus-powered mirrored RAID, which I use to store data on longer trips when the data I create outgrows my internal drive. With photos and video, that's not hard… Both of these drives are from Other World Computing and built like tanks.
What this means is that once it gets copied offsite, all of my data lives in at least three places (NAS, backup, offsite backup). 24 hours after creation, it’s in two places at the minimum. Any data that lives on my laptop drive ends up with at least five copies, and I also use Dropbox for some data, which makes even more copies including at least two computers at work…
That’s my comfort level for trying to prevent data loss. Do you need to do all this? Depends; how bad would it be to lose your data? Choose the pieces that get you to your comfort level. The really good news is that once this is all set up and running, it takes almost no time to keep going; other than swapping the backup disks (which takes up an evening, roughly) on the NAS, it’s all automated.
Setting it up takes time; getting fully running on the NAS took me two and a half weeks. And it takes some money to invest in the gear you need to get things going. But those are investments in not having that freak-out panic attack later when a disk fails.
And you’ll sleep better at night. I know I do.
How comfortable are you with the thought that someone just grabbed your computer and ran out the front door of that Starbucks you're sitting in? Will your backups protect you? If not, you have some work to do…
A few weeks ago I grumped about how Apple messed up creating wallpapers for iOS 7 because of their new parallax motion. I was honestly hoping someone would point me to a technique I'd missed to solve the problem, but what I got back was mostly grumpy agreement and requests for more info.
I've been looking into this on and off since then, and I ran into a post at fiftyfootshadows on the subject. His experimentation indicates the optimal size for an iPad wallpaper is 2524×2524, and for the iPhone 5 it's 744×1392px. Those numbers are what Apple is using for their own images, and that seems to be the "right" answer, at least for now.
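Resizing an existing image to those targets means first cropping it to the target aspect ratio. This generic center-crop calculation is my own sketch, not anything from that post; it's pure arithmetic, and you'd feed the resulting box to whatever image tool you use:

```python
def center_crop(src_w, src_h, tgt_w, tgt_h):
    """Largest centered rectangle of the target aspect ratio that
    fits inside the source image; returns (left, top, right, bottom)."""
    tgt_ratio = tgt_w / tgt_h
    if src_w / src_h > tgt_ratio:
        # Source is wider than the target shape: trim the sides.
        new_w, new_h = int(src_h * tgt_ratio), src_h
    else:
        # Source is taller: trim top and bottom.
        new_w, new_h = src_w, int(src_w / tgt_ratio)
    left, top = (src_w - new_w) // 2, (src_h - new_h) // 2
    return (left, top, left + new_w, top + new_h)
```

A 4000×3000 capture cropped for the square 2524×2524 iPad target yields a centered 3000×3000 region, which you then scale down to 2524×2524.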
So I’ll be adjusting and republishing my wallpapers to match. Real Soon Now…
Apple held its last major product announcement before the holiday buying season, and as usual, the stock went up leading into the event and dropped when Apple announced their updated lineup. Some day people will realize that Apple's stock is manipulated by people who ride the pre-announcement rumors to push the stock higher; look behind the numbers and there's a fair amount of betting on the drop and short selling as the analysts grump about the results and encourage the stock to fall. And this is all, I guess, legal and fair. Which is why you should (a) never invest in Apple around launch times, and (b) certainly never use stock movement as a judgment of how well or how poorly Apple did with their launch and product updates, because the track record of the last ten years is that the people griping about it are invariably wrong, but folks still listen (for some reason).
If you've been around since my days working for Mama Fruit, you may remember that I typically argued that buying on a Keynote was a poor decision, because you were paying the maximum margin on the products. With very few exceptions, I typically bought a generation back, and refurbished when I could get it, as the best value for hardware. When I did buy into the current product line, it was usually the middle or lower end, where the margins weren't as juicy; the two exceptions I can think of were the Macintosh II I bought when I joined, and whatever the first generation was that switched from ADB to USB, where the generational shift in capability made a lot of sense.
That's why, immediately after this keynote, I bought a MacBook Pro: 15″, one step down from the top end, because that was the configuration where I could get the 512GB SSD built in without going to a custom build and have it shipped next day. Why? My old machine (a 3.5-year-old 2010 13″ MBP) tries its damnedest, but just doesn't have the horsepower to do heavy lifting with Lightroom and Photoshop. Geekbench indicates the overall performance of the new unit is a bit more than 5 times the old one: 2100ish to just under 11,000. It seems to be a tiny bit faster…
Plus, I see the ability to start migrating to USB 3, which will let me retire all of my FireWire drives over time, as a really good thing. Thunderbolt 2 is nice, but I'm looking more toward moving the heavy storage to a NAS over the next year for both data and backups, and that makes the upgrade to 802.11ac WiFi even more useful to me (and the Thunderbolt interface lets me buy a hub that I can plug all my stuff into, making hooking up or carrying away the laptop a two-cable operation, if you count the power cord…). And the battery life, and the bigger Retina screen…
So after years of making Apple’s marketing happen but not actually following it, I’m buying on the keynote, and I think it’s a good, powerful and cost-effective unit. I really like the 13″ form factor, but the performance difference between it and the 15″ is just too great (I’d only get 3 times the boost: 2100 to about 6000 on the Geekbench scales).
And then there’s Mavericks. While I won’t upgrade at work until my project ships, I dropped it on the home machine right away. I found it one of the most painless OS upgrades ever, so well-done to the team; this keeps getting more automatic every release (then again, I learned long ago not to geek out under the hood of the OS, so I don’t do things to the system I’ll regret later in the first place…).
There’s definite pain in iWork land, but I don’t use it a lot, so that’s pain others are taking. The loss of AppleScript support is sad, but honestly, I probably would have predicted something like that if I’d actually thought about AppleScript ahead of time. When was the last time Apple made any noticeable improvement to it? The group of AppleScript users is beyond niche. I wonder when it’ll be retired from the OS completely (I admit to never really being an AppleScripter or a user of Automator).
But overall, Mavericks seems really nice, really stable, and it’s made a nice improvement to the performance of my old laptop; the new memory management is doing wonders, even when I do something stupid, like tell Lightroom to build a 300-image slideshow on the fly. On the cats, that was asking for a disaster; on Mavericks, it actually coped, and it took a memory attack that hefty to force Mavericks into swapping, and even then the swap file grew to a massive 8 megabytes or so… That Apple could build an upgraded memory system that even wrestles Adobe’s memory management to a truce? Impressive…
I’m really looking forward to seeing what it’ll do on today’s hardware, which will arrive in about 12 hours…
Yesterday I released my first set of wallpapers, and the response has been quite gratifying. I want to say thanks to those of you who’ve been downloading them and passing along the new images.
But that said, the release of the images was about a week later than I’d originally hoped, and to be honest, I’m not entirely happy with the wallpapers for the iOS devices, especially the tablet. Apple’s made generating wallpapers for iOS 7 a fair pain in the parallax.
The way they implement the new parallax effect on their backgrounds is to scale an image to a larger size, and then, as you tilt your device up or down or side to side, shift the image around the screen a bit. It is, in a way, a mini Ken Burns effect.
If you do what Apple’s done and build abstract backgrounds, that’s fine. There’s a long tradition of subject-oriented wallpapers, though, and the way Apple’s built things in iOS 7 ignores that use case completely, and creates a big case of heartburn for people trying to build them.
As far as I can tell (I’m still experimenting), the image they use for the backgrounds is extended about 200 pixels off each edge of the screen. That means a 2048×1536 image you might have used on an iOS 6 device is stretched out to 2448×1936, and then the edges are all stuffed out of view. Not all of the image hidden off screen becomes visible when you activate the parallax; my estimate is that about 100px around the edge is lost completely.
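A back-of-the-envelope sketch of that geometry. The 200px bleed and the ~100px permanently hidden border are my estimates from experimenting, not documented Apple numbers:

```python
# Parallax wallpaper geometry, using my estimated (not official) numbers.
BLEED_PX = 200   # estimated overscan added off each edge of the screen
LOST_PX = 100    # estimated border that never becomes visible, even mid-tilt

def parallax_canvas(screen_w, screen_h, bleed=BLEED_PX):
    """Size an image gets stretched to so it can shift behind the screen."""
    return screen_w + 2 * bleed, screen_h + 2 * bleed

# Retina iPad example from the text: 2048x1536 becomes 2448x1936
w, h = parallax_canvas(2048, 1536)

# Fraction of the stretched image that can never be seen at all:
never_seen = 1 - ((w - 2 * LOST_PX) * (h - 2 * LOST_PX)) / (w * h)
```

On those assumed numbers, `never_seen` works out to a bit under 18% of the stretched image, which is a lot of composition to throw away.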
If you are a photographer building wallpapers out of your images, and one who’s trying to take some care about image quality, just having your images stretched by some unknown algorithm before display is going to cause you to reach for the Maalox. But it gets worse.
Those lost edges can be significant to the image. If you look at this image on your computer screen, and then install it on the IOS device and look at it, suddenly a non-trivial part of the image is gone.
The framing and composition that a photographer probably did in creating that image is damaged. It makes a number of images completely unusable as wallpapers.
I spent a few days experimenting with work-arounds, and finally decided to release the set and explore options later. Right now, three options seem possible:
First, release the images as is and not worry about it. As someone who takes some pride in my images and compositions, I’m frankly not happy with what this change in iOS 7 is doing to them; I would more likely just not release for the iOS devices and release only for desktops instead (which, since I’m doing this primarily because I want them, I find terribly annoying; that others can enjoy them as well is a bonus).
The second option: wrap everything in a simple frame that fills that extra space, so the original composition shows up on screen and the hidden areas are filled with non-image pixels. So far, my experiments have shown this is more difficult than it sounds, because the parallax effect makes it hard to find a set of numbers that removes or limits how much image is lost. Either that, or you see the frame peeking out at various times as you play with the device, which looks sloppy and unprofessional.
Third option: simply recrop the image to a wider composition and use the “extra bits” that were cropped out during processing as that framing area. That, of course, assumes those bits existed and were cropped off during processing; not a safe assumption in most cases.
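The trade-off in the second option can be written down as pure geometry. This is a sketch under my assumed numbers (200px bleed, up to 100px revealed by tilting), and it shows why no single set of values works: you either sacrifice image to the overhang, or accept a frame that can peek into view:

```python
# Geometry for the "simple frame" work-around. BLEED and PEEK are my
# estimates of iOS 7's behavior, not documented values.
BLEED = 200   # assumed pixels of overscan added off each screen edge
PEEK = 100    # assumed worst-case extra pixels the parallax can reveal

def frame_numbers(screen_w, screen_h, avoid_peek=True):
    """Return (canvas_size, image_size, frame_thickness) for a framed wallpaper.

    avoid_peek=True lets the image overhang the screen by PEEK px per edge,
    so the frame can never slide into view, at the cost of hiding that much
    composition at rest. avoid_peek=False keeps the whole composition visible
    at rest, but the 200px frame can peek out when the device tilts.
    """
    canvas = (screen_w + 2 * BLEED, screen_h + 2 * BLEED)
    overhang = PEEK if avoid_peek else 0
    image = (screen_w + 2 * overhang, screen_h + 2 * overhang)
    frame = BLEED - overhang   # border filling the rest of the bleed area
    return canvas, image, frame
```

Either way the canvas is the same; the only knob is how the bleed area is split between frame and sacrificed image.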
What really makes this painful is that it breaks any wallpapers built and formatted for iOS 6 or before. It also strongly implies a need to process and publish separately for iOS 7 and for iOS 6 and non-iOS devices. If you want to do this right, you end up needing to build two or three versions of an image for different devices and releases, and then having to live with the complexity of explaining to non-technical folks which image to use in which circumstances.
This is a big pain in the parallax. And Apple broke an existing setup and a known way to produce wallpapers because, as far as I can tell, they only considered their own use case, and not the existing use cases for their devices by their users. This is the kind of decision we see out of Apple at times that makes my forehead get all scrunched up and wrinkled, because it’s poor product management. There’s no reason they had to break the existing setups to create this parallax effect for their cute backgrounds.
So count me annoyed because Apple and the product managers didn’t think through the use cases, or care about them, and built something that worked for a really neat demo, but didn’t take into consideration existing uses and workflows.
The sound of silence you hear in the distance is how much Apple cares what I think about this…
No, this isn’t a “damn, things have gone to hell since Steve died” grump; this kind of “only thinking about what matters to Apple” product design has been creeping into Apple’s products for years. Ask your favorite friendly Mac or iOS dev about some of the sandboxing Apple sprung on them, for instance.
Users of my wallpapers should stay tuned for updates if and when I decide how I want to work around this. I’m still leaning towards the simple frame, but I’m not completely convinced, and I’m exploring what other options I might not have thought of (I’m open to suggestions, big bad internet). Or maybe Apple could realize they’ve broken this and add functionality to support existing wallpapers in a non-parallax mode (this would be technically quite tough, I realize: like, say, recognizing a specific EXIF field in the image, or using a special suffix on the file name… it’d take a decent engineer a couple of hours to implement and a couple of days to build a decent UI onto it…)
My recommendation on my images: if you like them on iOS, use them. If you see one you think looks bad, let me know, and I’ll see what I can do. And if you’re Apple, stop being so enthusiastic about blowing away existing functionality to implement stuff that gives a great demo, especially when a bit of thought would allow the system to support both rather easily…
I know a lot of pundits love to worry about Apple “post-Steve”. I don’t, because I like Tim Cook and the executive team he has. But I will admit that the quality of product management on Apple products has increasingly worried me, because the company doesn’t seem to understand its users as well as it needs to, and it increasingly shows it either doesn’t understand how its products are used, or doesn’t really care. Almost as if they’re starting to design products for the big release demo more than for day-to-day usage. And that’s not a good trend… And it’s not in the big features; it’s here in the details of the product, things like how wallpapers work and what breaks when you change how you display them.
That’s a minor thing in itself, but it’s a minor thing that can be really annoying to a group of users when you break it, and as that set of minor annoyances starts to pile up, that’s how you erode user loyalty. Apple used to sweat these kinds of details. Now they seem more interested in flashy demos…
Update: MacDailyNews picked up on this and published a pointer to it, so thanks to them for doing that. Their response (“Chuq’s all worried about ‘framing and composition’ of images that, he seems to forget, as wallpaper will have text and/or icons splattered all over them”) is both correct and misses the points, one of which is that if you’re a creative, the quality of the thing you create matters; the other is that this was something that worked perfectly in iOS 6 and before and was broken for no good reason. If there’s a reason for the rant, it’s the latter: this is a symptom of Apple’s product management problems. That’s something MacDailyNews ignored completely, looking for an easy response.
More amusing, in its way, is the comment stream on their article. One member wrote all of this off with a trivial “first world problem”; does anyone want to name anything involving owning an iOS device that isn’t one?
Still, it’s nice to see people talking about the issue, and maybe at some point about the larger issue that this is a symptom of.
A couple of things I’d like to see in iOS 7
Back in March, I suggested a couple of features I wanted to see in iOS 7: a password wallet to bring some real security to passwords built into the OS, and a way to make an ICE (In Case of Emergency) app available outside of a PIN lock.
One of the two, the password wallet, arrived in iOS 7, and it looks to be pretty well implemented.
I still want to see the ICE capability implemented, though. Maybe in iOS 8.
Steve Ballmer announced his retirement from Microsoft today, effective sometime in the “coming soon” future. This has, of course, opened the floodgates of opinion, with everyone chiming in with some witty view or snide comment.
Most of that “analysis” (and I use that term very loosely) is pretty shallow, not very well thought out, shoot-from-the-hip stuff. Mostly it reminds me why I don’t write much about tech any more. Immediate snarky witticisms, content-free or not, get pageviews; thoughtful commentary generally doesn’t, because you can’t generate a meme GIF from it. (Now would be the appropriate place to decry the intelligence level of the internet, except we forget that before the internet, we had the National Enquirer and People magazine to do this kind of writing for us…)
When Ballmer came on, Microsoft was a challenged company. As he leaves, it is a challenged company. In the middle part, we tend to forget that he kept the company growing and profitable pretty consistently. Just not growing insanely fast or insanely profitable.
It’s hard to think of someone who could have run Microsoft better, however you want to define better, given that the company being run is Microsoft. When you’re captain of a battleship, it doesn’t matter how loud you yell “turn right” or how hard you push on the rudder; physics wins, and it takes time to change direction. Perhaps the company would have fared better (or just differently) with more radical decisions and changes, but would the board of directors have allowed them? I’m unconvinced. Ballmer’s not alone with the challenge of changing course in a big company; he can call up Meg Whitman and share a beer or three on that one.
The big challenge is that Microsoft (like HP) has no idea as a company how to be nimble, and the sheer size of those companies makes it hard, if not impossible, for anything to happen quickly. Teaching a company to innovate faster isn’t as simple as writing a couple of memos and heading out to lunch.
Leo was probably right about HP needing to be split up (he just needed to use a scalpel and not a hand grenade to do so), and if Microsoft really wants to change its culture and speed of innovation, it should consider it as well. Carve it into pieces, give each its own executive team and board, and let each move forward independently as a much smaller organization.
Not gonna happen, at either Microsoft or HP. But it keeps being proven again and again: large is not a competitive advantage in the technology industry. If there’s a question that sends Tim Cook reaching for the Maalox, it’s that one, not the fight with Android.
My view is that whoever Microsoft brings in to replace Ballmer will likely have the same struggles, and I don’t expect significantly different results, not unless they get really radical and break the company into pieces. I simply can’t see the board agreeing to that.
So my final view of the Ballmer era is that he did the best he could, his successor won’t be significantly more successful at solving the challenges, and Microsoft will continue to be Microsoft: a company that makes a boatload of money and doesn’t get any respect for doing so. Give Ballmer a C, maybe, but at least he didn’t put the great battleship Microsoft on a rock like Leo did, by chasing the strawberries instead of watching the helm…
I come not to praise in-app purchases but to not bury them.
Really, I hate the in-app purchase racket. I hate how it’s abused by so many developers. I will always favor an app that has a list price and no in-app purchases over one that’s going to nickel and dime me or even just make me pay to unlock levels or features.
The in-app purchase racket preys on people like the lottery. Pay another dollar and maybe you could win today! Oops, not today! Well, see you tomorrow!
Turns out, surprise, a lot of people like the freemium model.
The thing is, the in-app model can work for both sides. It potentially solves a number of problems for developers. It was a big push for webOS, back in the day, to try to create opportunities for the developers, and it’s turning into a useful tool for developers when used intelligently.
And yes, it can be abused, but that’s true of pretty much everything.
How does it help developers?
It saves developers from the pain of having to deal with a “free trial” app and a “paid full” app: you can ship one app and use in-app purchases to unlock the paid features. The pains of this “one app in two” approach are legion, starting with actually convincing users to buy and switch to the full app, and then getting their data from the free app to the full app through all of the security restrictions.
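The single-app unlock model is simple enough to sketch. This is a platform-agnostic illustration: the product ID and feature names are hypothetical, and a real app would verify purchases through the platform’s store API rather than trusting an in-memory set:

```python
# Sketch of the one-app "demo with unlock" model. All names are hypothetical;
# real purchases would come from the store's verified receipts, not this set.
FULL_UNLOCK = "com.example.myapp.full"   # hypothetical in-app product ID
FREE_FEATURES = {"basic_edit", "export_small"}

class App:
    def __init__(self, purchased=()):
        self.purchased = set(purchased)  # product IDs the user has bought

    def record_purchase(self, product_id):
        self.purchased.add(product_id)

    def can_use(self, feature):
        # Free-tier features always work; everything else needs the unlock.
        return feature in FREE_FEATURES or FULL_UNLOCK in self.purchased

app = App()
app.record_purchase(FULL_UNLOCK)  # one purchase flips every paid feature on
```

Because it’s the same app before and after the purchase, the user’s data never has to migrate anywhere, which is exactly the “one app in two” pain this avoids.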
It’s a very effective tool against piracy. It shifts the revenue point so that cracking the app and installing a free, stolen copy isn’t nearly as useful to the pirates. Some app developers I talked to back when I was dealing with this stuff for real found that many of the folks using pirated apps were still making in-app purchases, turning them into part of the revenue stream. Even if they don’t, it’s a lot harder for them to make full use of the app without in-app.
It can be an effective alternative to trying to convince users to pay up for “really good app 2”, given that Apple and pretty much every other app store has decided not to implement paid upgrades. You can offer “really good app 2” for sale in the store and, for existing users, upgrade “really good app” to include the features of the new app in a way that they can be unlocked with an in-app purchase at a discount. It may not work for all apps and all code bases, but the option is there. And for some apps, it can make sense to create features and enhancements offered through in-app instead of going the “really good app 2” route.
In many cases I encouraged developers on webOS to think about in-app, especially as a way out of the “free app/paid app” hell so many of them dealt with. Users generally love the “try and buy” option and shy away from paying for things they aren’t absolutely sure of unless they can test them out. But if you have a free demo app, it can become painful to convince them to move to the paid version. A well-thought-out demo with an unlock lowers the pain points of both of these situations. AND screws over the pirates.
That said, yes, some apps abuse in-app, especially over in game land, where you’re constantly being nudged into buying more diamonds so you can buy more stuff. My view: I don’t mind that if I’m getting good value, and by value I mean game-hours compared to the amount of money I’m spending. I recently played a game for a good number of weeks and put a fair bit of change into it (let’s just say “I could have bought a really good Xbox game for that…”), and I think it was quite a fair cost to me, given how many hours I got out of it. And then one day I decided I’d played it enough, and I thought that as it advanced to the really high levels it lost its game balance somewhat, so I deleted it and moved on. I don’t regret the dollar amount a bit, given I probably ended up paying something like $0.30 an hour to play, more or less. That seems fair compensation to a developer to me (I know, horrors, to those of you who think $4.00 for a game a developer spent 18 months building is expensive. Those of you who think that’s a ripoff need to get a new hobby).
Now I’m playing the latest version of a game in a series I’ve played in the past. This one has shifted from the up-front price to the in-app (aka “buy more diamonds!”) model. It’s okay, but the game play is, IMHO, too heavily biased towards “if you don’t buy this better armor you’ll die a lot, and you can only buy the better armor with diamonds, of course…” and so I’m dying a lot. I’ve tossed a little money at it to experiment with their game play model and pricing, and to be honest, it’s going to get deleted soon. It’s okay, but… the balance is too heavily weighted towards “buy more diamonds” for my taste. So they’ll end up getting a lot LESS of my money, because they got too greedy in their game balance.
And that’s the answer here: if you try a game and find it greedy, throw it out. And send an email to the support folks telling them why. If it’s not fun because they tweaked the game balance in greedy ways, don’t play it. If you don’t play it and don’t send them money, they’ll get the message. If it’s fun and you’re getting a lot of gameplay hours out of it, well, everyone wins, right?
That’s my basic model: I know what I’m willing to fork over to have a good time, and I value it as a rough “per hour” cost. The more fun a game is, the more I’m willing to adjust that per-hour cost towards “sure, I’ll buy more diamonds”. The more they seem to be keeping me from enjoying the game until I fork over money for more diamonds, the faster I trash it and move on. I expect the developers to set a game balance that’s fair to both sides, not just them. If they blow it and get greedy, well, there are another ten bazillion games in the app store waiting for me to try them out…
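The per-hour test above is nothing more than division, but writing it down makes the point. The dollar and hour figures here are invented for illustration, not from any real game:

```python
# Tiny version of the "per hour" value test. Figures are made up.
def cost_per_hour(total_spent, hours_played):
    """Rough value metric: money in versus fun out."""
    return total_spent / hours_played

fair = cost_per_hour(45.00, 150)   # weeks of play for Xbox-game money
greedy = cost_per_hour(20.00, 4)   # pay-walled fun: delete and move on
```

Anything down around pocket change per hour is a fair deal for both sides; when the number climbs because the game gates the fun behind purchases, that’s the signal to delete.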
Developers deserve a good living, and in-app gives them opportunities to earn one. But that doesn’t mean you should let them hold your fun hostage, either. Push back and say no by deleting the app and not wasting your time or money on greedy ones. That’s the way to send the message not to abuse in-app.
(and no, not gonna mention app names. they’re irrelevant for this discussion, and I’m not looking to review or publicize them….)