Posted on Thursday, April 11th, 2013
Trent Walton wrote a piece on his blog about reorganizing the way we think about process when it comes to development on the web. He proposes that instead of dividing a project up into phases “where each phase determines the next”, we compile smaller “tactical teams that are capable of executing multiple rounds of planning, design, and code quickly and independently”. This is something I have been promoting for a while now.
Posted on Tuesday, April 9th, 2013
Check out this article by Hugo Giraudel. He brings up some good points about CSS
<select> alternatives. Next time your designer tries to strong-arm you into creating custom drop-downs, bring up some of the points raised in this article to shut them up.
Posted on Monday, April 1st, 2013
jQuery Knob solves one of my biggest pet peeves when it comes to Digital Skeuomorphism. The problem stems from applications like Propellerhead’s Reason. Reason is an integrated music recording and production studio that aims to bring an analogue workflow to a digital environment. While the application is quite successful, the one part that chaps my nuts is the knobs. I understand that they are used to emulate the look and feel of old hardware that some engineers might be familiar with, but there is an inherent problem with using knobs in a GUI.
The first thing anyone tries when seeing a digital knob for the first time is to drag it along a circular path, as if they had grabbed it with their thumb and forefinger. What they notice is that the knob moves as expected at first, but after reaching the peak of its arc, it begins moving backwards. This is because most digital knobs are programmed to be clicked and dragged either straight up or down. Not so intuitive if you ask me.
What makes this even worse is that some programmers try to make the knob act the way you would think it should (dragging along a circular path). The problem is that the draggable area on these elements is quite small, making the knob even harder to manipulate if you are used to the aforementioned up-and-down method.
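To make the two interaction models concrete, here’s a minimal sketch of the vertical-drag mapping most digital knobs use. This is my own illustration, not code from jQuery Knob or Reason; the function name and the sensitivity value are assumptions for the example.

```typescript
// Vertical-drag knob model: dragging up raises the value, dragging
// down lowers it, regardless of where the pointer sits on the circle.
function knobValueFromDrag(
  startValue: number,   // knob value when the drag began
  dragStartY: number,   // pointer Y at drag start (screen coords)
  pointerY: number,     // current pointer Y
  min = 0,
  max = 100,
  pixelsPerUnit = 2     // sensitivity: pixels of drag per unit of value
): number {
  // Screen Y grows downward, so moving up (smaller Y) raises the value.
  const delta = (dragStartY - pointerY) / pixelsPerUnit;
  return Math.min(max, Math.max(min, startValue + delta));
}
```

The circular model people instinctively try instead would compute `Math.atan2(pointerY - centerY, pointerX - centerX)` and map that angle to a value, which is exactly the small-target, hard-to-grab interaction described above.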
What this plugin does is solve the UI problem by making knobs work the way you think they should, and it does so well. I’m not saying that all knobs should look plain or ‘flat’, but they should work the way you expect on first use. Form follows function. Sound familiar?
Posted on Tuesday, December 18th, 2012
In June 2012, Facebook decided to abandon their HTML5 web app (for the moment) to focus on a native app experience. The reason for this is that Zuck thinks, as of now, that HTML5 is too slow. Well, leave it to the guys at Sencha to prove him wrong. In the article The Making of Fastbook: An HTML5 Love Story, over on the Sencha blog, they show how using their framework along with some best practices can provide not only a good, but in some ways better, mobile experience than native. HTML5 is definitely a force to be reckoned with.
Posted on Friday, November 30th, 2012
TinyPNG.org is an online PNG compression service described as…
Advanced lossy compression for PNG images that preserves full alpha transparency.
I have been using ImageOptim for a while now and decided to test them against each other.
TinyPNG has an example on their site that uses a picture of a panda. I took the large version (56.9 KB) and ran it through both TinyPNG and ImageOptim. Just like the online example, TinyPNG reduced it by 72% to a minuscule 16.2 KB, while ImageOptim only reduced it by 8.8% to 53 KB. I thought, “Okay, maybe there is some bias trickery going on here and TinyPNG used an image that was set up just perfect to get the most out of its service”, so I tried it with a few more images.
First I went to IconFinder.com and grabbed this PNG file. The original file size was 16.6 KB. After running it through both services, I found that TinyPNG reduced the image by 67% to 5.5 KB while ImageOptim reduced the image by only 20.1% to 13.2 KB. Still quite a difference.
The second image I decided to test was a simple gradient I made in Photoshop, from 100% black to 100% transparent, 1000 x 1000 pixels at 72 dpi. After exporting the image for web, the file size came in at 6.3 KB. After running the image through both services, yet again, I got some interesting results. TinyPNG actually increased the file size by 18% to 7.4 KB, while ImageOptim managed to reduce it by 33.8% to 4.2 KB. This made me pause for a moment, so I decided to test another image.
Back at IconFinder I found another PNG, this time with lots of gradients and complex transparencies. The original file size was 11.1 KB (let’s call this one Transparent Ball). Once again running it through both services, TinyPNG came out in front, decreasing the file size by 47% to 6 KB, while ImageOptim managed a smaller yet respectable 27.9% decrease to 8 KB.
Another test I decided to try was running the compressed images through each service a second time. TinyPNG’s second pass actually made the file larger, while ImageOptim’s did nothing.
So what’s the reason for TinyPNG’s better numbers overall? Their website states that they can’t spot any visual difference between the original image and their compressed image. However, when I ran the test using the Transparent Ball from IconFinder I noticed that TinyPNG’s compression did have some visual degradation while ImageOptim’s did not, thus explaining the better numbers from TinyPNG.
At this point I decided to experiment by combining compression techniques. Taking the last image I used (Transparent Ball), I ran each compressed version through the competitor’s service. My results were interesting. From TinyPNG to ImageOptim there was an additional 3.6% decrease, for a total file size of 5.7 KB, down from 6 KB. And from ImageOptim to TinyPNG there was an additional 26% decrease, for a total file size of 5.9 KB, down from 8 KB. Running the image through TinyPNG first, then ImageOptim, yielded the best results in both file size and quality.
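For anyone who wants to reproduce this kind of comparison with their own images, the percentage figures quoted throughout are just the simple savings ratio. A quick sketch (my own helper, not part of either tool):

```typescript
// Percentage saved going from the original size to the compressed size.
// A negative result means the "compressed" file actually got bigger,
// as happened with the gradient test above.
function percentSaved(originalKB: number, compressedKB: number): number {
  return ((originalKB - compressedKB) / originalKB) * 100;
}

// e.g. the panda image: 56.9 KB down to 16.2 KB is roughly the 72% quoted.
```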
So what are the differences? Besides the numbers, and one being a web service and the other a desktop application, both are free and both allow batch compression. The downsides of TinyPNG are that it only compresses PNGs and there is no batch download, so you will have to download the compressed images one at a time. ImageOptim can compress all types of images, but be careful: by default the application will overwrite your original files. There is a preference that will back up the originals, so be sure to turn it on if you don’t want to lose them.
So which one is better? If I had to pick one, TinyPNG does a better job of compression, and most people won’t notice the loss in quality, but given the time and resources, I would use both, running everything through TinyPNG first and then ImageOptim.
So there you have it. Now you have no excuse not to compress your images before putting them online.
Note: If you are ever in a situation where you need to edit one of your compressed images because you don’t have the original, you will notice that opening the file in Photoshop gives an undesired result (inaccurate transparencies). The reason is that the compression technique strips out some of the alpha data that Photoshop needs to display the image correctly. To solve this, open the image in a program that displays it correctly (for example, Preview on the Mac) and save the image as a copy with the alpha information intact. Now you can open it in Photoshop and edit away. Also note that the numbers in this article were for the specific images used; different images will give different results.
Posted on Friday, May 18th, 2012
The latest from the rumour mill is that Apple will update the 15″ MacBook Pro in 2012 with the following:
- a thinner MacBook Air type design
- removing the optical drive, ethernet port, & Firewire
- adding USB 3
- and including the beautiful Apple Retina Display
Sweet! New MacBook, Yah – but Wait…
Thinner, faster, better screen? These are all good things, right? But have you ever looked at a website on a Retina iPad? Text and vector graphics look awesome. Super crisp and clean edges, but raster stuff looks like ass!
Most graphics designed for digital are done at 72 dpi and look like shit on a high-density display like The New iPad or iPhone 4. This is because the Retina hardware interpolates the lower-resolution graphics, using special algorithms to add pixels that do not exist based on the pixels that do. Sounds smart, huh? Yes and no. Yes, because now the graphic is resized to fit your display without looking like some 8-bit Nintendo garbage, and no, because now it looks all fuzzy and shit!
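Here’s a toy sketch of the kind of interpolation involved when a display upscales a low-resolution bitmap: each new pixel is a weighted blend of the four nearest original pixels (bilinear interpolation). This illustrates the general idea only; it is not Apple’s actual scaling algorithm, and the grayscale-grid representation is an assumption for the example.

```typescript
// Sample a grayscale image (rows of 0–255 values) at fractional
// coordinates using bilinear interpolation.
function bilinearSample(img: number[][], x: number, y: number): number {
  const x0 = Math.floor(x), y0 = Math.floor(y);
  const x1 = Math.min(x0 + 1, img[0].length - 1);
  const y1 = Math.min(y0 + 1, img.length - 1);
  const fx = x - x0, fy = y - y0;
  // Blend horizontally along the two nearest rows, then vertically.
  const top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx;
  const bottom = img[y1][x0] * (1 - fx) + img[y1][x1] * fx;
  return top * (1 - fy) + bottom * fy;
}
```

Sampling halfway between a black pixel (0) and a white one (255) yields a mid-gray, which is exactly the soft, fuzzy edge you see on upscaled raster art. Nearest-neighbor scaling would keep the edge hard instead, at the cost of that blocky “8-bit” look.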
So Why Not Just Design at Higher Resolutions?
Why? I’ll tell you why. Because the internet is slow! Web designers and developers are always trying to optimize their sites to load as fast as possible, giving their users the best experience possible.
Having larger graphics adds to the ‘weight’ of a webpage’s total download size, making it take longer to view. Remember the days when website images took forever to load and you could watch them render from top to bottom? Welcome back to 1998! There is already much debate about how to handle this new problem, but as of now there is no clear answer.
What Does This Mean for the Design Process?
If you’re a designer or work with designers, you know that designing for the web has its challenges and restrictions. There are limits to things like fonts and sizes. In print design, an 8 point font can still be quite legible, but online this text will probably appear muddy and pixelated. “But when I designed it on my new Retina MacBook it looked fine.” Right, it did look fine. That’s because, like I said above, text looks awesome!
Good designers love to experiment with placement and size, nudging and resizing things all the time. They will push the limits and probably push your buttons (especially if they came from the world of print and are new to designing for the web). If it looks good on their monitor, they are going to try and use it.
What does this mean? It means that developers will have to take the time to explain this to designers and have them test all of their designs on an older box to see if their ‘vision’ still looks the way they intended.
Apple is known for changing the way we do things, so why should digital design be any different? Should Apple not release the Retina display on their laptops, desktops and monitors? Of course they should. If they don’t, someone else will. New gear with better specs equals future-thinking awesomeness, and like all new things, it will take time for the rest of the world to catch up. Not much else we can do about it. So if it sounds like I’m complaining about it, I am. Is it the way of the future? It is. Hopefully Apple has some tricks up their sleeve to address this issue.
Posted on Wednesday, March 21st, 2012
On March 7th, 2012, Apple announced “The New iPad“. Its new features include a Retina Display, the new A5X processor, a 5-megapixel iSight rear-facing camera, 1080p HD video, voice dictation and 4G LTE.
By no means is this post a review. It’s probably nothing but blather. Don’t get me wrong, I love the iPad. I have one and will probably update it in the future. There’s nothing out there at the moment that even comes close to how well the iPad performs. What this post is really about is the name.
Yes, the name. “The New iPad”, not iPad 3 or iPad 2S. Critics are freaking out about this for some reason. I say, “Who the fuck cares!”
Think about it for a second. Why do we expect it to have a numerical name? Because the one prior was The iPad 2, and the iPhone is also numbered. I have been saying for a while now that Apple should just ditch the numbers, and here’s why…
On July 11, 2008, Apple released the second-generation iPhone and called it The iPhone 3G. Like many iPods before it, it had a clever word appended to the end of the product name to differentiate it from others in the Apple lineup (ex. iPod Photo, Nano, Shuffle, Classic, Touch). One year later Apple announced The iPhone 3GS. This phone was meant to be an upgrade for the people who had the original first-generation iPhone and whose cellular contracts were due for an upgrade. It was the same form factor as the 3G, but faster, and had a better camera that was capable of shooting video. It was new, but not so different that the people who had just bought the 3G felt that they were out of date. Thus began the “tick-tock cycle” we now know Apple to have given their products.
It wasn’t until June 24, 2010, when Apple released the next iPhone as The iPhone 4, that the number game began. It seemed natural: “the last one had a 3 in it, so let’s make this one the 4.” No other Apple product had been given a number like this until then (except for some software, like their operating system OS X 10.(insert version number here), but that doesn’t count).
Following The iPhone 4 came The iPad 2, announced on March 2, 2011. Adding the number to The iPad seemed like the cool thing to do at the time, but I had always thought this was a risky game. How far do you take it? “8, 9, 10… and when it hits 10 do we call it X like the OS and Final Cut and QuickTime?”
For this reason and others, I think it’s stupid to give products like this a numerical name. It’s not software! Some people think owners are going to have a hard time distinguishing which model they have, kind of like the MacBook dilemma, but who cares? There’s always a way to tell. For instance, the iPod Touch has never had a numbered name; it’s just referred to by its generation. Also, not having such a distinguishing factor means older models of a product will hold value and status longer, and not be given a black mark solely because of a name.
So, that’s what I think. Apple made a smart move and I saw it coming for months. Now all they have to do is drop the X off of their OS name and ditch the stupid cat. Oh, and don’t be surprised when the next iPhone is announced as just “iPhone”.
Posted on Wednesday, March 14th, 2012
The Photoshop Etiquette Manifesto for Web Designers is a collection of ways to improve the clarity of a PSD when it’s handed off. You stay organized, your developer stays happy. Fist bumps all around.
Posted on Tuesday, February 28th, 2012
Adobe Flash! You’re like an ex-girlfriend who just won’t give up. You keep calling and asking if we can be friends, hoping for more. I don’t like you anymore, Flash! I’ve moved on.
You used to have a place in my life. You did things for me that no one else at the time could. You made me feel empowered, like I could do anything and had complete control, but now you just get in the way. You’re slow, selfish, and you don’t even work on my iPad. I know, I know, it’s Apple’s fault, right? Please! Don’t blame others for your shortcomings. There’s a reason why they won’t include you in any of their new products. Even Microsoft has started to look the other way.
Your time has come. Adobe is ready to give you the boot. There are new and better things out there. You just don’t cut it anymore. I’m over you. Stop sending me website layouts that look like they were designed in 2004! Even if I did build them, I wouldn’t use you to do it. Besides – HTML5 is much more open minded. She even plays well with others.
I’m not saying there’s no place for you, just not with me. Maybe it’s time you moved on. We can’t be friends. Stop calling. I don’t like you!
Posted on Friday, February 10th, 2012
Merlin & Dan reveal the true cost of things in this episode of Back to Work.
When a client asks for what they consider a “small job” and offers mere pennies to complete it, ask yourself: “What’s it going to take for me to get this done?” Weigh things like time, effort, and above all, life impact. A job that might only seem like a few hours can have a deeper impact on your life as a whole.
Merlin gives the example of a client asking him to fly out to speak for “only 20 minutes”. For Merlin, those 20 minutes will cost him packing, travel time, hotel costs and being away from his family for three days. All of a sudden the few hundred dollars offered to speak for 20 minutes doesn’t seem so worth it. All these things must be considered when pricing out a job. For fuck’s sake, charge what you’re worth!