<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[LeeThree on UX - Medium]]></title>
        <description><![CDATA[This is a blog by @LeeThree9 on topics including user experience, human computer interaction, usability and interaction design. - Medium]]></description>
        <link>https://medium.com/leethree?source=rss----13fd360a780c---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>LeeThree on UX - Medium</title>
            <link>https://medium.com/leethree?source=rss----13fd360a780c---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Tue, 05 May 2026 06:00:20 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/leethree" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Farewell, PC!]]></title>
            <link>https://medium.com/leethree/slideshare-id-19957037-rel-0-w-500-h-407-fb-0-mw-0-mh-0-style-border-1px-f9fae5513654?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/f9fae5513654</guid>
            <category><![CDATA[ubiquitous-computing]]></category>
            <category><![CDATA[post-pc]]></category>
            <category><![CDATA[future]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Thu, 25 Apr 2013 13:50:10 GMT</pubDate>
            <atom:updated>2017-07-22T10:39:24.760Z</atom:updated>
            <content:encoded><![CDATA[<iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.slideshare.net%2Fslideshow%2Fembed_code%2Fkey%2FJfhYJovZTgTnyJ&amp;url=https%3A%2F%2Fwww.slideshare.net%2Fleethree%2Ffarewell-pc-censored&amp;image=https%3A%2F%2Fcdn.slidesharecdn.com%2Fss_thumbnails%2Ffarewellpccensored-130425082357-phpapp01-thumbnail-4.jpg%3Fcb%3D1366878962&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=slideshare" width="600" height="500" frameborder="0" scrolling="no"><a href="https://medium.com/media/24a7485f064a35dac26db2fca68c16d7/href">https://medium.com/media/24a7485f064a35dac26db2fca68c16d7/href</a></iframe><p>Just found (in a huge pile of forgotten files) this two-year-old presentation I made about how the age of the PC was going to end. It’s exactly what has happened in the last two years.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=f9fae5513654" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/slideshare-id-19957037-rel-0-w-500-h-407-fb-0-mw-0-mh-0-style-border-1px-f9fae5513654">Farewell, PC!</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[The case against Google]]></title>
            <link>https://medium.com/leethree/the-case-against-google-7f55a7fc1d35?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/7f55a7fc1d35</guid>
            <category><![CDATA[neutrality]]></category>
            <category><![CDATA[google]]></category>
            <category><![CDATA[facebook]]></category>
            <category><![CDATA[android]]></category>
            <category><![CDATA[apple]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Mon, 26 Mar 2012 14:13:00 GMT</pubDate>
            <atom:updated>2017-08-02T16:22:30.412Z</atom:updated>
            <content:encoded><![CDATA[<p><a href="https://gizmodo.com/5895010/the-case-against-google">The Case Against Google</a></p><p>If you haven’t already read it, you probably want to spend some time reading Mat Honan’s thoughtful (and long) piece on what is happening with the new Google. (Just click the title link above.)</p><blockquote>But while Google was busy holding up the sky, the ground beneath its feet shifted in ways it didn’t anticipate. Our searches have evolved from the merely factual to the deeply personal. We want to find a nice hotel or a good restaurant or a particular person. We want to know what’s happening right now, right here. And increasingly, we turned to smaller, fragmented, platforms to get that stuff.</blockquote><p>Here is my version of it:</p><h4><strong>The Web</strong></h4><p>The old Google is all about search and, essentially, the World Wide Web. Google is <em>the</em> search engine and it rules the Web by its <a href="http://en.wikipedia.org/wiki/Google_search#PageRank">ranking algorithms</a>. Everything else the old Google had been doing was to either get more people to use the Internet (which was, largely, the Web) or make people use it more. The logic is simple: the more people browse the web, the more they search, and the more advertisements they will click on.</p><p>GMail and Chrome, the rock-star products of the Google as we knew it, have made Web apps as good as their desktop counterparts, if not better. They are good. They are free. They made me a huge fan of Google and I have been using them ever since they were released. As a result, browsers became faster, enabling more sophisticated Web apps that make use of the extra horsepower.</p><p>…until <a href="http://www.wired.com/magazine/2010/08/ff_webrip/all/1">the Web (<em>sort-of</em>) died</a>. Google has to do something about it. 
And here come all the controversies about <a href="http://www.google.com/insidesearch/plus.html">Search plus Your World</a>.</p><h4><strong>Neutrality</strong></h4><p>The major reason why people get frustrated with this personalized Google search, in my opinion, is the loss of Google’s neutrality.</p><p>Google, as the search engine, is regarded by some as a type of utility, similar to electricity or water. It should be unbiased towards anyone, which means people should get exactly the same supply of power or water when they plug in a power cord or turn on the tap. In the same way, people would expect to get the same result when Googling the same keywords.</p><p>If the <a href="http://www.wired.com/wired/archive/11.01/google.html?pg=3">official interpretation</a> of “evil” in Google’s motto “don’t be evil” is “subverting the power of Google to commercial ends”, its current endeavor to integrate search with other Google products is clearly contradicting this notion of “not evil”. Promoting its own products can be considered a direct threat to its neutrality, for which Baidu — the Chinese search engine — has <a href="http://usa.chinadaily.com.cn/epaper/2011-02/23/content_12064398.htm">long been criticized</a>.</p><p>At least one lesson should be learned: a broken promise can hurt people’s feelings <a href="http://parislemon.com/post/15604811641/why-i-hate-android">very badly</a>.</p><h4><strong>The Vision</strong></h4><p>Apart from the trust issue, if the personalized search turns out to be a better experience for users, it might be justifiable. But I’m more concerned about Google’s seeming lack of commitment to the Web.</p><p>Google is confronting Facebook with its own social network Google+, which also includes a location service, video chat, and <a href="http://en.wikipedia.org/wiki/Google%2B#Features">even more</a>. 
On the other hand, it is engaging Apple with a <a href="http://www.android.com/devices/">huge fleet</a> of Android smartphones and tablets, together with its newly revamped <a href="https://play.google.com/about/">storefront</a>.</p><p>The problem is: these are someone else’s visions, not Google’s.</p><p>Facebook is all about <a href="http://www.readwriteweb.com/archives/zuckerbergs_letter_to_shareholders_personal_relationships_are_the_fundamental_unit_of_our_society.php">social relationships</a> and making people share more things onto its <a href="http://www.telegraph.co.uk/technology/facebook/8151101/tTim-Berners-Lee-criticises-Facebooks-walled-garden.html">walled garden</a>. Apple is building a whole spectrum of post-PC “<a href="http://www.apple.com/ipad/"><em>resolutionary</em></a>” screens surrounding its iTunes Store ecosystem and iCloud service. It is said that some TV-sized device might soon join these screens.</p><p>With its huge investments into Google+ and Android (and <a href="http://www.theverge.com/2012/2/27/2827591/google-to-double-down-on-android-tablets-in-2012-says-andy-rubin">doubling down</a> on tablets this year), Google has officially endorsed Facebook’s version of future human relationships and Apple’s version of future computing by racing against them.</p><p>Surely, this strategy could work. Microsoft has done it. And Tencent might have already <a href="http://articles.businessinsider.com/2011-08-05/tech/30072946_1_copying-icq-portal">done this a thousand times over</a>.</p><p>But what about Google’s own version of our future? When Chrome OS came out, I thought that must be it. 
The open Web might reign over the entire Internet again simply because Google was behind it.</p><p>Now, even Google itself doesn’t appear to be convinced…</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=7f55a7fc1d35" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/the-case-against-google-7f55a7fc1d35">The case against Google</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[From Gamification to Exploitationware]]></title>
            <link>https://medium.com/leethree/from-gamification-to-exploitationware-47b3093b83cb?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/47b3093b83cb</guid>
            <category><![CDATA[social-games]]></category>
            <category><![CDATA[exploitationware]]></category>
            <category><![CDATA[facebook]]></category>
            <category><![CDATA[gamification]]></category>
            <category><![CDATA[cow-clicker]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Tue, 14 Feb 2012 14:05:00 GMT</pubDate>
            <atom:updated>2017-08-02T16:20:34.341Z</atom:updated>
            <content:encoded><![CDATA[<p>Several days ago, I listened to <a href="http://5by5.tv/hypercritical/52">a Hypercritical episode</a> in which John Siracusa mentioned <a href="http://en.wikipedia.org/wiki/Cow_clicker">Cow Clicker</a>, a game by <a href="http://en.wikipedia.org/wiki/Ian_Bogost">GaTech professor Ian Bogost</a> designed as a satire of social network games.</p><figure><img alt="Cow Clicker logo" src="https://cdn-images-1.medium.com/max/400/0*9MHRyfWvDGh0yacW.jpg" /><figcaption>Cow Clicker</figcaption></figure><p><a href="http://www.wired.com/magazine/2011/12/ff_cowclicker/all/1">The story of Cow Clicker</a> on WIRED is a funny and thought-provoking one. In fact, <a href="http://leethree.info/post/1495303950/games-as-reality">the very first post</a> on this blog is exactly about “gamification”, though I did not know the word at that time.</p><p>While “social gaming” brings some game mechanisms to our social relationships, it is not necessarily a game. Games are more about entertainment and having fun than experience points, scorecards and achievements. But from my personal experience, you can hardly enjoy such a social game unless you either keep bothering your friends or keep paying money.</p><p>I played City of Wonder for about a month on Google Plus when I was preparing for my GRE test. I love playing the <a href="http://en.wikipedia.org/wiki/Civilization_(series)">Civilization</a> and <a href="http://en.wikipedia.org/wiki/SimCity_(series)">SimCity</a> series, and City of Wonder has many elements of both. But very quickly I found that it was just another FarmVille. All you do in the game is either keep clicking or wait for things to happen. You will find that you have to invite a number of “neighbors” to help you level up. 
There is also a reward for helping your friends.</p><p>As the creators of these games claim, social gaming is supposed to be a collaborative effort that brings you and your friends together. In order to build several “Wonders” to keep my “citizens” in the game happy, I persuaded several of my personal friends to join the game. Most of them just signed up, helped me with the tasks, and never opened the game again. I felt bad about this because, essentially, I had just used my friends as a resource in exchange for virtual rewards from the game.</p><p>Then I added several strangers who played the same game to my circles just to keep playing the game. They helped me with the game and I helped them in return. All of us would be rewarded in the game. I built Wonders and buildings in my city and I was trying to make it a reasonably beautiful city.</p><figure><img alt="My City of Wonder" src="https://cdn-images-1.medium.com/max/500/0*MtRlvHWb55UHhWl1.png" /></figure><p>It went very well until one day I found there was nothing more I wanted from the game. With the help of some complete strangers I finished my “city planning” and I think it was sort of an achievement. However, at this point the whole premise of “social” gaming was defeated because there were no social relationships, just business transactions. All the players treated each other as resources and we just kept clicking to build our cities.</p><p>Reduced to their bare bones, this game and any other similar “social game” are all Cow Clickers. You click the cow and you get a point. You make use of your friends, you get more points. That is how they expand. You pay real money to get a large amount of points. That is how they make money.</p><p>Basically, we are the ones who pay money, but at the same time we are the salesmen for the game. In return, we get some useless virtual rewards to show off and to satisfy our vanity. 
The game companies only have to make more items you desire and more reasons for you to spam your friends’ news feeds.</p><p><em>Serious</em> games (or conventional games) never work this way. When playing games, you are either intellectually or physically challenged and you are rewarded for your performance. This principle holds from Super Mario Bros to Angry Birds, from Starcraft to Call of Duty. It is even true for chess and football, because that is actually how society works.</p><p>Therefore, social games are not games, because you are not taking on challenges to get rewarded (*). Instead, you exploit your friends. Your friendships are the gold mine and you are the miner for the game company. The company pays you with some pixels made of computer-generated digits, which cost it nothing, to keep you happy until you quit the game.</p><p>This is why Ian Bogost suggested a term for this type of game-like practice: “<em>exploitationware</em>” [2].</p><p>Exploitationware may be working well right now with the success of FarmVille and many other very similar games. Even Cow Clicker earned a bit of money for Bogost. But from my point of view, it won’t be long before people become immune to this kind of “brain hacks that exploit human psychology in order to make money.” [1]</p><p>When more and more social “games” are competing with each other, it will be much more difficult to get user attention among all the social status updates. Users will be concerned about being spammed by friends and they will try not to spam their friends. Cow clicking “games” will likely be replaced by games that are better intentioned and more enjoyable. In other words, the cost of making players exploit each other will increase until social “games” cannot make easy money anymore.</p><p>So why <a href="http://en.wikipedia.org/wiki/Gamification">gamify</a> your industry? Gamification might make your users/customers/clients addicted by manipulating them and rewarding them. 
But in the end, if your reward is useless and your product is horrible, no one will continue to play your game because there are tons of better games around.</p><p>You can only trick your users into doing something stupid for a short period of time. Then they will understand why the CEO of <em>the</em> social “game” empire once said [4]:</p><blockquote>I knew that i wanted to control my destiny, so I knew I needed revenues, right, fucking, now. Like I needed revenues now. So I funded the company myself but I did every horrible thing in the book to, just to get revenues right away. (…) We did anything possible just to just get revenues so that we could grow and be a real business.</blockquote><p>(*) unless you think clicking the cow every six hours is a challenge and enjoy every point you get from it.</p><h4><strong>Further reading:</strong></h4><ol><li><a href="http://www.bogost.com/blog/cow_clicker_1.shtml">Cow Clicker — The Making of Obsession</a> by Ian Bogost</li><li><a href="http://www.gamasutra.com/view/feature/6366/persuasive_games_exploitationware.php">Persuasive Games: Exploitationware</a> by Ian Bogost</li><li><a href="http://inessential.com/2011/12/23/gamification_sucks">‘Gamification’ sucks</a> by Brent Simmons</li><li><a href="http://techcrunch.com/2009/11/06/zynga-scamville-mark-pinkus-faceboo/">Zynga CEO Mark Pincus: “I Did Every Horrible Thing In The Book Just To Get Revenues”</a> by TechCrunch</li></ol><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=47b3093b83cb" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/from-gamification-to-exploitationware-47b3093b83cb">From Gamification to Exploitationware</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Google’s apostrophes]]></title>
            <link>https://medium.com/leethree/googles-apostrophes-81e051b8420c?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/81e051b8420c</guid>
            <category><![CDATA[google]]></category>
            <category><![CDATA[typography]]></category>
            <category><![CDATA[details]]></category>
            <category><![CDATA[erik-spiekermann]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Wed, 18 Jan 2012 07:17:00 GMT</pubDate>
            <atom:updated>2017-08-02T16:17:40.748Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="Google’s apostrophes" src="https://cdn-images-1.medium.com/max/500/0*lF90s92iah8i_2S_.png" /></figure><p>It’s very strange to see all these <a href="http://en.wikipedia.org/wiki/Quotation_mark">single opening quotation marks (‘)</a> in the place of <a href="http://en.wikipedia.org/wiki/Apostrophe">apostrophes (’)</a> on <a href="http://www.google.com/intl/en/about/corporate/company/ux.html">Google’s page introducing its design principles and user experience philosophy</a>.</p><p>You can see it more clearly when zoomed in.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/413/0*fCkoPwSHN0DPFyVA.jpg" /></figure><p>Maybe this is why I can’t find words like “detail” or “typography” anywhere on this page?</p><p>Don’t understand what I’m talking about? Then read <a href="http://en.wikipedia.org/wiki/Erik_Spiekermann">Erik Spiekermann</a>’s <a href="http://issuu.com/fontshopsf/docs/type_tips/4">Typo Tips</a>, especially this part (emphasis added):</p><blockquote>A dead giveaway for unprofessional “desktop typography” are wrong quotes and apostrophes. Quotes can have different shapes. They generally look like “this”, and can be remembered as beginning and ending quotes by thinking of “66” and “99”. Beginning quotes are found on the Mac by pressing option-[; closing quotes, option-shift-[. <strong>The apostrophe is simply a raised comma, the shape of a ’9 in most typefaces. It is identical to the closing single quote, while the open single quote looks like a ‘6.</strong> Beginning single quotes are found on the Mac by pressing option-]; the apostrophe and closing single quote, option-shift-].</blockquote><p><strong>Update</strong>: Google updated its company About pages a few days ago. All the apostrophes are fixed. Good job! 
(Feb 2, 2012)</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=81e051b8420c" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/googles-apostrophes-81e051b8420c">Google’s apostrophes</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Kevin Fox’s offer to Google Reader]]></title>
            <link>https://medium.com/leethree/kevin-foxs-offer-to-google-reader-ed41ecc86314?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/ed41ecc86314</guid>
            <category><![CDATA[google]]></category>
            <category><![CDATA[rss]]></category>
            <category><![CDATA[redesign]]></category>
            <category><![CDATA[social-network]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Thu, 03 Nov 2011 14:23:00 GMT</pubDate>
            <atom:updated>2017-08-02T16:14:20.433Z</atom:updated>
            <content:encoded><![CDATA[<p><a href="http://fury.com/2011/11/my-offer-to-google-reader/">My offer to Google Reader - Fury.com</a></p><p>Google revamped Google Reader, together with many other web apps. Its new interface soon became a controversial topic.</p><p>Google Reader is an essential app for people who subscribe to a large number of news feeds. But it’s much more than that. I followed many friends and colleagues. I read their shares on a daily basis. I’ve shared hundreds of items, sometimes with my notes and comments.</p><p>Suddenly, they are all gone, completely. There is no native Google Reader sharing any more. Even the bookmarklet is killed. Only a “+1” button is left.</p><p>I can understand why Google did this. There are some fundamental problems with RSS. Most ordinary users don’t know what RSS is. I have tried to explain to others what Google Reader is, and I found it very difficult (if not impossible). People use Twitter and Facebook to share links nowadays, and it’s likely to become even more so in the coming years.</p><p>Moreover, with Buzz declared dead, Google is going to consolidate its social features into Google Plus. So the social features of Google Reader (follow and share) have to work together with Google’s new baby.</p><p>But from the viewpoint of an old Reader user, Google is not doing it the right way, because sharing in Google Reader is an act of curation. It generates a content stream that is filtered and annotated by someone. On the other hand, sharing in Google Plus/Twitter/Facebook is mixed up with personal sentiments, jokes and cat photos. The content stream could be part of our social streams, but they are not the same.</p><p>For example, when you see a product review on Engadget and you want to know whether it’s a good idea to purchase it now, you share it on your social stream and your friends will give you their opinions. But you don’t want to do that with your digital curation. 
In most cases, you share an article on Reader because you think the article is valuable and relevant, and that others should read it too.</p><p>Therefore, Google should make this distinction by at least keeping the curated content stream of its users. However, the redesign of Google Reader has effectively stripped away this entire level of meaning and combined it with a regular social network. Now we have lost another handy tool for distilling some truly inspirational ideas and insightful thoughts from the excessive amount of social streams flooding our screens.</p><p>That’s what I miss most about the old Reader.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ed41ecc86314" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/kevin-foxs-offer-to-google-reader-ed41ecc86314">Kevin Fox’s offer to Google Reader</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Thoughts on Windows 8]]></title>
            <link>https://medium.com/leethree/thoughts-on-windows-8-68e770894649?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/68e770894649</guid>
            <category><![CDATA[touchscreen]]></category>
            <category><![CDATA[gestural-interfaces]]></category>
            <category><![CDATA[ui-design]]></category>
            <category><![CDATA[software-design-critique]]></category>
            <category><![CDATA[paradigm]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Thu, 29 Sep 2011 14:44:00 GMT</pubDate>
            <atom:updated>2017-08-02T16:05:35.126Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="Windows 8 Start screen" src="https://cdn-images-1.medium.com/max/500/0*tbJXCYdw9T1eKnUa.jpg" /></figure><p>When Windows 8 was demoed in detail at the <a href="http://www.buildwindows.com/">BUILD conference</a>, it soon became the topic of the day. All of a sudden, people started talking about how cool the “Metro style” UI is and how it could become a game changer for the tablet market.</p><p>I spent a few hours watching all the keynotes and some talks about the “Metro style” UI at the BUILD conference. Microsoft has done an incredible job of showing the potential of both Metro UI and Windows. This certainly resolved some concerns about Microsoft’s tablet strategy. However, after I installed and played the Developer Preview in a virtual machine for some time, I found that Windows 8 raised even more doubts than answers.</p><p>(I don’t really know the reason why “Metro” is always followed by “<em>style</em>”. I guess this shows some reservation on Microsoft’s part about Metro UI. Here I call it “Metro” just for the convenience of my discussion.)</p><h4><strong>Metro redesigned and improved touch UI</strong></h4><p>The live tiles of Metro <em>work like nothing else</em>. It’s direct. It’s informative and it’s beautiful. There are no piles of windows. There are no arrays of static icons (except for the legacy applications). There are no menus. Basically, Metro abandoned everything from the <a href="http://en.wikipedia.org/wiki/WIMP_(computing)">WIMP</a> era.</p><p>On the other hand, designed from the ground up, Metro incorporated many effective new techniques for touch-screen interfaces.</p><ul><li>Pinching out for overview, which is similar to <a href="http://en.wikipedia.org/wiki/Zooming_user_interface">Zooming UI</a>, would make high-level content arrangement more direct (without entering settings or editing mode).</li><li>Selecting, moving and removing are easier. All these operations work just as you would intuitively expect. 
Tiles are like boxes: you can drag them around without explicitly pushing any button.</li><li>Holding a tile with one finger while panning around the start screen with another finger to move the tile is the cleverest multi-touch interaction technique I’ve seen in a long time. This is the potential of multi-touch, and I can’t imagine how many more such techniques we will get in the coming years.</li><li>Edges are claimed as important screen real estate, which brings back an important advantage of the desktop environment we are all familiar with: the mouse cursor can reach targets on corners and edges quickly and accurately (by <a href="http://en.wikipedia.org/wiki/Fitts&#39;s_law">Fitts’s Law</a>). Earlier touch interfaces didn’t make full use of screen edges until we gradually realized that swiping from screen edges could achieve a similar result. We’ve seen this gesture used to activate features such as the Android top-down menu, the webOS card view and the BB PlayBook app switcher. Windows 8 takes it one step further by reserving the left and right edges for system features (go-back and Charms) and the top and bottom edges for app features (show App Bar), which essentially makes it a new paradigm that others could follow.</li></ul><figure><img alt="Edges reveal off-screen UI" src="https://cdn-images-1.medium.com/max/500/0*YOLAvJhlwV7rLXk1.jpg" /><figcaption>From Jensen Harris’s presentation: <a href="http://channel9.msdn.com/events/BUILD/BUILD2011/BPS-1004">8 traits of great Metro style apps</a></figcaption></figure><p>In summary, the Metro UI of Windows 8 is a huge expansion of the original Metro UI of Windows Phone 7, built on top of the practices of many other existing touch interfaces. 
Windows 8 Metro introduced many new interaction techniques to remedy the weaknesses of the iPhone, Android and other touch interfaces, which will help to establish a more complete set of interface paradigms for future touch-based systems.</p><p>Although some significant issues, such as undo/redo and the lack of discoverability, are still waiting for solutions, Microsoft did a pretty good job making great interfaces this time as a latecomer to the game. The user experience competition is becoming more and more intense, which we, as customers, surely love to see.</p><p>As a researcher interested in UX, I welcome Metro UI’s leap forward to better touch UI. However, considering Windows 8 as a whole, I should point out some of my concerns.</p><h4><strong>“No compromise” is a compromise in itself</strong></h4><p>People love features. People love cool ideas. People love Sci-Fi style devices that consolidate many amazing functions into one single magical box. But sadly they don’t always work well in the real world.</p><p>The <a href="http://www.motorola.com/Consumers/US-EN/Consumer-Product-and-Services/Atrix-Accessories-Page/Atrix-Accessories">Moto Atrix</a> is an awesome super phone that can be docked into a laptop, which sounds like a perfect combination. But it turns out a phone-transformed laptop is far from beating a cheaper netbook, which makes the Atrix not that much different from any other Android phone.</p><figure><img alt="Windows 8 for all PCs" src="https://cdn-images-1.medium.com/max/500/0*R54E1dkwOa-2YI3D.jpg" /><figcaption>Also from Jensen Harris’s presentation: <a href="http://channel9.msdn.com/events/BUILD/BUILD2011/BPS-1004">8 traits of great Metro style apps</a></figcaption></figure><p>The central issue, as I’ve mentioned a few times, is that technologies diverge, not converge. When you want to do multiple things well at the same time, you’ll always have to make trade-offs between them. This makes products and services generally more and more specialized. 
This basic understanding of how the world works makes Microsoft’s “no-compromise” promise just an illusion.</p><p>I worry that Microsoft seems to be afraid of committing to tablets. “Windows reimagined” becomes a weird two-faced monster trying to support two instruction set architectures, both touch and pointing input, various screen resolutions, as well as potentially thousands of hardware configurations with form factors ranging from small tablets to wall-sized flat displays.</p><p>Is Windows 8 really capable of optimizing everything with its current design? Although I agree that Metro UI is highly elastic with its grid-based design (the grid is one of the most common elements of modern design), I can’t imagine that tweaking around the processes and file systems within the desktop mode of a tablet would be a good idea.</p><h4><strong>Tablets and desktops are inherently different</strong></h4><p>The Metro principle is “touch first” while the desktop environment is designed for keyboard and mouse. You can’t just put them together to make a system that is good for both types of input. It can’t be as simple as that.</p><p>When people are using a tablet or a desktop computer, the contexts are different, so users have very different expectations. Tablet users usually need just a handful of simple features such as tweeting or checking emails. But the experience needs to be absolutely “fast and fluid”. Desktop users want to get things done. But their tasks are often complicated, requiring multitasking with several heavy-duty applications.</p><p>The desktop environment and touch don’t work well together. This is proved by our disappointment with <a href="http://en.wikipedia.org/wiki/Tablet_personal_computer">“old-style” tablet PCs</a> and touchscreen PCs with Windows 7 on them. People don’t want to be bothered by partitions and drive letters or even folder names. 
Just think about how frustrated a tablet user would be when he has to open Task Manager to see what’s going on with the system.</p><p>Metro and keyboard/mouse won’t work well together either. Why would anyone want a “touch first” version of a flight tracker if a full-featured web app is already available in any browser you choose? You feel like touching the interface, but you can’t. Then you want to make things work the way you normally do on your PC. But you won’t be able to open two Word documents side by side for reviewing. (But maybe you could “Snap” a 400px-wide Excel next to your PowerPoint window, with big touch-based buttons that auto-hide themselves to make the interface more “chromeless.”) Sure you’ll miss the old Windows desktop.</p><figure><img alt="Snap mode of Metro style" src="https://cdn-images-1.medium.com/max/480/0*TowvaEfIu3xGoBfB.png" /><figcaption>Snap Mode</figcaption></figure><p>When the demonstrator tapped on a small Excel tile, the Metro UI dissolved and desktop Excel popped out. At that moment, I was shocked and very confused. With two completely different UIs on the same device, users will constantly face dilemmas like “am I going to buy antivirus software for my tablet?” “should I use the Facebook app or the Facebook website?” (Which is the real Facebook?) or something even more puzzling like “how could I insert my Endnote bibliographies into a Metro style email client?”</p><p>Eventually you’ll either give up on Metro or never want to enter the desktop mode. You can’t live with a “schizophrenic” device for very long without going mad yourself.</p><p>It’s really awkward to watch a Microsoft designer killing a non-responding app inside the desktop UI while evangelizing the “Metro style Design Principles”, which include the bullet points “Do more with less” and “Win as one”. 
I hope he really believed what he was saying (or maybe he shouldn’t).</p><figure><img alt="Metro style Design Principles" src="https://cdn-images-1.medium.com/max/500/0*iX-eX1NIpeZiIVDd.png" /><figcaption>From Samuel Moreau’s presentation: <a href="http://channel9.msdn.com/Events/BUILD/BUILD2011/APP-395T">Designing Metro style: principles and personality</a></figcaption></figure><h4><strong>Metro style is too stylish as a framework</strong></h4><p>When I showed one of my friends the Metro style Start screen with its colorful moving tiles, his first comment was: “It looks like advertisements took over your computer.”</p><p>It was embarrassing, wasn’t it?</p><p>It makes me wonder: what pops into your mind when you think of “Metro style”? Colorful moving tiles? Horizontal scrolling? 120px left margins? Big 42pt Segoe headings? Huge full-screen background pictures?</p><p>What about iOS style? Rounded square icons, glowing buttons, the left pane of iPad apps, or the Helvetica Neue font? What about Android?</p><p>I believe Metro style is the most distinct of them all. No other OS design guideline is as specific as Metro style. In fact, Metro style is so distinct that most Metro apps look virtually the same. Your app will look rather bizarre if you try to show some personality within the framework of Metro style. In other words: Metro style could be a big obstacle to the personalities of apps.</p><p>There was actually a session called “Designing Metro style: principles and <em>personality</em>” at the BUILD conference. But curiously, the word “<em>personality</em>” is never used throughout the talk except in the title and final slides.</p><p>This sounds like an overstatement. 
But I just did a bit of comparative study to show that it’s not:</p><figure><img alt="Apps on different platforms" src="https://cdn-images-1.medium.com/max/500/0*M3OA3UjBbRzrHptJ.png" /></figure><p>To comply with the restrictions of Metro style, these apps have to compromise their own style of interface design. Frankly speaking, there’re two ways to prevent this: 1) Completely ignore Metro style. 2) Apply Metro style to all apps on other platforms.</p><p>The former would be a great choice if you’re developing a game such as Angry Birds. But on Windows 8, you’re still required to make a narrow version of your game for the “Snapped” view, which could be quite challenging for a hardcore game like Need for Speed. For regular apps, prepare to redesign with Metro style and struggle between it and your own style.</p><p>On the other hand, Metro style hasn’t really proven its capability. When Apple demoed the iWork suite and iMovie on the iPad, everyone wowed. Whether people will actually work on an iPad is secondary. What matters is that these big apps showed developers the potential of a tablet-sized touchscreen, proved that there’s enough space in the interface platform for apps to shine, and set the reference standard terribly high. Microsoft should probably do the same with Word, Excel and PowerPoint to show that they work well with the Metro style UI, which I highly doubt.</p><p>Metro style is stylish. However, the sad truth about being too stylish is that style is not timeless. Styles emerge and decay, while the underlying principles remain and develop. That’s why “<a href="http://www.vitsoe.com/en/gb/about/dieterrams/gooddesign">good design is as little design as possible</a>”.</p><figure><img alt="IKEA model room vs. Smart-ologic Corian® living" src="https://cdn-images-1.medium.com/max/500/0*f6Ty8OvfrDcchIkP.jpg" /><figcaption>Left: IKEA model room, photo by <a href="http://www.flickr.com/photos/anshu_si/">anshu_si</a>. 
Right: Smart-ologic Corian® living by DuPont.</figcaption></figure><p>A stylish and futuristic living room may look much cooler with vibrant colors and kinetic lighting. Entirely new designs could make your everyday life exciting. But when we put our everyday routine in this context, we find ourselves alienated by its style. We can’t find furniture which perfectly matches it. It becomes very strange for any ordinary object to even exist in this room, including ourselves once we’re bored with the design.</p><p>The same applies to Metro style. It is futuristic but dictatorial. It is intuitive but unnatural. It could be a perfect system on its own. However, it’s so stylish that other styles are not welcome, or even not tolerated.</p><p>On the contrary, a good platform or framework should be simple and non-intrusive, like a well-designed living room. Even though we should still try to make the furniture and decorations match each other, it wouldn’t be inappropriate for anyone to place an extra chair in the room. It wouldn’t be weird for anyone to change the colors of the curtains or sofas. Because the design is flexible and tolerant, it can adjust itself to changes and make everything work together.</p><p>Similarly, the design of an operating system should welcome change and diversity. Consistency is important, just like in our language. But every app running on the system should be given a chance to fully express itself with its own ideas.</p><p>Only with diverse styles of software can we explore more possibilities in the universe of interfaces. In this way, new designs and techniques can be invented and developed. 
The platform, in turn, could constantly renew itself and evolve.</p><p>In this way, the entire system could be truly alive.</p><p><em>For more on standards and diversity, please read:</em></p><p><a href="https://medium.com/leethree/a-step-back-for-a-leap-forward-8da3a84d5d2">A step back for a leap forward</a></p><p><strong>Update: </strong>I just came across this video from IKEA, which perfectly illustrates how simple things can be more flexible and <em>smarter</em>.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FBQjBrt9LriY%3Ffeature%3Doembed&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DBQjBrt9LriY&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FBQjBrt9LriY%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/3f059990b3146c9927a8a023655b3eda/href">https://medium.com/media/3f059990b3146c9927a8a023655b3eda/href</a></iframe><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=68e770894649" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/thoughts-on-windows-8-68e770894649">Thoughts on Windows 8</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[On the dark side of the Internet: Part II]]></title>
            <link>https://medium.com/leethree/on-the-dark-side-of-the-internet-part-ii-35a2103c2169?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/35a2103c2169</guid>
            <category><![CDATA[copyright]]></category>
            <category><![CDATA[internet]]></category>
            <category><![CDATA[privacy]]></category>
            <category><![CDATA[notebook]]></category>
            <category><![CDATA[essay]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Tue, 24 May 2011 11:37:34 GMT</pubDate>
            <atom:updated>2017-08-02T15:55:59.319Z</atom:updated>
            <content:encoded><![CDATA[<p><em>Continued from On the dark side of the Internet: Part I:</em></p><p><a href="https://medium.com/leethree/on-the-dark-side-of-the-internet-part-i-5f2583781e63">On the dark side of the Internet: Part I</a></p><h4><strong>Copyright and culture</strong></h4><p>Repeatedly emphasizing “200 years of intellectual property law” (p.145) is a joke. It’s like citing hundreds of years of monarchy to justify the king against the republicans. Arguing that something has existed for a long time does not make it more legitimate.</p><p>I think there’s no doubt that intellectual property should be protected and plagiarism is immoral. I hate it when someone reposts articles from other blogs without permission, or even without a source link, which is commonly seen across Chinese websites.</p><p>However, this does not imply that our current mechanism of copyright protection is the perfect solution to the problem. I believe that copyright is now overused as a tool for big companies to make money by dominating the market, instead of promoting culture in order to benefit everyone, including the creators.</p><p>This is not a problem of the Internet. In other words, the Internet (or Web 2.0) does not provoke the theft of intellectual property. It’s the drawbacks of copyright policies that created piracy, and then the efficiency of communication over the Internet made it much more difficult for big media companies to maintain their dominance.</p><p>Here’s a graph showing how the copyright term in the US has been extended again and again and again over the last two centuries (courtesy of <a href="http://en.wikipedia.org/wiki/File:Copyright_term.svg">Wikipedia</a>):</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/500/0*rGRKR-6jQaUiP5CB.png" /></figure><p>We really need to reconsider our way of protecting creativity. 
There’re plenty of books on this topic [3] and I’m not an expert on it.</p><p>The media companies described in the book were so obsessed with copyright protection that they totally forgot there were more possibilities. <a href="http://www.hulu.com/">Hulu</a>, for example, is the answer to the Internet from several US media giants, followed by <a href="http://www.vevo.com/">VEVO</a>. People can enjoy premium content on the Internet legally and for free. And media companies can still generate revenue from advertisements.</p><p>Another <a href="http://www.ted.com/talks/margaret_stewart_how_youtube_thinks_about_copyright.html">example</a> is how YouTube, which is considered the public enemy of media companies, works with copyright holders like Sony to monetize illegal uses of copyrighted material. This sounds like a surprisingly crazy idea. But it’s clever enough to make everybody happier than we otherwise could have been.</p><p>Moreover, the huge success of the movie Avatar showed that with help from 3D and IMAX technology, cinemas can still attract customers through the experience they offer.</p><p>Copyright law is an old way to protect intellectual property. If it no longer works, we should be able to find a better way.</p><p>[3] To give an example: <a href="http://www.amazon.com/exec/obidos/ASIN/1594200068"><em>Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity</em></a> by Lawrence Lessig.</p><h4><strong>Privacy issues</strong></h4><p>An age of total digital surveillance is coming, said Keen (p.183). Now that we have cloud computing and take our GPS data trackers everywhere, are we pushing ourselves into jeopardy?</p><p>I agree that privacy is important to every one of us. 
Unless the world becomes perfectly transparent (everyone knowing everything about everyone else), we need this kind of information asymmetry to protect us from potential intrusion.</p><p>But there’re two sides to the problem: what is considered privacy, and to what extent should we protect it?</p><p>For the first part, I would argue that there’s actually no clear distinction between private information and non-private information. Every detail about us tells a story of what kind of person we are. Sherlock Holmes could find out the occupation of a passer-by with just a glimpse. The reality is perhaps much less legendary, but an experienced thief might know you’re on vacation by looking at your mailbox.</p><p>Everything you’ve done has made a difference in this world, and it is evidence of how you live in it. Some information may be less straightforward than the rest. But when a large amount of information is gathered, nothing could be too private to be found out.</p><p>That is to say, the only way to guarantee absolute privacy is to isolate ourselves from the civilized world entirely, which, however, is unlikely to be a good choice for anyone. We have to give away part of our privacy to enjoy the benefits of our family, our friends, our community and our society. A close friend who knows you better can help you more than a stranger. The more connections we have with the rest of the world, the more we can get from others, but also the more information about ourselves is available to the world.</p><p>Therefore, when we make friends with our classmates, we are risking some of our privacy. One of them might gossip around the school about how you fell asleep under the dean’s nose. Similarly, when we open an account at a bank, we are risking part of our privacy, since an ill-natured clerk could give our information to some travelling salesmen. These unfortunate situations are rather unlikely to happen. 
But you don’t really have to trust the people around you or any organization, though refusing to trust can make your life much more difficult.</p><p>The same logic holds true for the Internet. Google claims that they collect your information because they need it to personalize your experience and make their products better. This is true. But it’s also possible that their database could be hacked some random day, just like the Sony PlayStation Network [4]. So if you don’t trust Google, why are you still using it?</p><p>Because we all want to live a better life. By collecting your personal information, Google can show you fewer irrelevant advertisements, and Netflix or Amazon can recommend movies or books you might be interested in. The Internet is becoming one of the supporting industries of our world, just like financial institutions.</p><p>I’m not saying that Google can do whatever it wants. In fact, people at the Internet companies are more and more aware of the significance of the privacy issue. If consumers no longer trust their product, they would lose their users and the company would end up in serious trouble. As a result, more trustworthy companies will arise and replace irresponsible ones.</p><p>In summary, the Internet is collecting our personal information. While we embrace the convenience technology brings to our daily life, people will learn that we are not supposed to take candy from strangers. 
If we take our privacy very seriously, the Internet companies will also take it seriously, and the age of total digital surveillance will never come true.</p><p>[4] Personal details from approximately 77 million accounts were stolen as the result of an <a href="http://www.telegraph.co.uk/technology/news/8475728/Millions-of-internet-users-hit-by-massive-Sony-PlayStation-data-theft.html">“external intrusion”</a> on Sony’s PlayStation Network and Qriocity services in April 2011.</p><h4><strong>What we should really worry about</strong></h4><p>I couldn’t agree more with Keen on one part: “we are easily seduced, corrupted and led astray.” (p.196)</p><p>There’s no fundamental difference between the Internet and any other type of technology. Yet it’s the newest one and probably the most powerful one we’ve ever invented.</p><p>Communication is faster than ever. One video, one picture or one phrase can go viral in just a few hours or even less. The greatest ideas can travel around the world at the speed of light, as can the most dangerous conspiracies.</p><p>Our respect and appreciation for talent and intellect won’t die because of this. On the contrary, more people can enjoy better work at less cost. But the problems of humanity could also be amplified by the Internet into big troubles for our world. A few of them are already discussed in the book: gambling, pornography, addiction, junk information, narcissism etc.</p><p>However, the solutions to these problems are never simple. Regulations and rules of the Internet work, but they are only an expedient.</p><h4><strong>Wrap-up</strong></h4><p>“We need to find a way to balance the best of the digital future without destroying the institutions of the past.” (p.185)</p><p>This book is good at pointing out the problems of the Internet. But the mistake is, there’s no trade-off or balance between “past” and “future”. Technology revolutions are one-way tickets. 
There’s no turning back, or half-way back.</p><p>The past can never be wiped out by technological advancement, because every piece of technology is invented on the basis of the legacy of the past. But the past way of thinking is destroyed as soon as the rules of the game are fundamentally changed. What we need is to rethink the past, to find out what we can learn from this revolution. It’ll become clear that some problems and doubts are just misunderstandings of the present and the future. We have to think beyond the past and present to find the real solutions to real problems.</p><p>The Internet is not an alien invention. It’s deeply rooted in human society, which is developing and improving continually. The dark side of the Internet reflects the dark side of our society. Imperfection is highlighted by the rapid change of our world. Even if we could blame the Internet for these issues, technology alone could not solve them.</p><p>Therefore, the Internet is not a new threat to our culture, nor were rock’n’roll, TV, or computer games. Our major threat remains the same: ourselves.</p><p>(All page numbers from the <em>Currency 2007 edition</em>)</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=35a2103c2169" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/on-the-dark-side-of-the-internet-part-ii-35a2103c2169">On the dark side of the Internet: Part II</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[On the dark side of the Internet: Part I]]></title>
            <link>https://medium.com/leethree/on-the-dark-side-of-the-internet-part-i-5f2583781e63?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/5f2583781e63</guid>
            <category><![CDATA[internet]]></category>
            <category><![CDATA[mass-media]]></category>
            <category><![CDATA[essay]]></category>
            <category><![CDATA[professionalism]]></category>
            <category><![CDATA[notebook]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Wed, 18 May 2011 12:26:00 GMT</pubDate>
            <atom:updated>2017-08-02T15:56:24.417Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="The Cult of the Amateur" src="https://cdn-images-1.medium.com/max/99/0*ADJKG7D2SToeb8B3.jpg" /><figcaption><em>The Cult of the Amateur</em></figcaption></figure><p>We live in a world of ongoing digital/information revolution and we can truly feel its impact on our everyday life. But are we changing in a good way or a bad way?</p><p>While there’re thousands of anthems for the Internet, far fewer books argue that the Internet (and Web 2.0) is evil. <em>The Cult of the Amateur</em> is one of them.</p><p>For me, reading this book by Andrew Keen over the last few weeks was a rather strange experience. On the one hand, there’re lots of examples, figures and compelling arguments in the book that show the dark side of the Web 2.0 world we’re living in. They are true stories. However, on the other hand, I’ve found that I do depend on these new inventions, like Wikipedia, to work more efficiently and to live a better life.</p><p>This mixed feeling kind of forced me to rethink the various issues about the Internet discussed in the book. In the end, the Internet in the real world convinced me that the world is not working the way Keen suggested.</p><p>But the answer is not as easy as it may seem.</p><h4><strong>Truth and authority</strong></h4><p>Wikipedia is a major leading force of Web 2.0 and also, most of the time, the focus of related controversies. The problem is: can a social worker really be considered credible in arguing with a trained physicist over string theory? (p.44)</p><p>Although Keen thought this was obviously “chilling and absurd”, I think the answer should be “yes”.</p><p>Because the truth is credible in itself, regardless of the way it’s expressed. And a POV (point of view) is valuable if it’s well reasoned, regardless of the authority of the speaker.</p><p>I think Keen must have misunderstood Wikipedia in a big way. 
Wikipedia does <em>not </em>allow original research [1], that is to say, all the material on Wikipedia should be verifiable against published sources that directly support it. That’s why Wikipedia is not built for everyone to say whatever they want. It is designed to be a free compilation of reliable information that is at least as credible as the <a href="http://www.britannica.com">Encyclopædia Britannica</a> [2].</p><p>By empowering the amateur in this way, instead of undermining the authority of the experts (as Keen suggested), we are helping to reveal the truth as it is.</p><p>That is the mission of Wikipedia. However, that is not yet the status quo, and Wikipedia keeps improving. There’re many entries on Wikipedia that are not sourced, and some of them could be totally wrong. But most of the information on Wikipedia is more accurate than you could have ever imagined in a world without it. It’s good enough for most of the tasks we face every day.</p><p>What we get from Wikipedia is a completely free encyclopedia which is <em>almost</em> reliable. Every user of Wikipedia should know that. 
If you saw the warning below, but failed your exam by using unverified information from Wikipedia, whom should you blame?</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/429/0*Hn8IRWiQDE4OBbDH.png" /></figure><p>There’re no such things as “professional standards of truth” (p.205); there’s only one “standard of truth”.</p><p><strong>Update:</strong> Here’s an illuminating article on the reliability of Wikipedia: <a href="http://nextbison.wordpress.com/2011/05/18/should-you-believe-wikipedia/">Should you believe Wikipedia?</a></p><p>[1] Please refer to <a href="http://en.wikipedia.org/wiki/Wikipedia:No_original_research">Wikipedia:No original research</a>.</p><p>[2] Britannica, interestingly, allows users to contribute to articles now.</p><h4><strong>When we say “democratization”, we mean it</strong></h4><p>According to Keen, Web 2.0 democratization “is costing us a fortune” and “real businesses” are going to die. The reality is: that is the price of democracy. We must pay it, and we will earn it back.</p><p>There’re people in the Middle East fighting for democracy right now. The situation is difficult and serious. But they keep fighting, at huge expense, including people’s lives. Why do they want civil liberties so much?</p><p>Because democratization is not about record companies losing money or television networks cutting their expenses. It’s about what is right and what is wrong. It’s about the future.</p><p>When a country is controlled by a monarch or a group of noblemen, the country is not free. When culture is controlled by a minority of people, culture is not free.</p><p>It’s lucky for Keen to live in the United States where he can enjoy his beloved “mainstream media”. It’ll be my pleasure to welcome anyone to enjoy the “mainstream media” in a country where almost every TV station, every radio station and every newspaper is owned or directly controlled by the government. 
This may help you better understand what an “Orwellian” culture really looks like and why it should be democratized.</p><p>We may miss the good old days. But I’m afraid there’s no way back.</p><h4><strong>Professionalism and expertise</strong></h4><p>So how are we going to earn things back?</p><p>Despite the fact that democratization is inevitable, real talents and geniuses will find ways to make a living out of their expertise and skills, in a better way. Because the notion of UGC (user-generated content) is to remove the barrier between <em>everyone</em> and <em>professionals</em>.</p><p>If you are bored, you can either watch a series of hilarious YouTube videos or watch a funny show on a cable channel. What is the difference in effectiveness between the two choices? Does it really matter if it’s home-made or professionally produced? If I can read a joke and get entertained to the same extent for free, I don’t understand why anybody should pay for it.</p><p>The general point is, the quality of work is not determined by how professional it is. With UGC sites like blogs, Flickr and YouTube, we now have an open arena where everyone’s ideas and work can compete with each other regardless of the profession and authority of their creators, with a nearly free entrance fee.</p><p>There used to be a hefty “entrance fee”. If you thought your idea was valuable, you had to either pay the publishers and make them happy to get it published, or become an expert in the field, publishing papers and attending conferences to get noticed. Before all these were invented, priceless ideas could only be transmitted by word of mouth, or forgotten.</p><p>The history of how culture flourishes is basically a history of how communication has become easier and faster. The problem we face is not “YouTube or Joost”, “Wikipedia or Citizendium” (p.189). 
This choice does not even exist because we are moving from Britannica only to Wikipedia+Citizendium+Britannica.</p><p>It’s not a life-or-death struggle; it’s an expansion of the spectrum. It’s not about replacing; it’s all about complementing.</p><p>The only problem is that the experts and professionals who are not good enough, but are used to being protected by the “entrance-fee” barrier, are worrying about how to defend themselves against the more talented amateurs in the newly-opened arena.</p><p>On the contrary, the truly excellent professionals should not be worrying at all. If the Tower clerks, who were important to Keen (p.105), are “deeply knowledgeable” enough, they could start a collective blog or online magazine and they would still be cultural tastemakers.</p><p>As for the big record companies: you can’t survive this revolution if you are too greedy to change. The premium you got from keeping everyone else out of the game is gone forever. If you don’t want to lower your prices, then spend more money making better music instead of wasting it suing the rest of the world.</p><p>This sounds cruel. But it is fairer.</p><p><strong><em>Continued on:</em></strong></p><p><a href="https://medium.com/leethree/on-the-dark-side-of-the-internet-part-ii-35a2103c2169">On the dark side of the Internet: Part II</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=5f2583781e63" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/on-the-dark-side-of-the-internet-part-i-5f2583781e63">On the dark side of the Internet: Part I</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Is HP Veer too small?]]></title>
            <link>https://medium.com/leethree/is-hp-veer-too-small-201d312a3317?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/201d312a3317</guid>
            <category><![CDATA[smartphones]]></category>
            <category><![CDATA[review]]></category>
            <category><![CDATA[screen-size]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Wed, 11 May 2011 12:52:00 GMT</pubDate>
            <atom:updated>2017-08-02T15:53:13.848Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/500/0*ZiBQirPYcZVjycAW.jpg" /></figure><p>HP is about to launch a cute little phone, the Veer “4G”. While it’s unfortunate that I’m outside the U.S., I’ve come across <a href="http://thisismynext.com/2011/05/09/veer-4g-review/">Joshua Topolsky’s review</a> of this smartphone. I was really surprised by the final score of “5/10”, which, according to the review, is partially because Joshua thinks “the phone is simply too small”.</p><p>But I don’t think the Veer is too small. It’s just that most other smartphones are too big.</p><h4><strong>Mobile phones are mobile</strong></h4><p>If you want a full-featured computing device, get a laptop. If you want a powerful on-the-go web browsing experience, get a tablet. If you want a device that is always available anytime, anywhere, get a mobile phone.</p><p>That’s why size matters. That’s why there’s a phrase called “form factor”. That’s why different sizes of devices are provided for customers to choose from.</p><p>A relatively small size is essential for any mobile phone. It does not provide additional value to a phone, because it <em>is</em> the value of a mobile phone.</p><p>Furthermore, when phone-calling is “degraded” to the level of a simple app inside smartphones, the “mobile phone” has already become something more about “mobile” than “phone”.</p><h4><strong>How small is “too small”?</strong></h4><p>Different people have very different preferences about size. Mobile phones should fit our hands and people should feel comfortable with them. One could still argue that no one would like a tiny phone like the Veer.</p><p>Well, is the Veer tiny? Compared to other smartphones, yes. 
But compared to mainstream feature phones, I’d say not really.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/500/0*SbPNv0XfgJ9SgDrc.png" /><figcaption>(Screenshot from <a href="http://www.phonearena.com/phones/HP-Veer-4G_id5199/size">Visual Size Comparison on phonearena.com</a>)</figcaption></figure><p>This is a simple comparison among three small phones and a huge one. The Veer is a small smartphone. But from the viewpoint of all mobile phones, including feature phones, I think it’s unfair for the Veer to be considered “too small”.</p><p>It’s really interesting to see that the huge Samsung Infuse will launch the same day as the Veer. If it were true that no one would buy a tiny phone like the Veer, how could I imagine anyone buying such a gigantic smartphone?</p><h4><strong>Trade-off between “smart” and “mobile”</strong></h4><p>As Joshua pointed out, obvious compromises were made for the size of the Veer. When talking about smartphones, phone makers and tech reviewers love discussing the processors they use and the size of the memory and internal storage, just like they did with PCs.</p><p>I’ve got no problem with all these performance comparisons. However, I just don’t get the idea of “you could get a more powerful device for the same price”. Does <em>everyone</em> need a dual-core processor or 720-whatever-p video playback capability on their <em>phone</em>?</p><p>Don’t forget that, after all, smartphones are mobile phones, no matter how “smart” they are. Technology enthusiasts will always dream about “smarter” phones until they get the “smartest” phones. They don’t really care about size unless performance is affected, which often leads to super-sized, super-powerful “alien” devices.</p><p>That’s fine. But for an average user, a “not-so-smart” phone is good enough. It’s perfectly reasonable for anyone to get a “more mobile” smartphone instead of a “smarter” one.
“The screen is very small” is a feature, not a bug.</p><p>There’s a clear trade-off between “smart” and “mobile”. Don’t assume that more people are going to choose the “smarter” end immediately. And don’t forget that more than half of all people are still using feature phones for now, even in the U.S. (<a href="http://blog.nielsen.com/nielsenwire/consumer/smartphones-to-overtake-feature-phones-in-u-s-by-2011/">Nielsen, 2010</a>)</p><h4><strong>Other aspects of the Veer</strong></h4><p>The Veer is cute and the Veer is small. There are plenty of people who love that, including me. There are detailed reviews with completely different conclusions, such as <a href="http://www.precentral.net/review-ATT-hp-veer-4g">the one on PreCentral</a>. But I have to make it clear that I’m not saying the Veer is going to be a huge hit (or not).</p><p>Though I’ve never used a WebOS phone, WebOS seems to have a very neat user interface as far as I know. It’s still not a popular mobile OS, and probably not as mature as iOS and Android. Its performance and stability can greatly influence the user experience of the Veer. On the other hand, there are currently far fewer apps for the Veer than for its rivals.</p><p>Therefore, in my opinion, if HP does a good job on the software, especially the upcoming WebOS 3.0, the Veer could be a really nice choice for people who want a smartphone that is relatively smaller than the others.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=201d312a3317" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/is-hp-veer-too-small-201d312a3317">Is HP Veer too small?</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How would I change Picasa]]></title>
            <link>https://medium.com/leethree/how-would-i-change-picasa-6dbe36c6f208?source=rss----13fd360a780c---4</link>
            <guid isPermaLink="false">https://medium.com/p/6dbe36c6f208</guid>
            <category><![CDATA[software-design-critique]]></category>
            <category><![CDATA[google]]></category>
            <category><![CDATA[picasa]]></category>
            <dc:creator><![CDATA[Sirui Li]]></dc:creator>
            <pubDate>Sun, 01 May 2011 14:54:42 GMT</pubDate>
            <atom:updated>2017-08-02T15:50:18.342Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="Google Picasa logo" src="https://cdn-images-1.medium.com/max/500/0*u9r6sYXfuSimBH4Y.jpg" /></figure><p>I don’t think I’m a photography enthusiast. I don’t take photos very often, but I still have thousands of photos on my hard drives. I believe organizing photos is a headache for many of us, especially people who always take their cameras with them.</p><p>I use Picasa to organize my photos and upload them to Picasa Web Albums. Generally speaking, Picasa is quite friendly and easy to use. However, getting tons of photos in order with it is still not a pleasant experience.</p><h4><strong>Name tags</strong></h4><p>Face detection is a great feature in newer versions of Picasa. We can group photos of friends based on their faces. But the feature became terribly confusing once I started to use it.</p><p>The first problem is that name tags are linked to Google Contacts by default. This can cause a lot of trouble, because the two don’t have the same implication.</p><p>People love to take photos with celebrities. I would be very proud to tag Michael Schumacher in my photos (instead of just “ignoring” him). But he is very unlikely to be one of my contacts, and it’s also pointless for me to make a personal album for him, because I might never take another picture with him for the rest of my life.</p><p>In general, there are people we know and would like to tag in our photos, but who are not our contacts; they are not part of our lives. We meet people by chance. Their photos are part of our memories, but it’s simply impractical to create an album for every single person we’ve ever known.</p><p>Moreover, relationships change over time. You may not really know someone when you meet him for the first time, but later you might want to make a personal album for him from his previously ignored face tags.</p><p>This leads to another major problem with name tags: you can either tag someone or ignore him or her.
That’s not a big problem in itself. The fatal flaw is that you can’t easily bring back a face tag you’ve ignored.</p><p>All the ignored faces are grouped into one album. If you accidentally ignore a face, the only way to recover it is to locate it in the “ignored faces album”(#). But there could be tons of ignored faces, because in my experience the majority of faces in your photos belong to passers-by. Just think about the photos you took at exhibitions or on downtown streets.</p><p>(#) It’s very strange that sometimes you can’t find this album after every face is tagged. I don’t know if this is a bug or a feature.</p><h4><em>My suggestion:</em></h4><p>What we need is simple: show all the ignored faces in a photo or a folder. If we could easily find a face we ignored before, we could tag it anytime. This is a feature that should have been built in when face detection was first released.</p><p>In addition, add a “reset” or “rescan” button for photos and folders. Then make name tags deletable, so that users can completely delete name tags they don’t want if they mess things up. Moreover, some detected faces are simply not faces. They should be deleted, not “ignored”.</p><p>The major design fault behind these problems is that face tagging is not a one-off action. Relationships change, and so do name tags. Users need to keep maintaining their name tags to get the most out of face detection.</p><p>This seems much more complicated than the current design. Yes, it is. But when the face detection and clustering mechanism is not reliable enough, it’s definitely more reasonable to empower users and give them more control.</p><h4><strong>Folders and modifications</strong></h4><p>My mum has asked me the same question several times: why can’t she find a modified photo in Windows Explorer when she can see it in the Picasa photo library?</p><p>Because Picasa doesn’t automatically save your modifications to disk.
You’ll need to click “save”, which saves the changes and makes a backup of the original photo.</p><p>Picasa acts like an abstraction layer above the physical files. If you keep your actions inside Picasa, everything should be perfect. But if you do something with the actual files (rotate them or move them), you’ll find that things get messed up inside Picasa. Maybe you’ll get a 180-degree-rotated photo, or lose all your modifications to many photos.</p><p>I understand that this abstraction is necessary, because it’s difficult to maintain consistency between what’s inside the application and what’s actually in the file system. It would work fine if we had a closed file system. But sadly, we don’t.</p><p><em>My suggestion:</em></p><p>Make a compromise: highlight unsaved photos and prompt for saving when the user tries to navigate away from the application; show the conflicts and inconsistencies between the application and the file system and let the user fix them.</p><p>Or take a brave leap: encapsulate the folders of photos, make them inaccessible to average users, and provide convenient ways for users to export the photos or modify them with external applications.</p><p>I don’t think there’s a good way in between.</p><h4><strong>Version control</strong></h4><p>I was really surprised when I found out that Picasa makes a backup after I rotate a photo and save it. That is to say, a portrait photo takes twice as much space to save as a landscape one. Thus, I have to manually delete all the .picasaoriginals folders after fixing the orientation of my photos, to keep Picasa from wasting my disk space.</p><p>On the other hand, this type of version control doesn’t work with external photo editors. When I modify a photo with Photoshop, I have to save the original photo manually, and then I have two very similar photos in my Picasa library: the modified one and the original one.
Then I have to tag the faces twice, and I must be very careful with the pair when I want to create a Picture Collage or start a slideshow.</p><p><em>My suggestion:</em></p><p>Photo stacks would be a much better way to control versions of a photo. Sometimes we take several pictures to make sure we have the best shot. These photos should be grouped together instead of scattered around, and we should be able to choose a cover photo for the stack. A stack of photos could then be treated as a single photo with different versions (including different shots, originals, enhanced versions or even different sizes).</p><p>This could be done automatically or manually. Either way, the user should be allowed to make changes to stacks easily.</p><p>(I’ve been told that there’s a stack feature in Adobe Photoshop Lightroom, but I haven’t had a chance to try it yet.)</p><h4><strong>External drives</strong></h4><p>When you have tons of photos, it’s not very likely that all of them are on your built-in hard drive. Picasa supports photos on external drives, but a big issue is neglected, and it has caused me a lot of trouble: the problem of drive letters.</p><p>Picasa watches the specified folders for changes. It imports new photos and removes photos that no longer exist. Picasa retains the information when an external drive is disconnected, and the photos reappear immediately when the drive comes back online.</p><p>However, drive letters change in Windows.</p><p>For example, say you’ve got an external drive with a folder for photos named “F:\Photos”. It was imported into Picasa, and Picasa took hours to scan all the photos inside the folder. You disconnected the external drive afterwards.</p><p>One day, you tried to copy some photos from your friend’s hard drive. The hard drive was connected and you opened Picasa, and then tragedy happened: Picasa found that there was no folder named “Photos” on “F:”, because that was your friend’s disk.
Thus Picasa deleted all the information about “F:\Photos” from its database. The next time you connect your external drive, Picasa will take several hours to rescan everything. And once face detection is finished, you’ll have to “ignore” all the faces you had previously ignored.</p><p>This really happened to me, three times. I felt terrible when Picasa threw my data away again and again and again, simply because the drive letters had changed.</p><p><em>My suggestion:</em></p><p>Don’t make any stupid assumptions about the drive letters of the Windows file system. They’re unreliable and change from time to time. When an external drive is disconnected, show its folders as “off-line”. If a folder disappears, as happens when your friend’s hard disk is connected to your computer, don’t just prune the database without asking the user. And most importantly, provide an option that allows the user to link off-line photos back to their actual folder on disk in case the drive letter has changed.</p><p>Again, I think features for external drives were <em>supposed to be</em> designed like this in the first place.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6dbe36c6f208" width="1" height="1" alt=""><hr><p><a href="https://medium.com/leethree/how-would-i-change-picasa-6dbe36c6f208">How would I change Picasa</a> was originally published in <a href="https://medium.com/leethree">LeeThree on UX</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>