Category: Tech Opinion


By Max Neopikhanov

(This article has been edited to correct the 116ms figure quoted from Eurogamer.)

Nintendo is poised to achieve another hit with its upcoming Wii U console, thanks in part to its bet that consumers will become enthralled by the Wii U’s tablet-like game controller.  Spurred by the overwhelming success of tablets in the consumer electronics market and the growing interest in cloud-based gaming, Nintendo wants to bring these consumers into the console market the same way it brought in new consumers with the original Wii’s motion controller.

Sony and Microsoft soon tried to emulate Nintendo’s success with their own versions of motion gameplay.  This time around, it seems that Nintendo has taken a page out of Sony’s playbook and turned what was originally a novel, complementary remote play feature of Sony’s PlayStation Portable into a prime selling point.

The idea is simple enough: play all your favorite games straight through the controller.  No television required; just stay in close proximity to the console and all the content will be streamed directly to the controller’s screen.  This was the original promise of remote play when Sony released the service a few years ago.  But between the high cost of the PlayStation 3, the PSP’s lack of a second analog stick and additional trigger buttons, and its underpowered 802.11b wireless card, remote play never quite reached its potential as efficient local game streaming.

Nintendo has designed the Wii U to overcome all the issues Sony initially had with the PSP’s implementation of the technology: the gamepad is an essential, purpose-built peripheral, as opposed to the PSP’s supplementary and inefficient role as a mobile screen.  It is a much more effective approach, but also a more expensive and risky one – and risky moves have become Nintendo’s forte over the past few years with the Wii and the 3DS.

That’s not to say that Sony has thrown in the towel and given up its technology.  Sony has taken steps to improve remote play capabilities in its latest PS Vita handheld, but is it too late?

The struggling electronics giant has, at least in theory, all the tools and hardware, in the form of the PS Vita and the PS3, to make the Wii U gamepad selling point moot if it can overcome a few hurdles.

The Technology

The PS Vita communicates with the PS3 through 802.11n over a 2.4 GHz band – effective at about 25-30 feet.

The tech behind the Wii U pad has not been revealed, but a Nintendo representative has gone on record saying the controller will perform best at less than 8 meters, about 26 feet – in the same ballpark as a Vita using remote play.

The range at which the device can communicate is largely unimportant if the latency is so great as to make games unresponsive and ultimately unplayable.  How will the Vita and the Wii U pad stack up?

Based on preliminary tests done by Eurogamer, the Wii U gamepad shows about 116ms of latency – ahead of an HDTV – when playing New Super Mario Bros. three or four feet from the console, which is, quite frankly, incredible.

My personal tests with the Vita, playing God of War over remote play, yielded figures that, while somewhat higher, still allow the game to be enjoyed largely lag-free, with an occasional, very minor lag spike.
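For context on how figures like these are typically produced: lag tests of this sort usually film the controller and screen at a known frame rate and count the frames between button press and on-screen response – an assumption about the methodology here, not something Eurogamer has detailed for this particular test.  Converting a frame count to milliseconds is then simple arithmetic, as in this sketch:

# Rough frames-to-milliseconds conversion used in camera-based lag tests;
# the 60 fps capture rate is an assumption for illustration.
def frames_to_ms(frames: int, fps: int = 60) -> float:
    # each captured frame spans 1000/fps milliseconds
    return frames * 1000.0 / fps

print(frames_to_ms(7))  # ~116.7 ms: a 7-frame gap at 60 fps matches the figure above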

Ultimately, the Wii U edges out the PS Vita when it comes to input latency, but the two are closely tied in signal strength, at least according to preliminary reports.

With games actually being playable and enjoyable on both devices, the next important factor is the image quality.  After all, most gamers wouldn’t appreciate low bit rate content at a very low resolution on their brand new device in the year 2012 – especially when high definition video can be streamed through cellular broadband to pretty much any device.

The Wii U tablet has been reported to boast excellent image quality that appears free of artifacting or other eye-sores often associated with streaming video.

Picture quality on the Vita isn’t as great but is generally serviceable.

The device has three streaming options, ranging from near-perfect, albeit bandwidth-demanding, image quality that effectively requires staying within 15-20 feet of the console, down to very low bitrate, early-2006-YouTube quality for poor signal conditions or for playing at more than 35 feet.

The middle option is versatile enough to be used at most ranges from the console while striking a nice balance in quality between the other two.  Fast-moving scenes may show some artifacts, but everything generally looks pleasing to the eye.

The Wii U gamepad has been reported by a Nintendo representative to have three different power levels based on range from the console, although the exact effect on image quality has not been revealed.

Superior image quality during optimal conditions goes to the Nintendo Wii U, based on empirical tests and preliminary reports.

Perhaps the one area where Sony’s handheld can outperform and outshine the dedicated streaming technology in the Wii U’s gamepad is the Vita’s support for remote play over an internet connection.  If the distance between the PS3 transmitting the content and the Vita receiving it is no more than a few miles (to keep latency down), and the internet connection on both ends is fast and stable, remote play can offer a relatively smooth and enjoyable gaming experience far from your home console.

You may not get decent performance using public Wifi while having a coffee at Starbucks but a dedicated connection at a friend’s house can possibly eke out enough performance to enjoy a game or two while away from home.

Nintendo could in theory announce similar support for the Wii U gamepad but nothing has been mentioned or discussed yet.

The Software

Sony’s remote play may have been around since the middle of the Playstation Portable’s life cycle, yet it seems that the company has only just elevated the feature from the hazy clouds of “neat concept” into the burning stratosphere of potential “system seller” with the release of the PS Vita.

With that said, Sony largely abandoned the feature in the Vita’s early months on the market and has only recently released updates for the God of War and the Ico and Shadow of the Colossus collections.  The company initially demonstrated remote play support on other major titles like Killzone 3, yet none have been released.

Counting the already supported remote play titles released in the PSP era, such as Lair and LEGO Batman, the total comes to only a handful of PS3 titles, plus all PlayStation Network PS1 titles – which can be downloaded on the PS3 and transferred to play natively on the PS Vita, erasing the need to stream them wirelessly.

It may be the right start to a future of growing support and improvements or it could be the unfortunate reality of too-little-too-late.

Most if not all of Nintendo’s first-party efforts and many third-party titles on the Wii U will support gameplay through the gamepad alone, though some games, like ZombiU and Assassin’s Creed, curiously will not. Original Wii games have been reported to not be playable exclusively through the gamepad.  Still, it’s much more support than what Sony’s remote play currently offers.

Cost and Value

Nintendo recently revealed a $299.99-$349.99 price range for the Wii U, higher than that of any console the company has yet produced.

The gamepad, which is slated to sell in Japan for about $173 after currency conversion to USD, will be the most expensive first-party controller ever released in the console market.  Still, a Wifi-only PS Vita will set you back $250 at major retailers, and with the cheapest PS3 SKU available at $250, the total cost of a remote play setup, at least at face value, could run the consumer nearly $500.  That cost could drop to around $450 if, at this week’s Tokyo Game Show, Sony reveals the all-but-confirmed PS3 revision rumored to retail at $200.

If playing games remotely on a controller is your primary goal, then the Wii U, which can be had for as little as $300, is likely the more cost-effective option.  The more expensive PS Vita and PS3 combo benefits from Sony’s cross-buy initiative, where purchasing certain titles on the PS3 includes a download code for the PS Vita version of the game, and from the PS Vita’s own library of software and features – things the Wii U gamepad can’t replicate because it has no on-board processing hardware of its own.

The Future

Having just finished God of War through remote play on the Vita, I must say the feature afforded me an opportunity to play a game I probably wouldn’t have played otherwise. Not having to turn on the television – at least on the Vita, remote play lets the user power on their console remotely – is a huge incentive for those who, like myself, don’t always want, or have the time, to set everything up and enjoy games in the living room.  Such a feature is what Nintendo hopes will galvanize its target audience to give its brand a chance.

Studies have shown that the majority of users of mobile gaming devices, Android tablets, and the Apple iPad use their devices at home more than they do outside.  The Wii U pad may never take the place of a dedicated tablet as a premier mobile platform, but it may fulfill similar functions within the home, when near the Wii U console, at a fraction of the cost of a fully featured tablet.  With that said, Nintendo has to be diligent in explaining the difference to the average consumer, who may buy the console thinking the gamepad is a complete, portable gaming tablet.

Come launch day, Wii U owners will be able to play games remotely to a far greater extent than PS Vita owners can right now.  But a couple of things should be kept in mind as Sony looks to the future of the PS Vita and its next home console.  Firstly, the much-touted and presently underutilized 3G feature in appropriately equipped PS Vitas could potentially be used to access a PlayStation home console anywhere you have a good signal, so long as there is sufficient bandwidth and low latency – two considerations that admittedly pose a challenge with current cellular infrastructure but that can be resolved and improved in the future.

Secondly, Sony will likely include the remote play feature with the PS3’s successor, allowing the next generation of Playstation content to be played remotely.  Cost to the consumer will likely be even more prohibitive in such a combination but improved encoding and transmission algorithms and better wireless hardware could make for a better experience.   Not to mention the prospect of experiencing next generation graphics over a wireless connection.

Sony is currently in a precarious situation where they have the technology and the means to disrupt Nintendo’s primary selling point of its upcoming console but don’t want to brand the Vita as a cloud gaming device at a time when publishers are having trouble moving software on the system.

Ultimately, the Wii U is set to become the biggest and most comprehensive example of local wireless remote gameplay. It features a controller designed specifically to stream content as efficiently as possible and has the support of several high-profile developers and publishers to provide content truly worth experiencing.

But if Sony continues to invest in and expand its technology and software support, like it has these past few weeks with updates of two PS3 titles for use with remote play, it may sway consumers to experience cloud gaming using Playstation hardware.

And with the future of cloud gaming services like the beleaguered OnLive in purgatory, support for “local cloud gaming” from major console manufacturers couldn’t be more welcome.

By Max Neopikhanov

Several gaming news sites have recently released the first footage of NetherRealm Studios’ upcoming PS Vita port of Mortal Kombat.  The rather poor-looking footage shows off some of Mortal Kombat’s gameplay, including 150 new challenges and the various motion controls used in the game’s challenge tower mode.

At a glance, the demo featured graphics largely reminiscent of its console big brothers, albeit without persistent blood splatter or global lighting.  Upon closer inspection, particularly during the up-close-and-personal fatalities, Mortal Kombat on the Vita loses its semblance to the current-gen systems and instead appears more akin to the last generation and the Nintendo 3DS.

It’s expected that the PS Vita will have downgraded graphics compared to its bigger sibling, the PS3, considering it reportedly has half the pixel fill rate and a much slower, albeit still relatively speedy, CPU.  The PS3 widens the performance gap further with its ability to use the Cell processor for post-processing and shading effects – a capability that has allowed it to keep up with the Xbox 360, which has considerably more raw performance potential. Polygon and fill rate counts aside, the PS Vita should be able to support most if not all of the shaders and lighting techniques used on home consoles, by virtue of its now-industry-standard Shader Model 3 support and a multi-core CPU.

Why, then, is Mortal Kombat on the Vita using hardly any of these DX9 features?  Instead of the gorgeous skin shaders, complete with specular highlights and global lighting, of the console version, the Vita appears to be using fixed-function, non-programmable shaders similar to those used on the PS2, Nintendo Wii, and most recently the Nintendo 3DS.

Fixed-function shaders can look pretty good, even imitating the look of real programmable shaders, as demonstrated by Nintendo’s gorgeous Super Mario Galaxy on the Wii and Capcom’s Super Street Fighter IV on the 3DS.  The Vita doesn’t need to resort to them, though; just look at how great Uncharted: Golden Abyss looks.  Of course, that title was developed under Sony’s supervision with the aim of making the hardware look damn good, in effect putting pretty much every other demonstrated Vita game to shame aesthetically.

The problem of nearly nonexistent shaders in Mortal Kombat is further compounded by the much lower polygon count, lack of lighting effects or permanent blood, and much lower texture quality.  If anything positive can be said about the game’s visual presentation, it’s that the backgrounds have retained much of their quality from the home version; each stage is an animated vista with captivating atmosphere to boot. It’s just a shame that the primary focal point of a fighting game – the characters – hasn’t been fully realized.

With all said and done it is important to point out that…it isn’t all said and done. The game has a few months before release for NetherRealm Studios to tighten up the graphics engine and fix some of these visual shortcomings; though in all likelihood, and especially considering the game has been in development since 2010, the preview shown is fairly close to the retail product.

The most important thing is of course the gameplay. And judging from the demonstrated footage, Mortal Kombat sure plays the part, even if it doesn’t quite look it yet.

Image credit to IGN.com

Before the current generation of consoles there was no such thing as ‘DLC’ (downloadable content), at least not in the sense the term is used today.  Developers tried to give gamers the most out of their product through expansion packs, which added new ‘content’ to the core experience of a game.  Usually this equated to several new levels and an expanded or alternate storyline.  Expansion packs typically cost a half or a third of the original game’s price and, like DLC, required you to have the original installed.

Aside from the obvious key difference in delivery method (just try downloading 1GB of content on a 56k modem in 2000!), the quality of the content has dramatically changed.  Note I’m not referring to the few traditional expansion packs out there, such as those for the Warhammer: Dawn of War series, Dragon Age, or the many MMOs available.  The type of micro-DLC popularized by Microsoft and Sony on their home consoles has found its way into computer games over the past few years, with both good and bad effects – though honestly, mostly bad.

Cost of Content

The first problem with DLC on PCs is the cost-to-content ratio – the value of the content.  Microsoft charges gamers an average of 800 MS Points (about $10) for the majority of downloadable content, though some small DLC packs like extra weapons or character skins go for around 200-500 points.  $10-15 sometimes nets you a few multiplayer maps or single-player levels, but rarely adds a significant amount of gameplay or story.  Half-Life 2: Episode One and Episode Two, by contrast, offered several hours of new and expanded gameplay plus added insight into the series’ mythos. I would gladly pay $15 or $20 for 6-7 hours of quality content that expands a game’s story arc or adds significant multiplayer enhancements.  On the multiplayer front, Valve has given away an enormous amount of content for Team Fortress 2 (though it has also introduced one of the most egregious and overpriced microtransaction systems to date).  Sadly, some publishers have the nerve to charge $5-10 to play as a particular character or to unlock stages or extras already included on the disc.  Luckily this remains a mostly console-exclusive practice, as I have seen few PC publishers with the audacity to attempt something so low.

Delivery

The way content gets onto your computer has changed considerably thanks to much higher internet speeds, but is it really that much more convenient?  The problem is that most companies offering DLC deliver it through the less-than-stellar Games for Windows Live.  Having to buy MS Points while wrestling with the in-game GFWL application ranges from arduous to downright painful when something goes wrong.  Some publishers have easier methods involving CD keys – which work better in many cases – but there is still no easy, universal method.  Even Steam users who purchase a Games for Windows title must go through Microsoft’s app to get the DLC.

Much of the DLC available on consoles never makes it to the PC version of a game, especially if it isn’t a GFW title.  In some cases it simply takes a long time: the Minerva’s Den and Protector Trials DLC packs for BioShock 2 took more than a year to finally arrive on the PC, and for GTA IV, The Ballad of Gay Tony and The Lost and Damned also took over a year.  Obviously Microsoft and the other publishers do not see a big enough market for DLC on the PC front and have devoted their resources to making the most from consoles.  It’s a sad state of affairs, and with the way GFWL is going, I predict it won’t get better any time soon.  The one hope for easy DLC remains with Steam, if Valve can somehow convince publishers to avoid GFWL.

Digital delivery is the future of content whether we like it or not; hopefully publishers will push out meaningful content in a streamlined way in the near future.  There are always the modding communities for those of us PC gamers who don’t want to jump on the microtransaction bandwagon, though even that unfortunately has a questionable future given the small number of moddable games on the market now.


Microsoft released its highly anticipated PC version of Fable 3 a few days ago, and it’s largely good as far as action RPGs go, though not without problems.  PC gamers are notoriously critical of console ports, for good reason: more often than not they are quick cash-ins as opposed to full-featured PC experiences.  There is probably no worse an indicator of this than poor performance due to a lack of optimization and polish.  The good news for Fable fans is that the game can perform decently on most computers; the bad news is that unless you own a fairly recently built rig, you will probably need to do some tweaking to get the game running well while looking good.

The in-game graphical tweaks are fairly robust, though not as much as most PC gamers would like; various detail sliders control effects quality, texture quality, anisotropic filtering, model detail, terrain detail, shadow detail, and view distance.  The problem with these sliders is that they don’t tell you which graphical effects actually change at each slider level.

A quick look at the VideoOptionsConfig file in the game’s main directory with Wordpad reveals each individual setting and the slider level needed to switch an effect on or change its value. Setting any value above 5 in the effects section of the config will disable the effect completely.

Most of the game’s effects do not have a large impact on performance if you’re using a graphics card made in the past three years.  The two exceptions are draw distance, which is largely limited by your processor, and shadow quality, which even in its most basic form cuts the framerate nearly in half.

For a low-end configuration such as a laptop or a desktop with an older video card (GeForce 8800 GTS, Radeon 3850, etc.), the following values for the VideoOptionsConfig file should make the game look like its Xbox 360 counterpart but with higher-resolution shadows.  All effects are included except depth of field and temporal anti-aliasing, both of which make the game look like a horrid, blurry mess from the bygone era of awful ghosting on early LCD screens.

If any of the visual effects settings at the beginning of the config file (motion blur, bloom, SSAA, etc.) are desired, they can be enabled simply by lowering their respective values in the config file to below 5.  Any value above 5 will disable them.
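If you’d rather not hand-edit the XML in Wordpad, here’s a small sketch of my own (not part of the game or its tools) that flips one of these thresholds with Python’s standard ElementTree module; the element names follow the config shown below, but the file path is an assumption about a default install and may differ on your machine:

# Sketch: toggle a Fable 3 effect by editing VideoOptionsConfig.
import xml.etree.ElementTree as ET

CONFIG = r"C:\Program Files (x86)\Fable III\VideoOptionsConfig.xml"  # assumed path

def set_effect_threshold(name: str, value: int) -> None:
    # Per the guide: the effect activates at slider levels at or above this
    # value, and any value above 5 disables it outright.
    tree = ET.parse(CONFIG)
    node = tree.getroot().find(f"IntSettings/{name}")
    if node is None:
        raise KeyError(f"{name} not found in {CONFIG}")
    node.text = str(value)
    # Note: ElementTree drops XML comments on rewrite, so keep a backup copy.
    tree.write(CONFIG, encoding="utf-8", xml_declaration=True)

# Example: keep motion blur on, kill depth of field entirely.
set_effect_threshold("MotionBlurMinimumEffectsDetail", 2)
set_effect_threshold("DepthOfFieldMinimumEffectsDetail", 99)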

The shadow and view distance values have been tweaked to strike a good balance between visibility and playable frame rates. This config, coupled with a reasonable resolution and hardware, should allow the game to run at 30+ fps.

1) Copy these values to the VideoOptionsConfig file in the Fable 3 Directory:

<?xml version="1.0" encoding="utf-8"?>
<ConfigSettings>
<IntSettings>
<BloomMinimumEffectsDetail>2</BloomMinimumEffectsDetail>
<RadialBlurMinimumEffectsDetail>2</RadialBlurMinimumEffectsDetail>
<DepthOfFieldMinimumEffectsDetail>5</DepthOfFieldMinimumEffectsDetail>
<MotionBlurMinimumEffectsDetail>2</MotionBlurMinimumEffectsDetail>
<TemporalAAMinimumEffectsDetail>5</TemporalAAMinimumEffectsDetail>
<TemporalAAMinimumEffectsDetailMGPU>99</TemporalAAMinimumEffectsDetailMGPU>
<SSAAMinimumEffectDetail>5</SSAAMinimumEffectDetail>
<SaturationMaskMinimumEffectsDetail>2</SaturationMaskMinimumEffectsDetail>
<SpecularMinimumShaderDetail>1</SpecularMinimumShaderDetail>
<EnvMappingMinimumShaderDetail>5</EnvMappingMinimumShaderDetail>
<RainAndSnowMinimumEffectDetail>1</RainAndSnowMinimumEffectDetail>
<FresnelMinimumShaderDetail>2</FresnelMinimumShaderDetail>
<TieredOcclusionMaximumModelDetail>4</TieredOcclusionMaximumModelDetail>
</IntSettings>

<FloatSettings>
</FloatSettings>

<DetailSettings>
<!-- Shadow detail -->
<DirectionalShadowBufferSize>256,512,1024,2048,4096</DirectionalShadowBufferSize>
<SpotLightShadowBufferSize>256,512,1024,2048,4096</SpotLightShadowBufferSize>
<InnerShadowRangeMultiplier>0.6, 0.6, 0.6, 1.4, 1.6</InnerShadowRangeMultiplier>
<OuterShadowRangeMultiplier>0.6, 0.6, 0.6, 2.0, 2.5</OuterShadowRangeMultiplier>

<!-- Water detail -->
<ReflectionTextureResolution>0,256,512,1024,2048</ReflectionTextureResolution>
<ReflectionOceanWaterResolution>0, 0, 0, 64, 256</ReflectionOceanWaterResolution>
<ReflectionPatchBoundsClip>1, 1, 1, 1, 0</ReflectionPatchBoundsClip>
<ReflectionTreesFlag>0, 0, 0, 1, 1</ReflectionTreesFlag>

<!-- Tree detail -->
<TreeLODDistanceMultiplier>0.6, 1.0, 1.3, 1.6, 2.0</TreeLODDistanceMultiplier>
<TreeAnimationDetailLevel>0,1,2,2,2</TreeAnimationDetailLevel>
<TreeDrawDistanceMultiplier>0.9, 1.0, 1.2, 1.4, 1.6</TreeDrawDistanceMultiplier>

<!-- HeightField detail -->
<HeightFieldLodFactor>0.5, 0.5, 0.7, 1.3, 1.5</HeightFieldLodFactor>
<HeightFieldFadeFactor>0.5, 0.7, 1.5, 3.0, 4.0</HeightFieldFadeFactor>

<!-- Draw distance -->
<StaticEntityDrawDistanceMultiplier>1.0, 1.5, 3.0, 4.0, 6.0</StaticEntityDrawDistanceMultiplier>
<AnimatedEntityDrawDistanceMultiplier>1.0, 1.2, 3.0, 4.0, 6.0</AnimatedEntityDrawDistanceMultiplier>
<VillagerDrawDistanceMultiplier>1.0, 1.0, 2.0, 3.0, 4.0</VillagerDrawDistanceMultiplier>
<CreatureDrawDistanceMultiplier>1.0, 1.0, 2.0, 3.0, 4.0</CreatureDrawDistanceMultiplier>
<StaticMultipleMeshDrawDistanceMultiplier>1.0, 1.0, 2.0, 2.5, 3.0</StaticMultipleMeshDrawDistanceMultiplier>
<RepeatedMeshDrawDistanceMultiplier>0.0, 1.0, 2.0, 3.0, 4.0</RepeatedMeshDrawDistanceMultiplier>

<!-- Model detail -->
<LodScreenFractionMultiplier>1.0, 2.0, 3.0, 4.0, 5.0</LodScreenFractionMultiplier>
<BoneLODDistanceMultiplier>2.0, 2.0, 2.0, 3.0, 3.0</BoneLODDistanceMultiplier>

<!-- Texture anisotropy -->
<MeshTextureAnisotropy>2, 2, 4, 8, 16</MeshTextureAnisotropy>
<GroundTextureAnisotropy>2, 2, 4, 8, 16</GroundTextureAnisotropy>
<GroundNormalAnisotropy>2, 2, 4, 8, 16</GroundNormalAnisotropy>

<!-- Texture pool sizes -->
<TexturePool0>1024</TexturePool0>
<TexturePool1>1024</TexturePool1>
<TexturePool2>64,  96, 128</TexturePool2>
<TexturePool3>64,  96, 128</TexturePool3>
<TexturePool4>96, 160, 256</TexturePool4>
<TexturePool5>32,  80, 128</TexturePool5>
<TexturePool6>16,  96, 192</TexturePool6>
<TexturePool7>12,  48,  64</TexturePool7>
<TexturePool8> 6,  32,  96</TexturePool8>
<TexturePool9> 4,  24,  32</TexturePool9>

<!-- Benchmark scores for auto-calibration -->
<ModelDetailDefaultBenchmarkScoreCPU>0,2,3,4,5</ModelDetailDefaultBenchmarkScoreCPU>
<WaterDetailDefaultBenchmarkScoreCPU>0,3,4,6,10</WaterDetailDefaultBenchmarkScoreCPU>
<ShadowDetailDefaultBenchmarkScoreCPU>0,3,4,6,10</ShadowDetailDefaultBenchmarkScoreCPU>
<DrawDistanceDefaultBenchmarkScoreCPU>0,3,4,6,9</DrawDistanceDefaultBenchmarkScoreCPU>

<TextureDetailDefaultBenchmarkScoreGPU>0,2.5,3,4,6</TextureDetailDefaultBenchmarkScoreGPU>
<ModelDetailDefaultBenchmarkScoreGPU>0,2,3,4,6</ModelDetailDefaultBenchmarkScoreGPU>
<WaterDetailDefaultBenchmarkScoreGPU>0,3,5,8,14</WaterDetailDefaultBenchmarkScoreGPU>
<ShadowDetailDefaultBenchmarkScoreGPU>0,4,8,12,16</ShadowDetailDefaultBenchmarkScoreGPU>
<EffectsDetailDefaultBenchmarkScoreGPU>0,2,3,4,4</EffectsDetailDefaultBenchmarkScoreGPU>
<EnvironmentDetailDefaultBenchmarkScoreGPU>0,2,4,6,8</EnvironmentDetailDefaultBenchmarkScoreGPU>
<TreeDetailDefaultBenchmarkScoreGPU>0,2.5,4,6,8</TreeDetailDefaultBenchmarkScoreGPU>
<DrawDistanceDefaultBenchmarkScoreGPU>0,3,6,10,14</DrawDistanceDefaultBenchmarkScoreGPU>

</DetailSettings>

</ConfigSettings>

2) Set the Graphics to the following options (Set textures at high if your video card has less than 1GB of VRAM)

3) The game should look like this: 

This config was tested on an Acer 3820TG notebook with a Radeon 6650M and an Intel Core i5 CPU, and on a desktop with a Radeon 4850 and a Phenom II X4 processor.  V-sync was enabled on both machines and limits the framerate to 30fps.  If your computer is equipped with a more modern card like a GeForce GTX 480 or a Radeon 5870, it may be worth using D3DOverrider to force v-sync and run the game at a locked 60fps, though I found 30fps adequate for this type of action RPG.

A full review of the PC version of Fable 3 will be posted soon.

If you have any further tweaks for Fable 3 or suggestions for future tweak guides, post them in the comments section. 

This is the first article in an upcoming series analyzing the state of PC gaming in 2011.

Piracy, development costs, MMO dominance, eastern vs. western markets, the port syndrome, and digital distribution have all become common talking points as the downfalls or saviors of PC gaming.  The most important factor for both gamers and publishers/developers is undoubtedly cost, and nearly all of the discussions about the future of PC gaming stem from it.

The cost to develop and publish a product, as in all entertainment industries, must not exceed the revenue it brings in.  That’s simple economics.   What isn’t simple economics are the factors involved in collecting revenue in the current PC market.  Retail games have been in steady decline with the advent of digital distribution platforms such as Steam and Direct2Drive.  Production costs are minimized in the digital market, as is the shelf-space competition of traditional brick-and-mortar and, to a lesser extent, online warehouse retailers.

Quarterly sales aren’t quite as important in the digital market, since games there have a long shelf life. In addition, older titles can be (and are) discounted as part of sales.  The downside is the expectation of big 50-75%-off sales that gamers, myself included, have come to hold toward the big digital distribution platforms.  In the process of fueling sales and demand through temporarily slashed prices, publishers ‘train’ gamers to wait for big price cuts.  Big bundles of upwards of 20 games are offered for less than the cost of one recently released title at full price.  Several people I’ve spoken to have amassed huge collections – think 100+ games – primarily through such sales.

This digital sales strategy is not unlike the one found in the mobile games market on the iPhone and Android marketplaces.  Whereas $1-2 seems to be the sweet spot for mobile games and apps, Steam and Direct2Drive seem to move a lot of software in the $10-15 range during sales.  I have in fact bought games on sale for $5 (after a 75% discount) simply to add to my collection.  Some of them haven’t even seen the embrace of an installation and test run.

Statistics on digital distribution are unfortunately absent, as big companies like Valve (Steam) and Direct2Drive do not release such information to the public.  Still, comparing current retail sales with my findings, and with the nearly 2 million players online on Steam at any one time, profits, however small, are being made. In the wake of rampant piracy, publishers want every potential BitTorrent download to become a legitimate Steam purchase, even at the cost of marking the product down by 75%.

The other school of thought in the PC industry is quite the opposite: increase prices to make up for lower sales.  The three big publishers – EA, Activision Blizzard, and Ubisoft – have all launched products in flagship franchises at $60 a pop, both at retail and through digital download.  Call of Duty, Splinter Cell, Assassin’s Creed, StarCraft, and Crysis have been marked out, because of their success and popularity amongst PC gamers, as premium franchises.  These companies have taken extensive anti-piracy measures to protect their products, most of which have been met with ardent rejection and stark criticism from the PC community. Ubisoft’s attempt to keep gamers tethered to an internet connection in order to play several of its games was particularly unnerving, and served as one of the reasons I skipped the latest Splinter Cell and Assassin’s Creed games.

Somewhere between these two polar opposites of $60 juggernauts and $60 twenty-game collections, there must exist a happy medium where publishers and gamers can coexist happily.  Companies that have traditionally passed over PC gaming, such as Capcom, have offered more support over the past two years – and in Capcom’s case, promise to improve even more.  Microsoft has made the monumental decision to release its flagship role-playing game on Steam, a sign of its interest in the growth of the PC gaming industry as a whole and not just its Games for Windows digital distribution platform.

My one hope is that PC gaming doesn’t devolve into an industry of only AAA $60 blockbusters and $5 indie games.  Growing hardware sales and increased interest in developing markets should ensure that PC gaming never ‘dies’, as some have been quick to predict. Experiences like the ever-growing Team Fortress 2, Warhammer: Dawn of War II, StarCraft II, the upcoming The Witcher 2 (so many sequels!), and Star Wars: The Old Republic stand out for PC users. The PC market is an ever-evolving one, and with hundreds of millions of capable computers out there, it’s only a matter of coming up with new ways to harness that potential and make a good profit.

Next topic in the series: PC DLC (downloadable content)

While on my quest to find the ultimate handheld gaming device, I decided to go down a different avenue than the manufactured hardware most of us usually buy: I wanted a homemade “portablized” console.

Now, I’m certainly no slouch when it comes to electronics repair – sans soldering, of course – but one quick look at the guides posted on the portable console hacking site Benheck.com made me reconsider my abilities and dampened my prospects of building my own portable console.  Still intent on owning one, I found a respected modder and commissioned a portable GameCube for about $750.

Yes, that’s right, $750 big ones.  As shocking as it may sound, this is the standard figure for a good-looking portable GameCube.  Most of it goes to cover labor costs rather than the actual hardware, as a GameCube system can be found on eBay for around $30.  The other big contributors are the screen and case, the former usually an out-of-print PS1 LCD screen and the latter molded plastic or CNC (computer numerical control) cut plastic/MDF wood.  My particular ’cube was to be cut on a CNC machine – which can cost upwards of a thousand dollars or more to buy or build yourself.

I chose the GameCube not only because I’m partial to Nintendo games, but also because the PS2 slim is already fairly portable with a very nice attachable screen, an Xbox is simply impossible to shrink down, and the other older consoles are capable of being emulated on other handhelds.  Being a true Zelda fan at heart I couldn’t help but be excited at the prospect of playing Twilight Princess or Wind Waker on a portable.

Because of space constraints inside the unit and monetary constraints, I could only have a 2500mAh battery included, which roughly translates to about an hour of battery life.  Still, an hour of my favorite Zelda game on the go would be worth it; unfortunately, things didn’t work out that way, and in the world of portable consoles that is simply part of the deal.
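The back-of-the-envelope math behind that estimate, for the curious (the pack voltage and power draw here are my own guesses, not measured figures):

# Rough runtime estimate: watt-hours available divided by watts drawn.
capacity_ah = 2.5      # the 2500 mAh pack, in amp-hours
pack_voltage = 7.4     # assumed nominal voltage for a two-cell Li-ion pack
system_draw_w = 18.0   # guessed draw for a GameCube plus LCD, in watts

runtime_hours = (capacity_ah * pack_voltage) / system_draw_w
print(f"{runtime_hours:.1f} hours")  # ~1.0 hour, matching the estimate above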

I received the GameCube from overseas about a month after the project was commissioned.  It’s pretty incredible to see a full-fledged console squeezed into such a small form factor, and despite its size next to a Nintendo DS, I was extremely impressed. Aesthetically, the GameCube looked almost like a manufactured product, albeit a little rough around the edges.  Ergonomics aside, it felt great playing Twilight Princess for the little time I had it.  Unfortunately, the impression did not last: the unit immediately fell prone to constant disc read errors, and the battery refused to charge.  To add insult to injury, the memory card did not work, and one of the triggers wouldn’t register presses.  Cue the massive disappointment.

Having browsed the forums, I knew there could be a few issues somewhere down the road; what I didn’t know is that I would hit them so quickly.  Portablizing requires heavy modification of internal components, which increases their chance of failure in the process.  Compounded with hardware that has likely seen several years of use, these issues make it difficult to keep a portable GameCube running, and even more difficult to justify spending $750 – especially if you have no idea how to fix it yourself.

Some of the older hardware like the SNES or the N64 probably makes for better portables, considering they have fewer moving parts and thus fewer things to break.  It doesn’t hurt that parts for the older consoles are so easily acquired on eBay. So if you have the technical know-how to attempt making your own portable, owning one becomes much more practical, because you can maintain it should something go wrong.

Especially with something as advanced as a GameCube or PS2, I wouldn’t advise anyone to spend large amounts of money buying one, and even building one can become prohibitively expensive.  Sometimes it’s not even a matter of skill or quality of workmanship – although those can contribute greatly – so much as the simple fact that homemade hardware has a much greater chance of failure than your typical manufactured console.  Of course, Red-Ring-of-Death-afflicted Xbox 360s need not apply – although it’s worth noting that Microsoft will gladly fix your console or send you a replacement.  It’s much harder to do that when your homemade handheld console is one of a kind and needs to be shipped halfway across the world for repairs.

Ah well, there’s always the 3DS.

Thousands of men and women – and a few hopefully supervised children – braved the weather and fierce competition to get a Nintendo Wii at launch.  I was amongst such a crowd for over eight hours at the Nintendo World Store in New York City.  One year later – that’s right, a year later – the system was still sold out everywhere and going for upwards of $400-$500 on eBay.

While it’s certainly not Christmas time right now, the 3DS simply isn’t flying off the shelves.  It’s reasonable to assume that if supply merely exceeds demand, prices will remain fairly close to MSRP this near launch.  The thing is, brand new 3DS units are consistently going for $225-$230 including shipping.  That’s at buy-it-now prices; auctions can close even lower.

Now, granted, retailers make up the vast majority of 3DS sales, yet one can’t quite grasp how so many units are available at nearly $25 off MSRP.  Factor in sales tax and it’s closer to $40-50 off MSRP; factor in shipping costs plus eBay fees, and an eBay seller would pocket something like $210 per 3DS.   Sites like Craigslist also feature a plethora of people offering new 3DS units at sub-retail prices.  Is Nintendo selling the 3DS to retailers at a lower cost?  If so, why? Even game sales must be low, considering Amazon is offering a $10 coupon with the purchase of a 3DS game – excluding a few, such as Samurai Warriors, for some reason.
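To see roughly where that $210 figure comes from (the fee percentage here is my assumption, not eBay’s exact schedule at the time):

# Back-of-the-envelope seller proceeds on a buy-it-now 3DS sale.
sale_price = 228.0   # midpoint of the $225-$230 listings, shipping included
fee_rate = 0.09      # assumed combined eBay/PayPal cut of roughly 9%

net = sale_price * (1 - fee_rate)
print(f"${net:.2f}")  # ~$207 before the seller's own shipping cost,
                      # in the ballpark of the $210 figure above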

Somehow I feel that the 3DS might not be as much of a success as the original DS was, and in fact still is.

Having played video games since I was 7, I have fond memories of the PlayStation/N64/Saturn era, the PS2/GameCube/Xbox era, and Sega’s little console that unfortunately could not – a console stuck between generations, the Sega Dreamcast.  All of those consoles captivated me at one point or another, each having its moment in the spotlight and a few games worth remembering, until being replaced by a newer, faster successor.  Five years seemed to be the magic number for each new iteration to be released unto the public.  Sure, the Xbox 360 has served me well for the past six years, but I’m sure as hell glad to have my brand spanking new ‘NeXbox’.  Wait… what?

Yes, folks: if you would take the time to stop playing the latest Call of Duty installment (Black Ops as this article is being written), you’d see that you’ve been playing that trusty Xbox 360 since 2005.  Sure, Microsoft and Sony have both released updated ‘slim’ models of their consoles, but that’s not quite the same thing. Nintendo hasn’t made any revisions to the Wii and has instead chosen to focus on its 3DS in the hand-held market.

Sure, near-constant yearly updates of the Call of Madden, errr, ‘Duty’ series have kept PS3 and 360 owners content, but what about Wii owners?  Unless your idea of gaming involves burning calories or playing a few first-party titles – bones thrown to us gamers – you have probably switched to one of the other consoles.

Playstation creator and madman Ken ‘Kurtz’ Kutaragi claimed back in 2005 that the PlayStation 3 would have a 10-year lifespan, no doubt in an effort to entrench the PlayStation brand in the market and put a PlayStation in every home. (Oh the horror! The horror!) Kutaragi has since been replaced at the company, yet the mantra has remained. Sony is currently in third place in sales.

For all its billions in stock and assets, Microsoft has made egregious errors since launching the 360 back in 2005, chief amongst them the poor hardware design that resulted in overheating and the subsequent dreaded ‘Red Ring of Death’. Microsoft has since revised the 360 hardware, so it should be smooth sailing from here on out.  How long is that, you may ask?  Microsoft too has held the position that the console life-cycle could last much longer than previous generations, well into 2015.

In the mid-’90s, Sega tried to prolong the life of its popular Genesis system through the release of several add-ons and hardware revisions – the 32X and Sega CD being the most widely known.   Game support for the two expansions was not extensive, and they instead distracted Sega from working harder on its Saturn system. Nintendo released an expansion for its N64 dubbed the 64DD, yet quickly scrapped support in favor of focusing on its next system, the GameCube. Both Sega and Nintendo struggled for dominance in a market coming to be dominated by Sony and its PlayStation.

As history has shown, console hardware add-ons are usually not successful in the long term.  Microsoft is currently in second place behind Nintendo with 50 million consoles sold – more than twice as many as its previous Xbox system – yet is spending resources and focus on its new Kinect motion sensor add-on.  It’s a good product in the few demos I have tried, but not one that justifies holding off the next generation of consoles for another three or so years.

Sony has released an add-on for its system as well, albeit one not as expensive or expansive as Kinect: PlayStation Move.  It is an obvious imitation of the Wii’s motion controls, in an effort to prolong the system’s life and competitiveness against Nintendo’s juggernaut.

The real travesty comes from the leader of this generation, Nintendo.  As a recast and overclocked GameCube, the Wii barely qualified as a next-gen console, and yet despite all that it has sold the most as of February 2011. A look at the recent releases chart for the Wii sinks your heart and reminds you why the little white console has been collecting dust ever since you either beat Super Mario Galaxy 2 or lost 10 pounds with Wii Fit.

With falling prices for Sony and Microsoft consoles, the extended console cycle is not all that bad; more folks can afford the systems and enjoy the many games already released.  The Wii is not so fortunate in that regard.  No new hardware revisions and only a measly $50 price cut give core gamers little reason to support the system.

The industry and market are changing and perhaps the five year console cycle is a thing of the past.  Nintendo has an opportunity to reap their profits and catch the other two manufacturers unawares with a new console.  Waiting another 2-3 years will certainly spell their doom if their stagnating sales are anything to go by.    Change is a good thing.  I for one would like to play a Zelda game in 720p without having to rely on an emulator.

source used: http://www.gamepro.com/article/news/99528/sony-stays-the-course-on-its-ten-year-plan/

Ever since Microsoft proposed its Origami UMPC (ultra-mobile personal computer) initiative back in 2005, people were skeptical about the practicality and cost of such small computers.  The fears were largely realized, as the early devices were bulky, overpriced, underpowered, and lacked the battery life to get through more than a single commute.  Manufacturers tried to remedy the complaints, and certain devices received much-needed revisions – the mostly excellent Samsung Q1UP comes to mind.  Unfortunately, at an average of $1000-1500, the revised products came too late and cost too much.  With the release of Apple’s iPad, manufacturers have seemingly become galvanized by the prospect of a (re)emerging mini tablet market.  Note I did not say mini tablet “PC” market.

The seemingly bare and simple iOS powering Apple’s latest craze, the iPad, doesn’t look very appealing on paper – sure, thousands of apps exist on the App Store, yet very few offer the productivity and control found in traditional Windows and OS X operating systems.  What was considered exemplary in a handheld phone/music/video player seems less innovative and productive on a full-sized tablet.  Granted, software aside, there are many perks to running such an operating system – and Intel is certainly not too pleased.

One of the complaints many consumers had with early UMPCs was their poor battery life and the high cost of their – naturally – higher-spec parts.  Aside from Intel’s Core Solo line, the CPUs powering the devices were inadequate for anything beyond the most rudimentary Windows tasks, yet drained the battery very quickly.   This, compounded with an inadequate supply of system RAM, made the machines nearly incapable of running Microsoft’s newest OS at the time, Windows Vista – an operating system some manufacturers chose to ship with these devices.  The battery life issue was somewhat mitigated when Intel began shipping its Atom CPUs; unfortunately, many manufacturers had already begun to abandon the UMPC market to focus on netbooks.  Apple clearly paid attention to market trends, decided against competing in the netbook market, and instead chose to expand its iPod/iPhone product line to fill the void left by UMPCs.

Sales don’t lie, and at $500 the iPad sold (and is still selling!) like hotcakes.  Numerous imitators have since emerged – Samsung’s Galaxy Tab, Dell’s Streak, various Archos tablets – all running Google’s Android OS, and none with Intel chips.  Nvidia is poised to grab market share with its upcoming Tegra 2 platform, which pairs an ARM Cortex-A9 with a powerful integrated GPU.  We saw what the A9 is capable of last month at Sony’s NGP handheld video game system unveiling.  Granted, the NGP runs a quad-core A9, but the dual-core version should be no slouch, and with a reported power draw of 1.9 watts at peak operation it should provide plenty of battery life while still flexing its muscles.  The Android platform is also getting an update, with Android 3.0 looming around the corner.

Both Microsoft and Intel need to get their act together: improve user accessibility and reduce cost while maintaining the strengths of the aging x86 platform – compatibility and user control.  Entertainment and accessibility aside, x86 is still the software platform of choice for professionals on the go.  If current trends continue, we consumers will have to rely exclusively on an ‘app store’ for our software.  As iOS and Android continue to evolve that can change, but we must remember: choice is key in the computer industry, even in the ultra-mobile sector of the market.   I for one am not looking forward to the $.99 menu maze looming over the horizon should current trends completely and utterly dominate the entire mobile computing market.

Source Used:  http://siliconangle.com/blog/2011/02/07/android-spurs-tablet-os-innovation-motorola-xoom-has-high-hopes/

Having owned the first fat Game Boy back in its heyday, I can safely say that hand-held gaming has come a long way since those dark, monochrome, 8-bit days.   After a few minor revisions of Nintendo’s hand-held wonder, I was excited for some progress.  Sure, there was that brief stint with SEGA’s little black hand-held that could… but I never really owned one, nor could I afford the massive upkeep in batteries to keep that bad boy running.  The Atari Lynx, a technical marvel at the time, never garnered many developers or sales, eventually settling into history as a collector’s item.   My Pokemon addiction certainly did not help wean me off Nintendo’s allure of inexpensive portable heaven.  Like many young gamers of the Pokemon era, Nintendo had me hook, line, and sinker.  And then came the DS – a first-month purchase, no doubt – but something was different. Either I was growing long of tooth or short of imagination; something was missing.

In 2005 Sony met its naysayers head-on with the release of the PSP, an incredible device and one that has inadvertently shaped my views on what a hand-held gaming device is and what one could be.  In response to the lackluster games library, an enormous community of homebrewers and coders arose to bring emulators and other apps to a new audience.  PC gamers, myself included, had toyed with emulators and mods long before the PSP’s release, but now it was portable, fit-in-your-hand goodness for $200. Barring the inevitable and unfortunate side effect of piracy, the PSP is a gateway drug for not just hand-held gaming, but hand-held computing as well.  After some research, and with a few hundred dollars in pocket, I set out on a journey to rediscover portable gaming.

The first stop was the now-infamous Gizmondo.  Run almost literally into the ground by its mobster-connected CEO, the device had powerful hardware yet a limited library, and its homebrew community never quite got off the ground, mostly adopting the projects of other Windows-based devices.  Tapwave’s Zodiac, Gamepark’s GP32, and the later GP2X were other homebrew-centric devices that never caught my eye due to their lack of commercial games. Microsoft’s Ultra Mobile PC initiative, the Origami project, never attained the success Microsoft billed for it, but two devices stood out for their portability and gaming potential:  Sony’s UX series – in my case the UX180P – and Samsung’s Q1 Ultra.  Aside from playing many of the PC games released over the past 20 years, they can run emulators up through the 128-bit Dreamcast era, a feature surpassing the likes of the PSP and other dedicated homebrew devices. Having a fully featured version of Windows really helps.  Unfortunately, price is another issue altogether.

One of the most portable UMPCs, with a 4.5″ screen, the handheld UX series cost a fortune at $1500-$2500 depending on the model.  Luckily, prices of used models have since fallen to around $300-500 on auction sites.  Reasonably ergonomic and quite capable of running many PC games of the early-to-mid 2000s, the device remains one of the best UMPCs to this day.  Micro PC Talk is a community forum dedicated to the UX, featuring how-to guides and game/emulator compatibility lists.

Samsung’s Q1 series began its life unsuccessfully as one of the first UMPCs under the Origami project; bulky, difficult to hold, and short on battery life, the original was not a success.    Luckily, Samsung quickly revised the product and re-released it as the Q1 Ultra.  Still an imperfect device, it was a major improvement, featuring an analog nub and face buttons along with a BlackBerry-style keyboard.  Barring the top-end $2000 model, the ‘mainstream’ Q1 Ultra was not the powerhouse the Sony UX series was, but it was available at a much more reasonable $1000 – still an astronomically high figure for portable gamers, though it can now be found on auction sites for about $300 used.

Other UMPCs capable of portable gaming exist, and their makers all hoped the category would catch on with the public and eventually become mainstream.  Apple had other ideas, and it’s their innovations with the iPod, iPhone, and the recent iPad that really took the market, and gamers, by storm.

Nintendo has recently announced the price point of its 3DS software: $40-50.  This falls completely in line with its competitor’s – that is, Sony’s – pricing strategy.  Unfortunately, in the year 2011 things have changed, thanks to Apple – for better or for worse.  The App Store launched with the iPod touch and iPhone devices, and features much of its software library at $0.99.  Arguments about quality aside, the game has changed (pun intended). The devices themselves are no slouches either; with yearly revisions, the hardware has come to rival the PSP in performance. A homebrew community exists on Apple devices, provided they are jailbroken.  The lack of buttons can be off-putting but not deal-breaking, since many developers have optimized their onscreen controls to emulate real buttons as best as possible.  Still, the incredible portability and inexpensive (or often free) software has proven appealing and very lucrative for both gamers and would-be developers.

Worth a mention is the completely open-source, by-gamers-for-gamers Pandora hand-held gaming system.  Running innards similar to those of the iPod and iPhone, it sets out to be a successor to the GP2X and runs an open-source Linux distribution.

Stepping away from the mainstream portable gaming devices, one can find a plethora of gaming possibilities on the go – some requiring deep pockets, both figuratively and literally, others available cheap on the net.  There is a world beyond that shelf at one’s local GameStop.  It may not be for everyone, but whoever embarks on that journey to seek the Holy Grail of portable gaming goodness will find it immensely rewarding… most of the time, at least.  If all else fails, you could just make your own 🙂