Tuesday, May 31, 2011

Intel Core i3

Intel Core i3 microprocessor family

 The Core i3 line of entry-level Core-branded microprocessors was introduced on January 7, 2010 at the Consumer Electronics Show in Las Vegas. In performance and price these are mid-range CPUs, positioned between the more expensive and more powerful Core i5 and Core i7 microprocessors and the budget Pentium and Celeron families. Based on the Westmere (enhanced Nehalem) micro-architecture, Core i3 CPUs integrate a dual-channel DDR3 memory controller, a separate DMI interface to peripheral devices, and an HD-capable graphics controller, and incorporate all basic and some advanced micro-architecture features, such as a per-core 256 KB level 2 cache, a large level 3 cache shared between the two cores, SSE4 instructions, and support for Virtualization and Hyper-Threading technologies. As is common with entry-level and budget families, the Core i3 line omits some advanced features or ships with them disabled:

  • Currently (February 2010), the processors include only two CPU cores, as opposed to four cores in the more expensive Core i5 and Core i7 families.
  • Core i3 CPUs have Turbo Boost Technology disabled.
  • Advanced Encryption Standard (AES) instructions are not supported.
  • The processors do not support Virtualization for Directed I/O (VT-d) or Trusted Execution Technology.
The Intel Core i3 lineup currently consists of desktop and mobile Core i3 families. Desktop Core i3 microprocessors come in a 1156-land land grid array (LGA) package and require socket 1156 motherboards. Mobile Core i3 CPUs are manufactured in a 1288-ball BGA or 988-pin micro-PGA package. BGA processors are soldered directly onto motherboards, while PGA processors use socket 988.
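The packaging options above can be summarized in a small lookup table; this Python sketch only restates the text (the variant names are my own labels, not Intel terminology):

```python
# Packaging options for the Core i3 line, as described above (illustrative).
CORE_I3_PACKAGES = {
    "desktop":    {"package": "1156-land LGA",    "socket": "LGA 1156"},
    "mobile_bga": {"package": "1288-ball BGA",    "socket": None},  # soldered down
    "mobile_pga": {"package": "988-pin micro-PGA", "socket": "Socket 988"},
}

def needs_socket(variant: str) -> bool:
    """BGA parts are soldered directly to the board; only the others need a socket."""
    return CORE_I3_PACKAGES[variant]["socket"] is not None
```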

2nd Generation Intel® Core™ Processor Family for Desktops

Product Overview

Selecting the right processor is key when purchasing or upgrading your business PCs. The processor must keep pace with the trends in e-commerce, complex applications, and security that apply to your business. Meet your business needs with a processor from the 2nd gen Intel® Core™ processor family, which offers improved adaptive performance and built-in visual capabilities to bring more intelligent performance to your business PC.


The Intel® Core™ i3 processor provides the basis for an affordable PC. This dual-core processor with 4-way multitasking capability has built-in performance headroom for software upgrades, providing an excellent return on investment.
FEATURES AND BENEFITS
With 2nd generation Intel Core i3 processors, you get the following features built in:
·         Intel® HT Technology allows each core of your processor to work on two tasks at the same time.
·         Intel® Smart Cache is dynamically allocated to each processor core based on workload, which significantly reduces latency and improves performance.

Featured Technologies

Get more responsive multitasking with select processors in the 2nd gen Intel Core processor family, which feature Intel® Turbo Boost Technology 2.0 and Intel® Hyper-Threading Technology, enabling required security applications and protocols to run efficiently in the background without compromising your productivity. Information security is vital to business. Securing information and data requires complex encryption software, which slows PC performance. With Advanced Encryption Standard New Instructions (AES-NI) integrated into the processor, encryption and decryption operations are accelerated, saving you time.
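Whether a given chip actually exposes AES-NI can be checked from software. On Linux, the instruction set shows up as the aes flag in /proc/cpuinfo; below is a minimal sketch of such a check (the parsing is my own, not an Intel tool):

```python
def has_aes_ni(cpuinfo_text: str) -> bool:
    """Return True if a Linux /proc/cpuinfo dump lists the 'aes' CPU flag."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Flags appear as space-separated tokens after the colon.
            return "aes" in line.split(":", 1)[1].split()
    return False

# On a Linux machine you would feed it the real file:
#   has_aes_ni(open("/proc/cpuinfo").read())
```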

With the increase of video conferencing and social media, business communication has become more visually sophisticated. Intel® HD Graphics 2000, integrated into the 2nd gen Intel Core processors, allows PC hardware to stay in step with these new visual media needs. The need for a discrete graphics card is eliminated, reducing power consumption and system cost.

2nd gen Intel Core processors with Intel® vPro™ technology can help reduce costs and increase efficiency by taking advantage of intelligent performance and unique hardware-assisted security and manageability features. Remote, automated manageability features make PC maintenance easier and keep costs low. The system can be remotely configured or diagnosed, and an infected PC can be isolated or repaired. An Intel vPro technology-based system minimizes disruptions to daily business operations.

These 2nd gen Intel Core processors are also power efficient, enabling more energy-efficient platforms that can meet ENERGY STAR* and other global environmental requirements.

Intel’s technologies and innovations can increase the longevity of the computer, protecting your investment and supporting long-term business growth.

Intel® Core™ i7 Processor—Best-in-Class Performance

The Intel® Core™ i7 processor delivers best-in-class performance for the most demanding applications. This quad-core processor features Intel Turbo Boost Technology 2.0, 8-way multitasking capability, and additional L3 cache.


Intel® Core™ i5 Processor—The Next Level of Productivity


The Intel® Core™ i5 processor delivers the next level of productivity. With Intel Turbo Boost Technology 2.0, this quad-core processor with 4-way multitasking capability delivers extra speed whenever you need it, as well as security features to help protect information and data.

Intel® Core™ i3 Processor—Affordable Business PC

The Intel® Core™ i3 processor provides the basis for an affordable business PC. This dual-core processor with 4-way multitasking capability has built-in performance headroom for software upgrades, providing an excellent return on investment.

3 Step Guide To Overclock Your Core i3, i5, or i7 – Updated!


So many users are searching around the net these days looking for advice on how to overclock their new systems but don’t know where to start. To help everyone out, I decided a how-to guide was in order. Searching around forums can be confusing and intimidating. There are so many people willing to give advice, but who can you trust? It’s hard to know, and I’ve seen many users sent on wild goose chases because they are following advice that doesn’t solve or even address their specific problem. I’ve also seen too much trial-and-error overclocking. What I will attempt to do is create a very simple three-step guide to “one-size-fits-all” overclocking.
Methodology
My goal here is for this overclocking guide to be useful for anyone with a newer Intel-based system: i3, i5, or i7, LGA1156 or LGA1366. The same basic principles apply to all of them, and the basic process doesn’t change whether you plan to use your system as an everyday machine, for gaming, or to push the limits for a single benchmark.
This guide is also independent of your cooling system. Whether you are using the stock Intel cooler or pushing to the extreme with liquid nitrogen, the basic steps remain the same. One error that is far too common is a mistake in mounting the cooling system, specifically in the application of the thermal interface material (TIM). If you don’t have much experience mounting a cooling apparatus, please refer to this excellent guide from Arctic Silver.
Methods for determining a stable overclock are highly controversial, and my suggestion is that we agree to disagree. Everyone has their own definition of a stable system, but when I refer to “stable” in this guide, I mean the stability of your selected “stability test.” For a power user or gamer who wants a reliable system that won’t ever crash due to an overclock pushed too far, you’d need to test with a program that loads all of the cores and threads applicable to your CPU; OCCT and Prime95 are two popular choices. For a benching team member looking to squeeze every last MHz out of their chip for a 7-second SuperPI 1M run on liquid nitrogen cooling, SuperPI 1M would be the ideal test. In my examples below, my wording will obviously be geared more toward those running tests like Prime95. SuperPI 1M only takes a few seconds to complete, so when I say “run your stability test for five minutes,” you will obviously have to tailor that instruction to your individual situation.
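The idea of loading every core and thread can be sketched in a few lines of Python. This is only a toy illustration of what Prime95 or OCCT do far more rigorously; the workload and duration here are arbitrary, so do not treat it as a real stability test:

```python
import multiprocessing as mp
import time

def burn(seconds: float) -> int:
    """Spin on floating-point math until the deadline; return iterations done."""
    deadline = time.time() + seconds
    n, x = 0, 1.0001
    while time.time() < deadline:
        x = (x * x) % 1e9 + 1.0001  # keep the FPU busy with bounded values
        n += 1
    return n

def stress_all_cores(seconds: float = 300.0) -> list:
    """Run one worker per logical CPU, mimicking a full-load stability test."""
    workers = mp.cpu_count()
    with mp.Pool(workers) as pool:
        return pool.map(burn, [seconds] * workers)

if __name__ == "__main__":
    results = stress_all_cores(1.0)  # short demo run
    print(len(results), "workers finished")
```

A real stability run would also log temperatures and check results for computational errors, which is exactly what the dedicated tools add.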
With that in mind, we will attempt to isolate each portion of the system and overclock one piece at a time. This may seem time-consuming at first glance, but rest assured it can potentially save you hours of troubleshooting and frustration. So go slow, and follow each step very carefully.
Micro-architecture
The CPU micro-architecture has taken a huge leap from the 65 nm Core generation to the new 45 and 32 nm technology, bringing many changes not only to the CPUs but also to chipset and motherboard design and operation. This is what makes overclocking the i3/i5/i7 CPUs so different from their LGA 775 predecessors.
The naming convention can be a bit confusing, so let us look at the various CPUs and their names:
First we have the Nehalem family, which are all 45 nm CPUs. It includes the socket 1366 Bloomfield i7 (i7-920 to i7-975) and the socket 1156 Lynnfield i5/i7 (i5-750 to i7-860). These are all quad cores with HT, except for the i5, which has no HT.
The next family is Westmere, which is essentially a 32 nm die-shrink of Nehalem. Here you have Clarkdale (socket 1156) in i5 and i3 flavors, both dual-core processors with HT.
Gulftown, the hex-core CPU that is not available yet, is also part of the Westmere family; it features six physical cores with HT and will be socket 1366 only.
The above are all desktop chips. Then you get the Arrandale and Clarksfield CPUs, which are mobile processors, and Gainestown, which is the server equivalent of Bloomfield.
So, to summarize, we have sockets LGA 1366 and LGA 1156, which are essentially the board platforms that carry certain 45 and 32 nm CPU variants. Both platforms use DDR3, with 1156 being dual-channel only.
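The codename summary boils down to a small table. The mapping below simply restates the paragraphs above (values per the text; illustrative, not exhaustive):

```python
# Codename -> (process node, platform), per the summary above.
FAMILIES = {
    "Bloomfield":  ("45 nm", "LGA 1366"),   # desktop quad-core, HT
    "Lynnfield":   ("45 nm", "LGA 1156"),   # desktop quad-core (i5 lacks HT)
    "Clarkdale":   ("32 nm", "LGA 1156"),   # desktop dual-core, HT
    "Gulftown":    ("32 nm", "LGA 1366"),   # hex-core, HT (upcoming)
    "Arrandale":   ("32 nm", "mobile"),
    "Clarksfield": ("45 nm", "mobile"),
    "Gainestown":  ("45 nm", "server"),     # server equivalent of Bloomfield
}

def platform_of(codename: str) -> str:
    """Look up which board platform carries a given CPU codename."""
    return FAMILIES[codename][1]
```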

Cloud Computing

In the last few years, Information Technology (IT) has embarked on a new paradigm — cloud computing. Although cloud computing is only a different way to deliver computer resources, rather than a new technology, it has sparked a revolution in the way organizations provide information and service.
Originally IT was dominated by mainframe computing. This sturdy configuration eventually gave way to the client-server model. Contemporary IT is increasingly a function of mobile technology, pervasive or ubiquitous computing, and of course, cloud computing. But this revolution, like every revolution, contains components of the past from which it evolved.
Thus, to put cloud computing in the proper context, keep in mind that cloud computing carries in its DNA the essence of its predecessor systems. In many ways, this momentous change is a matter of "back to the future" rather than the definitive end of the past. In the brave new world of cloud computing, there is room for innovative collaboration with cloud technology and for the proven utility of predecessor systems, such as the powerful mainframe. This veritable change in how we compute provides immense opportunities for IT personnel to take the reins of change and use them to their individual and institutional advantage.

Cloud computing is a comprehensive solution that delivers IT as a service. It is an Internet-based computing solution where shared resources are provided like electricity distributed on the electrical grid. Computers in the cloud are configured to work together and the various applications use the collective computing power as if they are running on a single system.
The flexibility of cloud computing is a function of the allocation of resources on demand. This facilitates the use of the system's cumulative resources, negating the need to assign specific hardware to a task. Before cloud computing, websites and server-based applications were executed on a specific system. With the advent of cloud computing, resources are used as an aggregated virtual computer. This amalgamated configuration provides an environment where applications execute independently without regard for any particular configuration.
There are valid and significant business and IT reasons for the cloud computing paradigm shift. The fundamentals of outsourcing as a solution apply.
  • Reduced cost: Cloud computing can reduce both capital expense (CapEx) and operating expense (OpEx) costs because resources are only acquired when needed and are only paid for when used.
  • Refined usage of personnel: Using cloud computing frees valuable personnel allowing them to focus on delivering value rather than maintaining hardware and software.
  • Robust scalability: Cloud computing allows for immediate scaling, either up or down, at any time without long-term commitment.
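The pay-for-what-you-use argument in the list above can be made concrete with a toy break-even calculation. All the prices below are made-up assumptions for illustration, not real vendor rates:

```python
def breakeven_hours(server_capex: float, cloud_rate_per_hour: float) -> float:
    """Hours of usage at which renting cloud capacity costs as much as
    buying a dedicated server outright (ignoring OpEx for simplicity)."""
    return server_capex / cloud_rate_per_hour

# Hypothetical numbers: a $3,000 server vs. a $0.25/hour cloud instance.
hours = breakeven_hours(3000.0, 0.25)
print(hours)  # 12000.0 -> below this utilization, renting is cheaper
```

The real comparison would also fold in power, cooling, and administration costs, which the text notes usually tilt the balance further toward the cloud for intermittent workloads.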
The cloud computing model comprises a front end and a back end. These two elements are connected through a network, in most cases the Internet. The front end is the vehicle by which the user interacts with the system; the back end is the cloud itself. The front end is composed of a client computer, or the computer network of an enterprise, and the applications used to access the cloud. The back end provides the applications, computers, servers, and data storage that create the cloud of services.
The cloud concept is built on layers, each providing a distinct level of functionality. This stratification of the cloud's components has provided a means for the layers of cloud computing to become a commodity just like electricity, telephone service, or natural gas. The commodity that cloud computing sells is computing power, at a lower cost and expense to the user. Cloud computing is poised to become the next mega-utility service.
The virtual machine monitor (VMM) provides the means for simultaneous use of cloud facilities (see Figure). VMM is a program on a host system that lets one computer support multiple, identical execution environments. From the user's point of view, the system is a self-contained computer which is isolated from other users. In reality, every user is being served by the same machine. A virtual machine is one operating system (OS) that is being managed by an underlying control program allowing it to appear to be multiple operating systems. In cloud computing, VMM allows users to monitor and thus manage aspects of the process such as data access, data storage, encryption, addressing, topology, and workload movement.


Figure: How the virtual machine monitor (VMM) works

These are the layers the cloud provides:
  • The infrastructure layer is the foundation of the cloud. It consists of the physical assets — servers, network devices, storage disks, etc. Infrastructure as a Service (IaaS) has providers such as the IBM Cloud. Using IaaS you don’t actually control the underlying infrastructure, but you do have control of the operating systems, storage, deployment applications, and, to a limited degree, control over select networking components.
  • Print On Demand (POD) services are an example of organizations that can benefit from IaaS. The POD model is based on the selling of customizable products. PODs allow individuals to open shops and sell designs on products. Shopkeepers can upload as many or as few designs as they can create. Many upload thousands. With cloud storage capabilities, a POD can provide unlimited storage space.
  • The middle layer is the platform. It provides the application infrastructure. Platform as a Service (PaaS) provides access to operating systems and associated services. It provides a way to deploy applications to the cloud using programming languages and tools supported by the provider. You do not have to manage or control the underlying infrastructure, but you do have control over the deployed applications and, to some degree over application hosting environment configurations.
  • PaaS has providers such as Google App Engine. The small entrepreneurial software house is an ideal enterprise for PaaS. With the elaborated platform, world-class products can be created without the overhead of in-house production.
  • The top layer is the application layer, the layer most people visualize as the cloud. Applications run here and are provided on demand to users. Software as a Service (SaaS) has providers such as Google Pack. Google Pack includes Internet-accessible applications and tools such as Calendar, Gmail, Google Talk, Docs, and many more.
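The division of control across the three layers can be captured in a small data structure; this sketch simply restates the descriptions above:

```python
# What the service consumer controls at each layer, per the text above.
CONSUMER_CONTROLS = {
    "IaaS": ["operating systems", "storage", "deployed applications",
             "select networking components (to a limited degree)"],
    "PaaS": ["deployed applications",
             "application hosting environment configuration (to some degree)"],
    "SaaS": ["use of the provided applications only"],
}

def managed_by_provider(layer: str) -> bool:
    """At every layer, the underlying infrastructure is the provider's job."""
    return layer in CONSUMER_CONTROLS
```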

Figure: Cloud computing layers embedded in the "as a Service" components
Cloud formations
There are three types of cloud formations: private (on premise), public, and hybrid.
  • Public clouds are available to the general public or a large industry group and are owned and provisioned by an organization selling cloud services. A public cloud is what is thought of as the cloud in the usual sense; that is, resources dynamically provisioned over the Internet using web applications from an off-site third-party provider that supplies shared resources and bills on a utility computing basis.
  • Private clouds exist within your company's firewall and are managed by your organization. They are cloud services you create and control within your enterprise. Private clouds offer many of the same benefits as the public clouds — the major distinction being that your organization is in charge of setting up and maintaining the cloud.
  • Hybrid clouds are a combination of the public and the private cloud using services that are in both the public and private space. Management responsibilities are divided between the public cloud provider and the business itself. Using a hybrid cloud, organizations can determine the objectives and requirements of the services to be created and obtain them based on the most suitable alternative.

Let us consider the likelihood that management and administration will require greater automation, changing the tasks of the personnel responsible for scripting as code production grows. IT may be consolidating, with a need for less hardware and software implementation, but it is also creating new formations. The shift in IT is toward the knowledge worker. In the new paradigm, technical human assets will have greater responsibility for enhancing and upgrading general business processes.

The growing use of mobile devices, the popularity of social networking, and other aspects of the evolution of commercial IT processes and systems, will guarantee work for the developer community; however, some of the traditional roles of development personnel will be shifted away from the enterprise's developers due to the systemic and systematic processes of the cloud configuration model.
A recent IBM survey, "New developerWorks survey shows dominance of cloud computing and mobile application development," demonstrated that the demand for mobile technology will grow exponentially. This development, along with the rapid acceptance of cloud computing across the globe, will necessitate a radical increase in developers who understand this area. To meet the growing needs of mobile connectivity, more developers will be required who understand how cloud computing works.
Cloud computing provides an almost unlimited capacity, eliminating scalability concerns. Cloud computing gives developers access to software and hardware assets that most small and mid-sized enterprises could not afford. Developers, using Internet-driven cloud computing and the assets that are a consequence of this configuration, will have access to resources that most could have only dreamed of in the recent past.

Administrators are the guardians and legislators of an IT system. They are responsible for controlling user access to the network. This means overseeing the creation of user passwords and the formulation of rules and procedures for such fundamental functionality as general access to the system assets. The advent of cloud computing will necessitate adjustments to this process, since the administrator in such an environment is no longer concerned merely with internal matters, but also with the external relationship between the enterprise and the cloud computing concern, as well as the actions of other tenants in a public cloud.
This alters the role of the firewall constructs put in place by the administration and the nature of the enterprise's general security procedures. It does not negate the need for a guardian of the system. With cloud computing comes even greater responsibility, not less. Under cloud computing, administrators must not only secure data and systems internal to the organization; they must also monitor and manage the cloud to ensure the safety of their system and data everywhere.
The function of the architect is the effective modeling of the given system's functionality in the real IT world. The architect's basic responsibility is developing the architectural framework of the agency's cloud computing model. The architecture of cloud computing is essentially an abstraction of the three layer constructs, IaaS, PaaS, and SaaS, arranged so that the enterprise deploying the cloud computing approach meets its stated goals and objectives. The abstraction of the layers' functionality is developed so that decision-makers and foot soldiers alike can use it to plan, execute, and evaluate the efficacy of the IT system's procedures and processes.
The role of the architect in the age of cloud computing is to conceive and model a functional interaction of the cloud's layers. The architect must use the abstraction as a means to ensure that IT is playing its proper role in the attainment of organizational objectives.
The main concerns voiced by those moving to the cloud are security and privacy. The companies supplying cloud computing services know this and understand that without reliable security, their businesses will collapse. So security and privacy are high priorities for all cloud computing entities.
Governance is the primary responsibility of the owner of a private cloud and the shared responsibility of the service provider and service consumer in the public cloud. However, given elements such as transnational terrorism, denial of service, viruses, worms and the like — which do or could have aspects beyond the control of either the private cloud owner or public cloud service provider and service consumer — there is a need for some kind of broader collaboration, particularly on the global, regional, and national levels. Of course, this collaboration has to be instituted in a manner that will not dilute or otherwise harm the control of the owner of the process or subscribers in the case of the public cloud.
If you are going to adopt the cloud framework, bandwidth and the potential bandwidth bottleneck must be evaluated in your strategy. In the CIO.com article: The Skinny Straw: Cloud Computing's Bottleneck and How to Address It, the following statement is made:
Virtualization implementers found that the key bottleneck to virtual machine density is memory capacity; now there's a whole new slew of servers coming out with much larger memory footprints, removing memory as a system bottleneck. Cloud computing negates that bottleneck by removing the issue of machine density from the equation—sorting that out becomes the responsibility of the cloud provider, freeing the cloud user from worrying about it.
For cloud computing, bandwidth to and from the cloud provider is a bottleneck.
So what is the best current solution to the bandwidth issue? In today's market, the best answer is the blade server. A blade server is a server that has been optimized to minimize the use of physical space and energy. One of the huge advantages of the blade server for cloud computing use is improved bandwidth speed. For example, the IBM BladeCenter is designed to accelerate high-performance computing workloads both quickly and efficiently. Just as the memory issue had to be overcome to alleviate the bottleneck of high virtual machine density, the bottleneck of cloud computing bandwidth must also be overcome, so look to the capabilities of your provider to determine whether the bandwidth bottleneck will be a major performance issue.
Because a sizable proportion of the cost in IT operations comes from administrative and management functions, the implicit automation of some of these functions will per se cut costs in a cloud computing environment. Automation can reduce the error factor and the cost of the redundancy of manual repetition significantly.
There are other contributors to financial problems such as the cost of maintaining physical facilities, electrical power usage, cooling systems, and of course administration and management factors. As you can see, bandwidth is not alone, by any means.
Consider these possible risks:
  • Adverse impact of mishandling of data.
  • Unwarranted service charges.
  • Financial or legal problems of vendor.
  • Vendor operational problems or shutdowns.
  • Data recovery and confidentiality problems.
  • General security concerns.
  • Systems attacks by external forces.
With the use of systems in the cloud, there is the ever present risk of data security, connectivity, and malicious actions interfering with the computing processes. However, with a carefully thought out plan and methodology of selecting the service provider, and an astute perspective on general risk management, most companies can safely leverage this technology.
In this revolutionary new era, cloud computing can provide organizations with the means and methods needed to ensure financial stability and high quality service. Of course, there must be global cooperation if the cloud computing process is to attain optimal security and general operational standards. With the advent of cloud computing it is imperative for us all to be ready for the revolution.


Monday, May 30, 2011

Microsoft Windows 8 Leaked



We know Microsoft is working on the Windows 8 operating system and is planning to release it in 2012. Word on the internet has it that the latest version of the Windows OS installs faster than the current version and offers a couple of interesting features.
Windows 8 has a "factory restore" feature which returns the computer to its factory state in under 2 minutes. So in case something goes horribly wrong and you cannot even recover the system to a previously known good state, the factory restore feature will save your day. And no, you're not required to insert a disc.
The upcoming OS also has an app store and lets users log in using a Windows Live ID.
The "source" also reports that he was able to install the Windows 8 OS in just 8 minutes, approximately 2.5x faster than Windows 7. It is another matter that his computer has an 8-core processor, 24 GB of memory, and a 2 TB hard drive. So normal users will not be able to install Windows 8 in a flat 8 minutes, but it should certainly take less time compared to a Windows 7 install. Windows 8 is due in 2012.
Microsoft Looks to Apple
Included in these presentations is a rather telling (but obvious) slide which shows that Microsoft is clearly paying attention to Apple while planning Windows 8. In a slide titled "How Apple does it: A virtuous cycle," Microsoft has broken down Apple's UX/brand-loyalty cycle and cited its value. Though it's fairly obvious, the takeaway here is that Microsoft is aiming to give Windows the very same "it just works" status that Apple's products are known for:

Windows 8 Prototype Machine
Speaking of Apple, I think the following prototype looks like some rejected Mac prototype (i.e. I don’t like it very much — at least from this angle). The wallpaper is the old Windows 7 beta wallpaper (as you can see by the beta fish in the center of it) and there is clearly some build information on the bottom right-hand corner of the desktop. This may well be something left over from Windows 7 planning, but being included in Windows 8 planning documentation, I figured it was worth tossing in. Here’s the machine and below it, its specifications:


Windows 8 Product Cycle
The following slide isn’t too telling in and of itself, but it serves to show how Microsoft has chosen to divide its Windows 8 product cycle into three main phases:
·         Planning (from Framing to Vision): Big picture thinking, themes then scenarios, and feature identification list.
·         Development (from Vision to Beta): Design and build features, refine SKUs (stock-keeping units) and value propositions, and begin sharing code.
·         Readiness (from Beta to GA+90): Feature complete and bug-fixing, establish and track readiness metrics, and focus on creating great Dell + Windows experiences.
Of note, these slides were apparently leaked or inadvertently released after being given to one Derek Goode at HP. Likewise, many of the discussions throughout the slides address HP, so the third phase above making reference to Dell interests me. Anyway, here is the slide of note:

Windows 8: Identity Evolved
There appears to be considerable planning taking place as to how a user will access Windows. Right off the bat, one of my favorites is the following prototype which shows a user logging in via facial recognition! Basically, you enroll your face, then all you should have to do from that point forward is sit down, have your webcam get a look at you and then log you in based on facial recognition:

The following slide details other considerations for Windows 8 where identity is concerned. Namely, user accounts will still be the primary method of accessing Windows for individuals, fast user switching is a continued focus, and most notably, Windows accounts could be connected to the cloud which would allow for roaming settings/preferences between PCs and devices and PCs to log on to websites on the user’s behalf — all marking an evolution of Windows identity from being machine-centric to user-centric.

Trends Shaping the Planning of Windows 8
Shaping the planning of Windows 8 are the explosion of form factors (laptops, netbooks, slates, etc.), assumed connectivity (a focus on software plus services for end-user scenarios), the collision of enterprise and personal worlds (aiming to help customers have a seamless experience across their personal and professional lives), the personal content experience, and more. The following slide elaborates:
Windows 8 Consumer Target Audiences
As we see detailed in the slide below, enthusiasts and mainstream consumers are the two main consumer target audiences for Windows 8:
Windows 8 Default Business Assumptions
 Windows 8 Developer Market
No surprise here that Microsoft’s addressable developer market for Windows 8 spans from hobbyist/non-professional developers to professional developers to science, technology, engineering, and math developers:
Windows 8 Differentiation Goals
As for form factors, Microsoft's three main focuses for Windows 8 appear to be Slate, Laptop, and All-in-One (all detailed in slides below). Additionally, customization areas include Applications, Devices, Multimedia, Help and Support, and UI and Theming (also all detailed in slides below). One of the key takeaways here is Microsoft detailing "Slate" as a major form factor focus. This means Windows-based Slate devices are still likely to make an appearance at some point:

Windows 8: Energy Efficiency Areas of Focus
All of the following slides are highly-detailed and quite self-explanatory, but my favorite takeaway is a newly-planned feature that combines Logoff + Hibernate to result in a new off state. It will apparently give the look and feel of boot/shutdown but will be much faster. This feature is detailed in the next set of slides, titled “Windows 8: Fast Startup:”

Windows 8: Fast Startup

These slides give great detail on the Logoff + Hibernate feature mentioned above. There isn’t a name for the feature yet as it will be exposed to the user:

Windows 8 Help and Support

Help and Support was going to be a major focus in Windows 7, but it was dropped. Now, it looks like that focus is back for Windows 8 and the aim is to help users “know with confidence how to respond and what actions to take” when an issue arises.

Windows 8 Push Button Reset

This is an interesting one. Microsoft appears to be planning functionality for a reset button that will essentially reinstall Windows while maintaining all of your personal files, applications, settings, etc. without the need for the user to back all of that stuff up. A scenario is presented in one of these slides to demonstrate how it would work.

1 – Jon notices that his Windows 8 PC is starting to perform poorly and he can’t figure out what to do. He presses the reset button and chooses to reset his Windows 8 PC.

2 – Wanting a fresh start, he chooses to reset his PC knowing that all his stuff is safe.

3 – Windows 8 automatically retains files and personalization settings, and migrates the user accounts.
4 – Windows is restored to the factory image and restarts.

5 – After restarting, Jon can launch the App Store to reinstall applications he purchased there and see a list of other applications that he had installed outside of the App Store.

Internet Explorer 9

There’s an entire slide deck dedicated to Internet Explorer 9 discussion. There isn’t much contained within that we don’t already know, but there is an announcement for the beta to take place in August 2010:

Windows 8 Introduces “Windows Store,” Microsoft’s App Store for Windows

Though Microsoft has already attempted some semblance of this with Windows Marketplace, Windows 8 will introduce “Windows Store.” Yes, it will be an application store which will allow you to purchase applications for Windows (and perhaps Microsoft mobile devices as well, such as Windows Phone, Zune HD, etc.). Microsoft has a solid foundation on this and as the slides below note, they’re anxious to bring this to fruition A.S.A.P. Detailed below are the customer experience, developer experience, and channel experience (for partners). For customers, it looks like Microsoft is interested in integrating Windows Store results into Bing’s Web and Local SERPs as well as Windows Search. For developers, there appears to be a great panel for tracking just about every metric you could want to track as a developer.

In conclusion, Microsoft defines Windows Store's success for consumers as "getting applications they want, that they can feel confident in, that they can use on any Windows 8 device." Have a look, as there is much more information in these slides that I did not elaborate on. Oh, and be sure to keep your eyes on http://www.windowsstore.com/ as it is mentioned in one of the slides below and is indeed currently registered (as are the .net and .org URLs).
