Monday, April 25, 2011

Latest Walking Robots World | Popular Walking Robots In The World | Images








Latest Robot World | Latest Chinese Robo Beauty Images | Talking Robots

 China's Latest Cute Robots





These robots are a symbol of China's latest technology and show what its engineers are capable of.

Robo World | Latest Robots | ATHLETE: NASA's Six-Limbed Robot Prototype to Support Human Exploration on the Moon






This is something big from NASA, really big. A joint project between NASA's JPL, Stanford University and the Boeing Company, the prototype is being tested to check whether it can meet the milestone of traveling at least 40 kilometers (25 miles) over 14 days under its own power. The robot is not fast, moving at a top speed of 2 kilometers per hour.
 

Engineers at Jet Propulsion Laboratory (JPL) are putting this All-Terrain, Hex-Limbed, Extra-Terrestrial Explorer (ATHLETE) under a series of long-drive tests on long dirt roads. ATHLETE is a half-scale working prototype of NASA’s actual robot under development to transport habitats and other cargo on the surface of the Moon or Mars.

Monday, January 24, 2011

New Prices of i7 Processors in India | Now Cheaper




Intel Core i7 processors are now available at your nearest dealers at mouth-watering prices, for under 15k INR in India.
Best price: i7 920 at 12k (INR)
Best price: i7 930 at 14k (INR)

Add 5% VAT to these prices, then go and grab one and make your PC super fast with the new Intel Core i7 processors.
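
For clarity, here is a quick, hedged sketch of what those street prices work out to once the 5% VAT mentioned above is added; the figures are the ones quoted in this post, not official pricing.

    # Add 5% VAT to the quoted street prices (illustrative, not official).
    VAT_RATE = 0.05
    prices_inr = {"Core i7 920": 12000, "Core i7 930": 14000}

    for chip, base in prices_inr.items():
        total = base * (1 + VAT_RATE)
        print(f"{chip}: Rs {base:,} + 5% VAT = Rs {total:,.0f}")
    # Core i7 920: Rs 12,000 + 5% VAT = Rs 12,600
    # Core i7 930: Rs 14,000 + 5% VAT = Rs 14,700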

If you recently bought an Intel Core i7 950 processor and shelled out the full 20K-25K asking price, prepare for regret. It turns out the chatter was true: the high-end chip is now selling for 48% less, at $294. The move essentially renders both the Core i7 920 and 930 obsolete as well, since those chips are selling for roughly the same price.

Old price of i7
Intel Core i7 920 Processor Price – Rs. 15,000/-

* Speed: 2.66 GHz
* Core: Nehalem
* No. of cores: Quad-core
* L2 Cache: 4 x 256 KB
* L3 Cache: 8 MB
* Process technology: 45 nm

Intel Core i7 940 Processor Price - Rs. 30,500/-

* Speed: 2.66 GHz
* Core: Nehalem
* No. of cores: Quad-core
* L2 Cache: 4 x 256 KB
* L3 Cache: 8 MB
* Process technology: 45 nm

Intel Core i7 Extreme Edition 965 Processor Price – Rs. 53,000/-

* Speed: 3.2 GHz
* Core: Nehalem
* No. of cores: Quad-core
* L2 Cache: 4 x 256 KB
* L3 Cache: 8 MB
* Process technology: 45 nm

Acer Laptop Price List – 2011

List of Acer Laptops with latest price and review

Listed here are the latest Acer laptops with the cheapest prices available in India. As we reach the end of 2010, Acer continues to bring us the latest laptop configurations at very reasonable prices. In India, Acer is becoming more popular every day, which is highly visible to Indian customers looking to buy a new laptop. We have tried to list the latest and most widely available Acer laptop models with current market prices, which should help you find the best Acer laptop prices at your local computer shop or from online shopping websites. If you have any questions about a particular laptop model, let us know; we will send the best suggestion we can to your email ID.


Acer Aspire 4741z – Windows 7


Rs 29750

If you are looking for a good-quality laptop with Windows 7 pre-installed and priced at less than Rs 30,000, this laptop from Acer is the right choice for you. This lightweight, thin laptop comes not only with Windows 7 but also with all the necessary communication technologies built in. The same Acer Aspire 4741z is also available with a Linux operating system, but we recommend the Windows 7 version to our customers. The reason is that, unless you are very familiar with Linux, you will definitely need to look for an alternative operating system, in which case you will have to go for Windows XP or Windows 7. Officially there are no Windows XP drivers available for this laptop, so the choice narrows down to Windows 7, which …



Acer Aspire 5741z – Windows 7


Rs 31700


This Acer Aspire 5741z comes with Windows 7 pre-installed and is powered by a Pentium Dual-Core processor. The 15-inch (actually 15.6-inch) HD screen is good for watching HD movies, and the 3 GB of DDR3 RAM provides smooth operation. All the latest ports are available: HDMI, VGA, audio in/out, USB 2.0 ports and a multi-card reader. The 320 GB hard disk drive and DVD writer are good enough for a normal computer user.



Acer Aspire 4745 – i3 370M – Windows 7

Rs 31700

This is a 14-inch laptop from Acer, available in India with the Intel Core i3-370M processor. The Acer Aspire 4745 is sold in two configurations: the other one, also with Windows 7 pre-installed, comes with the Core i3-350M processor clocked at 2.26 GHz, while the Core i3-370M runs at 2.40 GHz. There are no other major differences between the two processors, and the price difference is also small. This Acer Aspire 4745 (i3-370M) comes with 2 GB of DDR3 RAM, whereas the version with the Core i3-350M (2.26 GHz) ships with 3 GB of DDR3 RAM. The price difference between the two models in the market is only about Rs 500, but Acer offers both in India so you can choose the laptop that best suits your needs. If you prefer a higher clock speed and 2 GB …



Acer Aspire One AOD 260 – With Broad Band


Rs 17999


The new Acer Aspire One AOD260 netbook is available with a built-in broadband Internet connection from TATA Photon in selected cities. The netbook is being released through TATA offices in Delhi and Mumbai first and will be expanded to other cities.

Acer Aspire One AOD260 specifications:

* 10.1" LED-backlit screen with 16:9 aspect ratio
* Intel Atom N450 CPU
* 1.3-megapixel camera
* 1 GB DDR2 RAM
* 160 GB HDD at 5400 RPM
* 6-cell battery with 8 hours of life
* Windows 7 Starter operating system
* 3 USB 2.0 ports, Bluetooth connectivity and built-in WiFi



Acer Aspire 4745 – Linux



Price: Rs 32000

This Acer Aspire 4745 laptop is the same as the one that comes with Windows 7; most of the hardware configuration and the build are identical. This one, however, ships with a Linux operating system and a slightly less powerful processor, and the difference is minimal. If you plan to install an operating system yourself, this is the right choice and offers great savings. At the time of writing there are no Windows XP or Windows Vista drivers available for this laptop model, so the only options left to you are Windows 7 and Linux.

Acer Aspire 4745 Core i3-350M specifications:

* Processor: Intel Core i3-350M (2.26 GHz, 3 MB L3 cache, 1066 MHz, 2 cores, 4 threads)
* Operating system: Linux
* Memory: 3 GB DDR3 RAM
* Screen: 14-inch HD LED
* Hard disk drive: 320 GB
* DVD writer, HDMI output port, VGA port, 3 USB 2.0 ports, audio in/out ports, multi-card …



Acer Aspire 5740G

Price:
Rs 38000 (Corei3,512MB VRAM)
Rs 44000 (Corei5,1GB VRAM)

The Acer Aspire 5740G is a 15.6-inch laptop from Acer with a dedicated ATI Radeon graphics card and a good 16:9 display. It comes at two prices: one with 512 MB of graphics memory and the other with 1 GB. For day-to-day office work and web browsing you will not notice a great performance difference between the two graphics options. As this is a gaming laptop, gamers will love a machine that fits their budget and delivers great gaming performance. It comes with a wide set of ports, including HDMI; the missing part is a combo USB/eSATA port. WiFi range and data-transfer speed are really good thanks to Acer's SignalUp technology. In total, this is a great laptop for gaming and for watching HD videos in a mid-sized room with three or four people, without any external speakers.



Acer Aspire 4740
Price:
Rs 31000 (Linux)
Rs 35000 (Windows7 HP)

This Acer Aspire 4740 laptop comes in two price specs: one with a Linux operating system and the other with Windows 7 Home Premium. It is a budget laptop from Acer powered by a Core i3-330M processor with a 14.1-inch HD LED display. With side shortcut buttons and good battery life within its price range, this Acer laptop is one we highly recommend to our customers.


Acer eMachine e732Z

Price: Rs 22500

The Acer eMachine e732Z is the cheapest 15.6-inch laptop with a dual-core processor and a Linux operating system. It comes with 1 GB of RAM and a 250 GB HDD. A multi-card reader, HDMI port, DVD writer, WiFi and Gigabit LAN are the other important features. It is the best deal from Acer now available in India.


Aspire One Laptop-AOD250
 



Price: Rs 19000

This Acer Aspire One mini laptop, model AOD250, comes with a 1.66 GHz Atom N280 processor. The hard drive capacity is 160 GB and the display is 10.1 inches wide. It comes with a built-in multi-card reader, 10/100 Ethernet and a Crystal Eye webcam.

Computers, Printers, Scanners and more - Internet Connections & Networking Solutions - Upgrades & Data Backup Solutions

Smart Computers are proud to have been providing software and hardware solutions to clients in the Sheffield and South Yorkshire area since 1988. We are a Microsoft Small Business Specialist.

XP Pro / Vista systems from £259.00 + VAT

You'll find lots more offers on computer systems, software, hardware and peripherals in our "Latest Offers" section.
Go to the latest offers on Desktops. All offers are subject to change without notice. Please check that any offer indicated on this site is still current and that stock is available prior to ordering.

Linux Hosting Made Clear



Instead of using Microsoft Windows-based technology to run a website, many people rely on Linux hosting as a very good alternative. As most of you probably know by now, the code that makes Linux run is publicly available, which means it is an open-source operating system. This is why people all over the world can make the system better and better every day. Windows is indeed easier to operate than Linux, but the chances of failure with Linux are much smaller than with the Windows operating system. This is what makes Linux very useful for running websites.

There are several technologies currently available on the market, all of them used for Linux hosting. The first technology worth your time is PHP. This is a server-side language, meaning the program runs on the server, that is, the computer that physically stores the website, rather than on the computer being used to view the page. PHP is a programming language that produces dynamic web pages: pages that can change in appearance and content in response to something the person viewing them does, for instance filling in a form.
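
To make the idea of a server-side dynamic page concrete, here is a minimal sketch, written in Python rather than PHP so that every example in this post stays in one language. It greets whoever is named in the submitted form data; the port number and the "name" parameter are purely illustrative choices.

    # A tiny server-side dynamic page: the response changes based on what the
    # visitor submits in the query string (e.g. /?name=Alice).
    from wsgiref.simple_server import make_server
    from urllib.parse import parse_qs

    def app(environ, start_response):
        query = parse_qs(environ.get("QUERY_STRING", ""))
        name = query.get("name", ["visitor"])[0]          # value from the form
        body = f"<html><body><h1>Hello, {name}!</h1></body></html>".encode()
        start_response("200 OK", [("Content-Type", "text/html")])
        return [body]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()        # serve on port 8000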


One programming language that has become very popular among programmers is Python. It is relatively sophisticated, but easier to understand than many other programming languages. Python also gives programmers the ability to make a draft alteration to a program and check the effects immediately, without having to make the change permanent. Python often uses common English words where other languages simply use symbols.
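
As a small, hedged illustration of that readability, the snippet below uses plain English words like "or" and "else" where other languages would use symbols, and it can be pasted into the interactive prompt, edited and re-run on the spot.

    # Python favours English words ('or', 'and', 'not') over symbols like '||'.
    def is_weekend(day):
        return day == "Saturday" or day == "Sunday"

    for day in ["Friday", "Saturday", "Sunday"]:
        if is_weekend(day):
            print(day, "is the weekend")
        else:
            print(day, "is a working day")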

MySQL is a database system used for websites. It can support many different features, for example letting visitors to a real estate company's site search for properties of a certain size and cost.
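
Here is a hedged sketch of that kind of search, using Python's built-in sqlite3 module as a stand-in for MySQL; the table, columns and figures are invented for illustration, but the SQL query is the same idea a MySQL-backed site would run.

    import sqlite3

    # In-memory database standing in for the site's MySQL server.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE properties (address TEXT, bedrooms INTEGER, price INTEGER)")
    conn.executemany("INSERT INTO properties VALUES (?, ?, ?)", [
        ("12 High St", 2, 150000),
        ("3 Park Lane", 4, 420000),
    ])

    # A visitor searches for properties of a certain size and cost.
    rows = conn.execute(
        "SELECT address, bedrooms, price FROM properties "
        "WHERE bedrooms >= ? AND price <= ?",
        (2, 200000),
    ).fetchall()
    print(rows)   # [('12 High St', 2, 150000)]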

A more flexible method of organizing the information that makes up a website is XML, the Extensible Markup Language. XML basically means that a website owner has much more control over the information on their site. With XML, any type of label can be used: in a page containing a recipe, for instance, a piece of text could be labeled as "ingredient" or "safety warning". With the old HTML system, each piece of information was only labeled for appearance, such as bold or italic. This is why XML is an obvious advance on the techniques that were previously used.
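
A minimal sketch of that recipe example follows, parsed in Python: the labels describe what each piece of text is rather than how it should look. The tag names are adapted from the paragraph above and are purely illustrative.

    import xml.etree.ElementTree as ET

    recipe_xml = """
    <recipe>
      <ingredient>2 eggs</ingredient>
      <ingredient>200 g flour</ingredient>
      <safety_warning>Contains gluten</safety_warning>
    </recipe>
    """

    root = ET.fromstring(recipe_xml)
    for ingredient in root.findall("ingredient"):
        print("Ingredient:", ingredient.text)
    for warning in root.findall("safety_warning"):
        print("Warning:", warning.text)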

So, no matter which technology you choose, the one thing that really matters is having a clear idea of what you intend to do.


Fixing quotaon issues on Linux-based hosting servers

As cPanel has become the market leader in the hosting business, we see a few issues come up daily.

Every seasoned Linux web-hosting sysadmin has heard of quotaon, a common Linux utility that helps the kernel enforce disk limits.

While the box is new and empty, you won't have major problems with quotas. But as soon as it fills up, say with 200 hosting accounts, problems begin to appear, such as quotaon causing very high load, I/O waits and, consequently, downtime for your users.

An easy way to ease this problem is to set "Safe quotas" to Off in WHM's "Tweak Settings". But why would this tiny tool, quotaon, cause such trouble? Quotaon scans your whole hard disk and rebuilds all the quotas every time a user or reseller's quota changes. Imagine you have a 400 GB hard disk with 50% of the space used: scanning the entire disk, smaller or bigger, can bring your system to a standstill.

Usually, the biggest directory is /var. It holds config files, temp files, spool, MySQL databases and more. Sysadmins recommend mounting /var on a separate partition with quotas disabled for that filesystem in fstab. This disables quotas under /var and keeps the server from scanning /var to rebuild user quotas.
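
Before repartitioning anything, it helps to check how /var is currently mounted. Below is a rough Python sketch, assuming a Linux box, that reads /proc/mounts and reports whether /var is a separate mount and whether the usual quota mount options are enabled on it.

    # Check whether /var is its own mount and whether usrquota/grpquota are set.
    def mount_options(target="/var"):
        with open("/proc/mounts") as f:
            for line in f:
                device, mountpoint, fstype, options = line.split()[:4]
                if mountpoint == target:
                    return options.split(",")
        return None   # target is not a separate mount

    opts = mount_options("/var")
    if opts is None:
        print("/var is not on its own partition")
    elif "usrquota" in opts or "grpquota" in opts:
        print("/var is mounted with quota options enabled")
    else:
        print("/var is mounted without quota options")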


All-Computer-Brands
There are many choices of computer brands widely available. Dell, Toshiba and Sony are among the many, but which one do I choose? Which is the one for me? Computer brands come in many ranges, from types and models to prices and sizes, and picking between them can be a daunting task, especially if you're a first-time buyer. To ease the selection, it is highly recommended to first determine what your daily activities, needs and wants are when considering a computer. Do you need the computer only for school and work? Do you want the computer only for entertainment, such as playing music and games and watching movies? Do you want a desktop computer to keep at home or a laptop computer to suit your lifestyle and daily travels? Once you have determined the major aspects of your computing, the brands' hardware and specifications should be the next consideration.

Which hardware components and specifications fit and suit you varies from brand to brand. As a guideline, there are three vital hardware components in any computer: the CPU, the RAM and the hard drive. The CPU determines the speed of your computer. Avid computer gamers will need the highest-end CPU available, whereas a basic user who only surfs the Internet and checks email can rely on an entry-level CPU. The RAM is your computer's memory: the more RAM, the more applications and programs can be installed and run simultaneously. A high amount of RAM is most commonly needed by multitaskers who like to run various applications, such as surfing the Internet while playing movies and music, at the same time.


The hard drive is the storage of the computer. If you plan on storing a lot of files, such as media files, do not hesitate to opt for a large hard drive. Knowing your computing needs and wants, and determining which hardware specifications best suit them, will ease your selection among computer brands. There are also specific computer models suited just for gaming, entertainment, work, travel or all-purpose computing from every brand, which helps in picking the right one when considering a computer.

Latest Inventions in Computers

The following are some of the new inventions:

1. Blu-ray discs: Blu-ray discs are set to become popular in the coming years because they can store roughly 25 to 50 GB of data.
2. DNA microchips: DNA microchips may replace traditional silicon microchips in the future and make computing faster and more reliable.

Some other developments include better graphics and speed in computers:

3. Motherboards that can take two processors
4. Water-cooled systems
5. Oil-submerged computers
6. Chipsets that link high-performance video cards and processors together for improved performance or reduced energy consumption

One of the more recent developments in actual computer components is the 45 nm processor. This refers to a manufacturing process in which the smallest features on the chip are only 45 nm across; the previous generations were 90 nm and 65 nm. The finer process allows the processor to run cooler and be more efficient.

Actually, a smaller processor will not necessarily run cooler, as it does more work in less space and may need much better cooling. I don't know if this applies to computers, but the most recent invention is the Blu-ray disc, which holds 25 GB of space on each layer, and two-layer Blu-ray discs are now being made.

Latest Product


When users register with AirSet, they each get a Personal Cloud Computer that helps manage their personal life. The personal cloud computer helps individuals keep track of personal calendars and contacts on the web and across different devices (cell phones, PDAs, desktops, etc.). It also makes it possible for an individual to access his or her files anywhere, anytime.
To manage the other groups in one's life, like family, work and community, a user can add as many Group Cloud Computers as needed to their AirSet network. A Group Cloud Computer comes with all the apps and features of a Personal Cloud Computer, but it also helps members of a group find out what's going on in the group, make plans together, and contribute their input to the group.
Once a user sets up more than one cloud computer in their network, all of these cloud computers are networked together. The user can band together with other individual users in a number of groups of varying degrees of permanence. When the user leaves one group, he or she may lose membership of that particular group but retains all other relationships.
One comment in the AirSet user community forum compared AirSet with other competitors. Creating new groups in Google Apps, Zoho Business, Blackboard, and Moodle follows a pyramid structure: if anyone wants to create a new group, they have to ask the person above them in the hierarchy. AirSet, on the other hand, is more like "an ideal 18th century town. Everyone is queen of her own castle, and people join together in different overlapping groups, needing nobody else's permission. I am the head teacher in the school and I rule the school group. Jeff is the head volunteer fireman and I am a member of his group. And so on. Importantly, groups can come and go. Some of us decide to hold a summer fair. The originator of the idea starts a temporary group. We join. The fair is held. The task is done and the group is disbanded. In this model there is no King, and groups are tools through which people come together for as long as necessary – and groups interact and overlap as needed."



Cloud Computing


Cloud computing is location independent computing, whereby shared servers provide resources, software, and data to computers and other devices on demand, as with the electricity grid. Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture and utility computing. Details are abstracted from consumers, who no longer have need for expertise in, or control over, the technology infrastructure "in the cloud" that supports them.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on the Internet, and it typically involves over-the-Internet provision of dynamically scalable and often virtualized resources. It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the Internet. This frequently takes the form of web-based tools or applications that users can access and use through a web browser as if it were a program installed locally on their own computer.


The National Institute of Standards and Technology (NIST) provides a somewhat more objective and specific definition here. The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents. Typical cloud computing providers deliver common business applications online that are accessed from another Web service or software like a Web browser, while the software and data are stored on servers.

Most cloud computing infrastructures consist of services delivered through common centers and built on servers. Clouds often appear as single points of access for consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers, and typically include service level agreements (SLAs).[9] The major cloud service providers include Amazon, Rackspace Cloud, Salesforce, Skytap, Microsoft and Google. Some of the larger IT firms that are actively involved in cloud computing are Huawei, Cisco, Fujitsu, Dell, Red Hat, Hewlett Packard, IBM, VMware, Hitachi and NetApp.

Comparisons
Cloud computing derives characteristics from, but should not be confused with:

Autonomic computing — "computer systems capable of self-management"
Client–server model – client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients)
Grid computing — "a form of distributed computing and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks"
Mainframe computer — powerful computers used mainly by large organizations for critical applications, typically bulk data-processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.
Utility computing — the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity";
Peer-to-peer – distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model)
Service-oriented computing – Cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing consists of the computing techniques that operate on software-as-a-service

Characteristics


 
The fundamental concept of cloud computing is that the computing is "in the cloud" i.e. the processing (and the related data) is not in a specified, known or static place(s). This is in opposition to where the processing takes place in one or more specific servers that are known. All the other concepts mentioned are supplementary or complementary to this concept.

Generally, cloud computing customers do not own the physical infrastructure, instead avoiding capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed, whereas others bill on a subscription basis. Sharing "perishable and intangible" computing power among multiple tenants can improve utilization rates, as servers are not unnecessarily left idle, which can reduce costs significantly while increasing the speed of application development. A side-effect of this approach is that overall computer usage rises dramatically, as customers do not have to engineer for peak load limits. In addition, "increased high-speed bandwidth" makes it possible to receive the same. The cloud is becoming increasingly associated with small and medium enterprises (SMEs) as in many cases they cannot justify or afford the large capital expenditure of traditional IT. SMEs also typically have less existing infrastructure, less bureaucracy, more flexibility, and smaller capital budgets for purchasing in-house technology. Similarly, SMEs in emerging markets are typically unburdened by established legacy infrastructures, thus reducing the complexity of deploying cloud solutions.
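
As a toy illustration of the utility-billing point above, here is a hedged sketch comparing a metered, pay-per-use bill with the up-front cost of provisioning in-house servers for peak load. All the figures (rates, hours, server counts) are invented purely for illustration.

    # Pay-per-use versus buying capacity sized for peak load (made-up numbers).
    hourly_rate = 0.10        # price per server-hour on demand
    hours_used = 6000         # metered usage over a year

    peak_servers = 10         # in-house capacity sized for peak load
    server_cost = 1500        # up-front cost per in-house server

    cloud_bill = hourly_rate * hours_used
    capex_outlay = peak_servers * server_cost

    print(f"Metered cloud bill for the year: ${cloud_bill:,.2f}")
    print(f"Up-front cost of peak-sized in-house capacity: ${capex_outlay:,.2f}")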

Architecture

Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over application programming interfaces, usually web services. This resembles the Unix philosophy of having multiple programs each doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.

The two most significant components of cloud computing architecture are known as the front end and the back end. The front end is the part seen by the client, i.e. the computer user. This includes the client’s network (or computer) and the applications used to access the cloud via a user interface such as a web browser. The back end of the cloud computing architecture is the ‘cloud’ itself, comprising various computers, servers and data storage devices.

History

The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government and community forms was thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.

The actual term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to utilize their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider from that of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure. The first scholarly use of the term “cloud computing” was in a 1997 lecture by Ramnath Chellappa.

Amazon played a key role in the development of cloud computing by modernizing their data centers after the dot-com bubble, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Service (AWS) on a utility computing basis in 2006.

In 2007, Google, IBM and a number of universities embarked on a large-scale cloud computing research project.[34] In early 2008, Eucalyptus became the first open source AWS API-compatible platform for deploying private clouds. Also in early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open source software for deploying private and hybrid clouds and for the federation of clouds. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them"[36] and observed that "[o]rganisations are switching from company-owned hardware and software assets to per-use service-based models", so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas".


Key features

Agility improves with users' ability to rapidly and inexpensively re-provision technological infrastructure resources.
Application Programming Interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs (see the sketch after this list).
Cost is claimed to be greatly reduced and capital expenditure is converted to operational expenditure. This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).
Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load-levels)
Utilization and efficiency improvements for systems that are often only 10–20% utilized.
Reliability is improved if multiple redundant sites are used, which makes well designed cloud computing suitable for business continuity and disaster recovery. Nonetheless, many major cloud computing services have suffered outages, and IT and business managers can at times do little when they are affected.
Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads. Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface. One of the most important new methods for overcoming performance bottlenecks for a large class of applications is data parallel programming on a distributed data grid.
Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible. Furthermore, the complexity of security is greatly increased when data is distributed over a wider area and / or number of devices.
Maintenance of cloud computing applications is easier, since they don't have to be installed on each user's computer. They are easier to support and to improve since the changes reach the clients instantly.
Metering means that cloud computing resources usage should be measurable and should be metered per client and application on a daily, weekly, monthly, and yearly basis.
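
As a concrete, hedged illustration of the API point in the list above, here is a minimal Python sketch of REST-style access to a cloud service using only the standard library. The endpoint URL, the bearer token and the JSON fields are invented for illustration and do not belong to any real provider's API.

    import json
    import urllib.request

    def list_instances(api_base="https://cloud.example.com/v1", token="YOUR_TOKEN"):
        # Plain HTTPS GET against a (hypothetical) RESTful resource collection.
        req = urllib.request.Request(
            f"{api_base}/instances",
            headers={"Authorization": f"Bearer {token}",
                     "Accept": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # for instance in list_instances()["instances"]:
    #     print(instance["id"], instance["status"])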

Application
 
 

Cloud application services, or "Software as a Service" (SaaS), deliver software as a service over the Internet, eliminating the need to install and run the application on the customer's own computers and simplifying maintenance and support. People tend to use the terms 'SaaS' and 'cloud' interchangeably, when in fact they are two different things. Key characteristics include:

Network-based access to, and management of, commercially available (i.e., not custom) software
Activities that are managed from central locations rather than at each customer's site, enabling customers to access applications remotely via the Web
Application delivery that typically is closer to a one-to-many model (single instance, multi-tenant architecture) than to a one-to-one model, including architecture, pricing, partnering, and management characteristics
Centralized feature updating, which obviates the need for downloadable patches and upgrades.

Wednesday, January 12, 2011

Super Computers


A supercomputer is a computer at the frontline of current processing capacity, particularly speed of calculation. Supercomputers introduced in the 1960s were designed primarily by Seymour Cray at Control Data Corporation (CDC), and led the market into the 1970s until Cray left to form his own company, Cray Research. He then took over the supercomputer market with his new designs, holding the top spot in supercomputing for five years (1985–1990). In the 1980s a large number of smaller competitors entered the market, in parallel to the creation of the minicomputer market a decade earlier, but many of these disappeared in the mid-1990s "supercomputer market crash".

Cloud computing holds plenty of promise—one key to success is planning ahead.


Cloud computing is an umbrella term used loosely to describe the ability to connect to software and data via the Internet (the cloud) instead of your hard drive or local network. The following story is the second in a three-part series—aimed at helping IT decision-makers break through the hype to better understand cloud computing and its potential business benefits.

As more and more businesses consider the benefits of cloud computing, IT leaders are suggesting that the best strategy for this emerging technology model is a thoughtful plan that weighs its impact in all corners of the business.

Because cloud computing is a still-fledgling model, vendors and standards bodies are busy sorting through definitions and interoperability issues. That aside, businesses can be certain that cloud computing, at its core, is an outsourced service and outsourcing by its very nature implies potential risks.

Cloud computing can offer real benefits, including lower data center and overall IT costs, streamlined operational efficiency, and a pathway to the latest technology. But that doesn't mean it's the best path for every company or every application.

Outlined below, Lori MacVittie, technical marketing manager at Seattle-based application delivery provider F5 Networks, and William Penn, chief architect at Detroit-based on-demand platform provider Covisint, weigh in on key risk issues companies should consider before making the move to cloud computing.

1. Lack of planning. The biggest risk is not having a roadmap, says Penn. Companies need to understand how external services fit into their enterprise as a whole.

"It can be a struggle for some people to have the vision to incorporate outside services into their business plan," says Penn. "But the outside network, the cloud, needs to be part of that roadmap."

2. Integration challenges. Most businesses aren't moving all of their applications to the cloud, and probably never will, and this causes data integration challenges. Penn says it's important to remember that it isn't just hardware or software that needs integration, but also processes, problem resolution, and employee interaction with data and systems.

3. Security concerns. Security is top of mind for IT executives—both the physical security of the data center, as well as the intervening network and security of the data itself. Data in the cloud is housed and accessed via an offsite server owned by a third party. Companies need to carefully consider the security and liability implications for proprietary data and overall business models.

4. Compliance guidelines needed. Cloud providers haven't yet addressed various industry standards such as HIPAA or Sarbanes-Oxley, so companies with strict compliance or audit constraints are less likely to be able to use external applications.

5. Lack of technology standards. For now, there are no technology industry standards for coordination within and among data centers or vendors. Technology industry leaders are still debating the definition of cloud computing itself, so it will take some time before any standards are set.


Microsoft is proud that a system running Windows HPC Server 2008 took 10th place... behind nine supercomputers running Linux. Even then, this was really more of a stunt than a demonstration that the HPC Server system is ready to compete with the big boys.

The Roadrunner does have competition now though. The Cray XT Jaguar also recently busted the petaflop wall. The Cray also, of course, runs Linux. In the XT's case, it's running CNL (Compute Node Linux). CNL is based on SUSE Linux.

Needless to say, all the Linux systems do have working parallel-capable compilers, such as GCC, PGI and PathScale. For now, and for the foreseeable future, Linux machines will not only remain the fastest computers, they'll also be the most useful fast computers.

Super Computers


A computer is a machine that manipulates data according to a list of instructions.
The first devices that resemble modern computers date to the mid-20th century (around 1940 - 1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers.[1] Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space.[2] Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices that are used to control other devices — for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and children's toys.
The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.

The era in which computing power doubles every two years is drawing to a close, according to the man behind Moore's Law, writes Jonathan Richards.

For decades it has been the benchmark by which advances in computing are measured. Now Moore's Law, the maxim which states that computers double in speed roughly every two years, has come under threat from none other than the man who coined it. Gordon Moore, the retired co-founder of Intel, wrote an influential paper in 1965 called 'Cramming more components onto integrated circuits', in which he theorised that the number of transistors on a computer chip would double at a constant rate. Silicon Valley has kept up with his widely accepted maxim for more than 40 years, to the point where a new generation of chips, which Intel will begin to produce next year, will have transistors so tiny that four million of them could fit on the head of a pin. In an interview yesterday, however, Mr Moore said that by about 2020 his law would come up against a rather intractable stumbling block: the laws of physics.

"Another decade, a decade and a half, I think we'll hit something fairly fundamental," Mr Moore said at Intel's twice-annual technology conference. Then Moore's Law will be no more.

Mr Moore was speaking as Intel gave its first demonstration of a new family of processors, to be introduced in November, which contain circuitry 45 nanometres (billionths of a metre) wide. The 'Penryn' processors, 15 of which will be introduced this year, with another 20 to follow in the first quarter of 2008, will be so advanced that a single chip will contain as many as 820 million transistors.

Computer experts said today that a failure to live up to Moore's Law would not limit the ultimate speed at which computers could run. Instead, the technology used to manufacture chips would shift. The current method of silicon-based manufacturing is known as "bulk CMOS", which is essentially a top-down approach, where the maker starts with a piece of silicon and etches out the parts that aren't needed. "The technology which will replace this is a bottom-up approach, where chips will be assembled using individual atoms or molecules, a type of nanotechnology," said Jim Tully, chief of research for semiconductors at Gartner, the analyst firm. "It's not standardised yet - people are still experimenting - but you might refer to this new breed of chips as 'molecular devices'."

Anthony Finkelstein, head of computer science at University College London, said, however, that a more pressing problem in the meantime was to write programs that take full advantage of existing technologies. "It's all very well having multicore chips in desktop machines, but if the software does not take advantage of them, you gain no benefit. We are hitting the software barrier before we hit the physical barrier," he said.

Mr Moore, who is 78, pioneered the design of the integrated circuit and went on to co-found Intel in 1968, where he served as chief executive between 1975 and 1987.

History of computing hardware
The history of computer hardware encompasses the hardware, its architecture, and its impact on software. The elements of computing hardware have undergone significant improvement over their history. This improvement has triggered worldwide use of the technology, as performance has improved and prices have declined.


[1] Computers are accessible to ever-increasing sectors of the world's population. [2] Computing hardware has become a platform for uses other than computation, such as automation, communication, control, entertainment, and education. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements. [3] The von Neumann architecture unifies our current computing hardware implementations. [4] Since digital computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.

The major elements of computing hardware implement abstractions: input, [5] output, [6] memory, [7] and processor. A processor is composed of control [8] and data path. [9] In the von Neumann architecture, control of the data path is stored in memory. This allowed control to become an automatic process; the data path could be under software control, perhaps in response to events. Beginning with mechanical data paths such as the abacus and astrolabe, the hardware first started using analogs for a computation, including water and even air as the analog quantities: analog computers have used lengths, pressures, voltages, and currents to represent the results of calculations. [10] Eventually the voltages or currents were standardized, and then digitized. Digital computing elements have ranged from mechanical gears, to electromechanical relays, to vacuum tubes, to transistors, and to integrated circuits, all of which are currently implementing the von Neumann architecture.