Cloud Principles: Apple iCloud

iCloud is a cloud storage service that Apple launched on October 12, 2011. As of July 2013, the service had 320 million users. It gives users a way to store data such as documents, photos, and music on remote servers for download to iOS, Macintosh, or Windows devices, to share and send data to other users, and to manage or locate their Apple devices if they are lost or stolen. The service also provides a way to back up iOS devices wirelessly, directly...

Cloud Database Technology

A cloud database is a database that typically runs on a cloud computing platform, such as Amazon EC2, GoGrid, Salesforce.com, Rackspace, or Microsoft Azure. There are two common deployment models: users can run databases on the cloud independently, using a virtual machine image, or they can purchase access to a database service maintained by a cloud database provider. Of the databases available on the cloud, some are SQL-based and some...

Thursday, December 22, 2016

'Father' Of The Internet

Tim Berners-Lee, while working as an independent consultant at a nuclear research laboratory in 1980, developed an innovative way of storing information in a program named Enquire.

That work was later used as the foundation for the development of a global hypertext system - popularly known as the Internet or the World Wide Web.

The WWW was developed to increase the ease with which people could exchange information. This became a reality with the introduction of the first WYSIWYG (What You See Is What You Get) hypertext web browser, which was written by Tim Berners-Lee.

The advantage of the WWW over previous systems was that it did not need a centralized server. In short, this meant that it was just as easy to retrieve, or link to, a document down the hall as one across the world.

This was a huge breakthrough in computing science.

The Web and the first web server were released to the hypertext communities in mid 1991, after being released within CERN in late 1990. In order to achieve a coherent standard for the WWW, specifications for URLs, HTML and HTTP were published.

The universality enforced by these specifications, the lack of dependence on a central server, and Berners-Lee's decision not to profit from the WWW led to a high level of adoption of the technology between 1991 and 1994. A tenfold increase in annual traffic was recorded on the first Web server during this period.

With the advent of the Web, a number of spin-off technologies have emerged. A vast array of server-side, client-side and database languages have been created to fulfill the needs of businesses and individuals.

There are two types of programming languages used on the WWW: client-side and server-side.

A client-side language is executed in the user's browser and is not dependent upon the Web server. Client-side programming is done almost exclusively in JavaScript.

A server-side language executes on the Web server. In recent years server-side programming has become more popular than client-side programming because it is independent of the type of browser that the surfer is using. Programmers refer to this as being 'cross-platform'. Perl, PHP, ASP and JSP are popular server-side programming languages.
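
As a minimal illustration of the distinction (this sketch uses Python's standard wsgiref module rather than the languages named above, purely to keep it self-contained), a server-side handler executes on the server and returns plain HTML that any browser can render:

    # A minimal server-side sketch: the code runs on the server and
    # returns ordinary HTML, so it works regardless of the visitor's browser.
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        # Everything here executes on the server, not in the browser.
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><body><h1>Hello from the server</h1></body></html>"]

    if __name__ == "__main__":
        make_server("localhost", 8000, app).serve_forever()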

Databases have been developed to allow for 'dynamic' websites.

Dynamic websites allow for a high level of personalization when retrieving information.

Whenever you type in values in a form on a web page - whether those values are for a user id and password, the characteristics of your ideal partner or an author's name - it's a 'dynamic' web site. That is just a way of saying that there is a database being used to run the website.

Popular databases used include MySQL, PostgreSQL, Microsoft SQL Server and Oracle.
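
As a rough sketch of what happens behind such a form (Python's built-in sqlite3 stands in for any of the databases above; the table and column names are made up for illustration), the value the visitor types becomes a parameter in a database query:

    import sqlite3

    # Hypothetical schema purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE books (author TEXT, title TEXT)")
    conn.execute("INSERT INTO books VALUES ('Tim Berners-Lee', 'Weaving the Web')")

    def search_by_author(author):
        # The form value is passed as a parameter (never pasted into the SQL
        # string), which is what makes the page 'dynamic' and keeps it safe.
        rows = conn.execute(
            "SELECT title FROM books WHERE author = ?", (author,)
        ).fetchall()
        return [title for (title,) in rows]

    print(search_by_author("Tim Berners-Lee"))  # ['Weaving the Web']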

An area of the WWW in which Berners-Lee has direct involvement is his role as the Director of the World Wide Web Consortium (W3C), which has existed since 1994.

The aim of the W3C is to achieve coherent standards between all companies using web technologies such as HTML, CSS and XML. Prior to the creation of the standards detailed by the W3C, companies used differing standards, which led to potential incompatibilities. The W3C remedied this by creating an open forum, allowing companies to agree on core standards for WWW technologies.

The future of Berners-Lee's influence on modern computing lies in the Semantic Web. 'Semantic' means 'meaning'.

A semantic web is one where elements that appear in a document hold some meaning that can be automatically processed by a machine in some form of data gathering. Currently, documents on the WWW written in HTML hold no meaning; they're presentation-based.
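
A rough sketch of the difference (the vocabulary below is invented; real Semantic Web work uses RDF and shared ontologies): instead of presentation markup, facts are stored as subject-predicate-object triples that a program can query for meaning directly:

    # Invented example of machine-readable 'meaning' (real systems use RDF).
    triples = [
        ("Tim Berners-Lee", "invented", "World Wide Web"),
        ("World Wide Web", "released-in", "1991"),
    ]

    def query(subject, predicate):
        # A program can answer a question from the stored meaning directly,
        # which presentation-only HTML does not allow.
        return [obj for (s, p, obj) in triples if s == subject and p == predicate]

    print(query("Tim Berners-Lee", "invented"))  # ['World Wide Web']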

Tim Berners-Lee laid the conceptual foundation for the World Wide Web. It was his initial idea to create a way for information to be freely and easily exchanged. The standards associated with it and the lack of reliance upon a central server gave the Web a cross-platform advantage and independence, which led to its meteoric rise in popularity.

In turn, its popularity spawned and popularized many different programming languages, databases, markup standards and servers, as well as viruses and worms.

However, while Tim Berners-Lee is the 'father' of the Internet, its development over the years is a result of the efforts of an extraordinary number of individuals. There is little question that 500 or 1,000 years from now, historians will look at the invention of the Internet as one of those rare, seminal historical events - much like Gutenberg's printing press.


...How Much Do You Charge For "X"?

This is a question that comes up a lot on sales calls and one that you want to handle with care. As I've stated in other posts, questions are always driven by thoughts and never happen by accident. There is always a "context" from which the questions come, and your ability to understand that context will improve your odds of developing the right answer. When I suggest that we work to create the "right" answer, I don't mean that we are trying to fool anyone. Frequently, when we are addressing questions, there are multiple answers, and we just want to make sure that we have a higher likelihood of picking the right one.

In relation to price questions, it is always important to answer the question "in context". So usually, in order to understand the "context" in which the question was asked, you'll need to ask more questions. Also, you'll often find that the question "how much do you charge" is really not the real question. Starting a dialog with the prospect about what they want will move them away from price and get you better information. Using a "reflector" or reverse will help you understand the real question.

Of all of the "reflectors" or reverses that we teach in relation to price, one of the simplest has turned out to be one of the best. When asked about price, try "...it depends". This simple phrase has an uncanny way of handling an awful lot of the price questions you'll get. Price often depends on a lot of things, like:

When do you need it?

How many do you need?

What kind do you need? (good, better, best?)

Another great reflector, particularly effective on the telephone for inside salespeople, is "while I'm looking it up, did you select that item for a reason?". Oftentimes prospects calling in for a price on an item hear it... and hang up. Engaging the prospect and getting better information will not only help you build rapport but also eliminate a lot of those "get a price and hang up" calls.

Talking about price before understanding what your prospect is trying to accomplish is sales suicide. Use some of these simple reflectors and you'll get better results!

Tuesday, December 6, 2016

Parallels Cloud Storage: Storage in the Cloud


Parallels Cloud Storage (PStorage) is a highly available distributed storage system (virtual SAN) with built-in replication and disaster recovery.

PStorage allows virtualization platforms to build storage on top of commodity hardware with local hard drives, uniting those drives into a storage cluster for virtualization scenarios using virtual machines (VMs) and/or containers (CTs). PStorage provides fast live migration of VMs and CTs across nodes without having to copy VM/CT data, since the highly available storage remains accessible remotely.

Advantages

The main PStorage features are listed below:
  • No special hardware requirements. Commodity hardware (SATA/SAS drives, 1 Gbit Ethernet and up) can be used to create the storage.
  • Strong consistency semantics. This makes PStorage suitable for running VMs and CTs on top of it, including over the iSCSI protocol (unlike object storage such as Amazon S3 or Swift).
  • Built-in replication.
  • Automatic disaster recovery on hard drive or node failure.
  • High availability. Data remains accessible even if a hard drive or node fails.
  • Optional SSD caching. SSD caches boost the overall cluster performance for both write and read operations.
  • Data checksumming and scrubbing. Checksumming and scrubbing greatly improve data reliability; see the sketch after this list.
  • Grows on demand. More storage nodes can be added to the cluster to increase its disk space. A VM/CT image size is not limited by the size of any single hard drive.
  • Petabyte scale.
  • More uniform use of hardware performance and capacity across nodes, putting otherwise idle nodes to work.
  • High performance, similar to a SAN.
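
As a rough sketch of the checksumming-and-scrubbing idea (not PStorage's actual on-disk format; the block layout and hash choice here are assumptions for illustration), each block is stored together with a checksum, and a background scrub re-reads blocks and flags any whose contents no longer match:

    import hashlib

    # Hypothetical store: block_id -> (data, checksum). Not PStorage's format.
    store = {}

    def put_block(block_id, data):
        # Save each block together with a checksum of its contents.
        store[block_id] = (data, hashlib.sha256(data).hexdigest())

    def scrub():
        # Background scrubbing: re-read every block, recompute its checksum,
        # and report blocks that no longer match (silent corruption).
        return [
            block_id
            for block_id, (data, checksum) in store.items()
            if hashlib.sha256(data).hexdigest() != checksum
        ]

    put_block("blk-1", b"hello world")
    store["blk-1"] = (b"hell0 world", store["blk-1"][1])  # simulate bit rot
    print(scrub())  # ['blk-1'] -> this block would be repaired from a replica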

Monday, December 5, 2016

Information Services and the Cloud: Dedicated Hosting


A dedicated hosting service, dedicated server, or managed hosting service is a type of Internet hosting in which the client leases an entire server not shared with anyone else. It is more flexible than shared hosting, as organizations have full control over the server(s), including choice of operating system, hardware, and so on. There is also another level of dedicated or managed hosting, called complex managed hosting. Complex managed hosting applies to physical servers, hybrid servers and virtual servers, with many companies choosing a hybrid (a combination of physical and virtual) hosting solution. There are many similarities between standard and complex managed hosting, but the key difference is the level of administrative and engineering support that the customer pays for, owing to the increased size and complexity of the infrastructure deployment.

The provider steps in to take over most of the administration, including security, memory, storage and IT support. The service is primarily proactive in nature. [1] Usually the server is managed by the hosting company as an added service. In some cases a dedicated server can offer less overhead and a larger return on investment. Dedicated servers are usually housed in data centers, similar to colocation facilities, providing redundant power sources and HVAC systems. In contrast to colocation, the server hardware is owned by the provider, which in some cases will also provide support for operating systems or applications. [citation needed]

Using a dedicated hosting service offers the benefits of high performance, security, email stability and control. Because of the relatively high price of dedicated hosting, it is mostly used by websites that receive a large volume of traffic.

Bandwidth and connectivity

Bandwidth refers to the data transfer rate, or the amount of data that can be carried from one point to another in a given period of time (usually a second), and is often expressed as a bit rate in bits per second (bit/s). Visitors to your server, website or applications consume this bandwidth, and providers measure and bill it in several ways, described below, including total transfer (measured in bytes transferred).

95th percentile method


Line speed, billed on the 95th percentile, refers to the speed at which data flows from the server or device, measured every 5 minutes for the month, with the top 5% of measurements dropped and the billing based on the highest remaining measurement. This is similar to a median measurement, which can be considered a 50th percentile (with 50% of measurements above the value and 50% below); the 95th percentile instead puts 5% of measurements above the value and 95% below it. This is also known as burstable billing. The transfer rate is measured in bits per second (or kilobits, megabits or gigabits per second).
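
As a small sketch of that arithmetic (the sample values are invented for illustration), the month's 5-minute throughput samples are sorted, the top 5% are dropped, and the bill is based on the highest remaining sample:

    def percentile_95(samples_mbps):
        # Sort the month's 5-minute throughput samples, drop the top 5%,
        # and bill at the highest remaining sample (the 95th percentile).
        ordered = sorted(samples_mbps)
        keep = int(len(ordered) * 0.95)  # samples at or below the billed rate
        return ordered[keep - 1]

    # A real month has ~8,640 samples; this tiny invented set shows the idea.
    samples = [2, 3, 3, 4, 5, 5, 6, 7, 8, 95]  # Mbit/s; one brief burst to 95
    print(percentile_95(samples))  # 8 -> the burst to 95 Mbit/s is not billed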

Unmetered method


A second way to measure bandwidth is unmetered service, where the provider caps or controls the top line speed instead of billing on usage. With unmetered billing, the maximum line speed is the total Mbit/s allocated to the server and configured on the switch. For example, if you buy 10 Mbit/s of unmetered bandwidth, the top line speed is 10 Mbit/s. A 10 Mbit/s cap lets the provider control the rate at which transfers take place, while giving the dedicated server owner the assurance of never being charged for bandwidth overages. Unmetered bandwidth services usually carry an additional charge.
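
The cap itself is enforced by the provider's network equipment; as an illustration only (a token-bucket sketch invented here, not any provider's actual shaping mechanism), a rate limiter admits traffic only as fast as tokens accumulate:

    import time

    class RateLimiter:
        """Token-bucket sketch of an unmetered speed cap (illustrative only)."""

        def __init__(self, rate_bits_per_s):
            self.rate = rate_bits_per_s
            self.tokens = rate_bits_per_s   # allow up to one second of burst
            self.last = time.monotonic()

        def allow(self, packet_bits):
            # Refill tokens for the time elapsed, then spend them if the
            # packet fits; otherwise the packet must wait (it is shaped).
            now = time.monotonic()
            self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if packet_bits <= self.tokens:
                self.tokens -= packet_bits
                return True
            return False

    cap = RateLimiter(10_000_000)   # a 10 Mbit/s unmetered line
    print(cap.allow(1500 * 8))      # True: a 1500-byte packet fits under the cap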

Total transfer method


Some providers bill on total transfer, which is a measure of the actual outgoing and incoming data, measured in bytes. Although it is generally the sum of all traffic to and from the server, some providers measure only outbound traffic (traffic from the server to the Internet).

Bandwidth pooling


This is a key mechanism by which hosting buyers can determine which provider offers the right pricing for bandwidth. [according to whom?] Hosting providers package dedicated bandwidth with a dedicated server for a monthly fee. Let us illustrate this with an example: a common provider offers a dedicated server for $100 per month with 2 TB of bandwidth included. Suppose you buy 10 servers; you then have the ability to consume 2 TB of bandwidth per server. However, suppose that given your application's architecture, only two of the 10 servers really face the web, while the rest are used for storage, search, database or other internal functions. A provider that allows bandwidth pooling lets you consume 20 TB in total of incoming or outgoing bandwidth (or both, depending on its policy) across all servers. With a provider that does not pool bandwidth, you could effectively use only 4 TB, and the remaining 16 TB would be unusable in practice. Hosting providers know this, and many reduce costs by offering amounts of bandwidth that will rarely be used in full. This is known as overselling, and it enables higher-usage customers to consume more than the pool could otherwise provide, because it is offset by customers who use less than their maximum allowance.

One reason for choosing outsourced hosting is the availability of high-powered networks from many providers. As dedicated server providers purchase huge amounts of bandwidth, they can obtain lower volume-based prices by blending multiple bandwidth providers. To get the same kind of multi-vendor network without blended bandwidth, a large investment in core routers, long-term contracts and expensive monthly bills would need to be in place. Building such a network without blended bandwidth does not make economic sense for most hosting customers.

Many dedicated server providers include a service level agreement (SLA) based on network uptime. Some offer a 100% uptime guarantee on their network. By securing connectivity through multiple vendors and using redundant hardware, providers are able to guarantee higher uptime, usually between 99% and 100%. Higher-quality providers tend to maintain quality links across multiple carriers, which provides significant redundancy if one connection drops, as well as potentially better routes to destinations.

Bandwidth billing has moved in recent years from a per-Mbit/s usage model to a per-GB model. Bandwidth was traditionally measured as line speed, with customers buying a given Mbit/s rate at a set monthly cost. As the shared hosting model developed, billing by GB (total bytes transferred) began to replace the Mbit/s line-speed model, and dedicated server providers therefore also began offering per-GB billing.

Major players in the dedicated server market offer large amounts of bandwidth, ranging from 500 GB to 3,000 GB, using the oversell model. It is not uncommon for major players to provide service with 1 terabyte (TB) of bandwidth or more.
Usage models based on byte-level measurement usually include a given amount of bandwidth with each server and a price per GB after a certain threshold has been reached. Expect to pay an additional charge for excess bandwidth usage. For example, if a dedicated server is allocated 3,000 GB of bandwidth per month and the client uses 5,000 GB in the billing period, the client is charged for the 2,000 GB of excess bandwidth. Each provider has a different billing model; there is no industry-wide standard.
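
A quick sketch of that billing arithmetic, using the numbers from the example above (the $100 base fee and $0.10/GB overage rate are invented, since every provider prices differently):

    def monthly_bill(base_fee, included_gb, used_gb, overage_per_gb):
        # Charge the base fee, plus a per-GB rate for transfer over the allowance.
        excess = max(0, used_gb - included_gb)
        return base_fee + excess * overage_per_gb

    # The example from the text: 3,000 GB included, 5,000 GB actually used.
    print(monthly_bill(100.0, 3000, 5000, 0.10))  # 300.0 -> 2,000 GB billed as excess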

Virtual Private Servers in the Virtualization Revolution

A virtual private server (VPS) is a virtual machine sold as a service by an Internet hosting provider. A VPS runs its own copy of an operating system, and customers have root-level access to that operating system instance, so they can install almost any software that runs on the OS. For many purposes a VPS is functionally equivalent to a dedicated physical server and, being defined in software, can be created and configured far more easily. A VPS is priced much lower than an equivalent physical server, but because it shares the underlying hardware with other VPSs, its performance may be lower and may depend on the workload of other instances on the same hardware node.

Virtualization

The force driving server virtualization is similar to the one that led to the development of time-sharing and multiprogramming in the past. Although the resources are still shared, as under the time-sharing model, virtualization provides a higher level of security, dependent on the type of virtualization used, since the individual virtual servers are mostly isolated from each other, may each run their own full-fledged operating system, and can be independently rebooted as a virtual instance.

Partitioning a single server so that it appears as multiple servers has been increasingly common on microcomputers since the launch of VMware ESX Server in 2001. The physical server typically runs a hypervisor, which is tasked with creating, releasing and managing the resources of "guest" operating systems, or virtual machines. These guest operating systems are allocated a share of the physical server's resources, typically in such a way that the guest is not aware of any resources other than those allocated to it by the hypervisor. A VPS runs its own copy of its operating system, and customers have superuser-level access to that operating system instance, so they can install almost any software that runs on the OS. Given the number of virtualization clients that typically run on a single machine, however, a VPS generally has limited processor time, RAM and disk space.

Although VMware and Hyper-V dominate enterprise virtualization, they are less frequently used by VPS providers, mainly because of cost; providers generally use products such as OpenVZ, Virtuozzo, Xen or KVM instead.


Hosting

Main article: Comparison of virtual hardware platforms

Many companies offer virtual private server hosting, or virtual dedicated server hosting, as an extension of their hosting services. There are several challenges to consider when licensing proprietary software in multi-tenant virtual environments.

With unmanaged or self-managed hosting, the customer is left to administer their own server instance.

Unmetered hosting is generally offered with no limit on the amount of data transferred on a fixed bandwidth line. Typically, unmetered hosting is offered at 10 Mbit/s, 100 Mbit/s or 1000 Mbit/s (with some as high as 10 Gbit/s). This means the customer is theoretically able to use about 3.33 TB per month at 10 Mbit/s, about 33 TB at 100 Mbit/s and about 333 TB at 1000 Mbit/s (though in practice the values will be significantly less). On a virtual private server this is shared bandwidth, and a fair usage policy should be involved. Unlimited hosting is also commonly marketed, but it is generally limited by acceptable usage policies and terms of service. Offers of unlimited disk space and bandwidth are always false, owing to cost, carrier capacity and technological limitations.
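
The theoretical monthly ceilings above follow from simple arithmetic: a line running flat out at its rated speed for a whole month transfers speed times seconds-in-month bits. A quick sketch (assuming a 30-day month and decimal terabytes):

    def max_monthly_transfer_tb(mbit_per_s, days=30):
        # Bits moved in a month at full line rate, converted to terabytes.
        bits = mbit_per_s * 1_000_000 * days * 24 * 3600
        return bits / 8 / 1e12  # bits -> bytes -> TB (decimal)

    for speed in (10, 100, 1000):
        print(speed, "Mbit/s ->", round(max_monthly_transfer_tb(speed), 2), "TB/month")
    # 10 Mbit/s -> 3.24 TB/month (close to the ~3.33 TB above, which assumes 31 days)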

Sunday, December 4, 2016

Cloud Gaming Technology

Cloud gaming, sometimes called gaming on demand, is a type of online gaming. There are currently two main types of cloud gaming: cloud gaming based on video streaming and cloud gaming based on file streaming. Cloud gaming aims to give end users frictionless, immediate ability to play games across various devices.

Types of cloud gaming


Gaming on demand is a term used to describe a form of online game distribution. The most common methods of cloud gaming today are video streaming (or "pixel streaming") and file streaming.

Video streaming


"Game on demand", also called "games on demand" is a kind of online games that allow live feeds and custom games on computers and mobile devices, similar to video on demand, thanks to the using a thin client. Are stored in the actual game, performed, and provided that the operator on the server company or remote game flowing video results directly on consumer computers across the Internet.  This allows access to games without the need for a controller and makes a great capacity of the user's computer is to a large extent, the server is a system that performs the processing needs.   and move the controls and pressing the button the user directly to the server, where they are registered, and the server sends back to the game controls the input response.

Companies that use this type of cloud gaming include PlayGiga, CiiNOW, Ubitus, Playcast Media Systems, Gaikai and OnLive.

A games-on-demand service takes advantage of a broadband connection, large server clusters, encryption and compression to stream game content to a subscriber. Users can play games without downloading or installing the actual game. Game content is not stored on the user's hard drive, and game code execution occurs primarily at the server cluster, so the subscriber can play the game on a less powerful computer than the game would normally require, since the server performs all the heavy processing that would usually be done by the end user's computer. Most cloud gaming platforms are closed and proprietary; the first open-source cloud gaming platform was not published until April 2013.
File streaming

Cloud gaming based on file streaming, also known as progressive download, deploys a thin client in which the actual game runs on the user's gaming device, such as a mobile device, a computer or a console. A small part of the game, usually less than 5% of its total size, is downloaded at the start so that the player can begin playing quickly. The remaining game content is downloaded to the end user's device while playing. This allows instant access to games even on low-bandwidth Internet connections, without delay. The cloud is used as the means of delivering the streamed game content and of analyzing the resulting large amounts of data.

Cloud gaming based on file streaming requires a device with the hardware capability to run the game. Often the downloaded game content is stored on the end user's device, where it is cached temporarily.
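
A sketch of the progressive-download idea (the 5% threshold comes from the text; the chunk scheme is invented for illustration): play begins once the initial slice is local, and the rest arrives in the background during play:

    # Toy progressive download: play starts once ~5% of the game is local.
    TOTAL_CHUNKS = 100       # pretend the game ships as 100 equal chunks
    START_THRESHOLD = 5      # roughly 5% of the total size, as described above

    downloaded = set()

    def download_chunk(i):
        downloaded.add(i)    # stands in for fetching one chunk from the cloud

    # Fetch the initial slice; the player can then start immediately.
    for i in range(START_THRESHOLD):
        download_chunk(i)
    print("playable:", len(downloaded) >= START_THRESHOLD)  # playable: True

    # The remaining ~95% streams down in the background during play.
    for i in range(START_THRESHOLD, TOTAL_CHUNKS):
        download_chunk(i)
    print("fully cached:", len(downloaded) == TOTAL_CHUNKS)  # fully cached: True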

Companies that use this type of cloud gaming include Kalydo, Approxy and SpawnApps.

Securing the Data Center: Don't Lose Money or Information

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning and fire suppression) and various security devices. Large data centers are industrial-scale operations that can use as much energy as a small town.

Security requirements for modern data centers

IT operations are a crucial aspect of most organizations' activities worldwide. One of the main concerns is business continuity: companies rely on their information systems to run their operations, and if a system becomes unavailable, company operations may be impaired or stopped completely. It is necessary to provide a reliable infrastructure for IT operations in order to minimize any chance of disruption. Information security is also a concern, which is why a data center has to offer a secure environment that minimizes the chances of a security breach. A data center must therefore keep high standards for assuring the integrity and functionality of its hosted computer environment. This is accomplished through redundancy of the mechanical cooling and electrical systems (including emergency power generators) serving the data center, along with fiber optic cabling.

The Telecommunications Industry Association's TIA-942 telecommunications infrastructure standard for data centers specifies the minimum requirements for telecommunications infrastructure of data centers and computer rooms, including single-tenant enterprise data centers and multi-tenant Internet hosting data centers. The topology proposed in that document is intended to be applicable to data centers of any size.

Telcordia GR-3160, NEBS Requirements for Telecommunications Data Center Equipment and Spaces, provides guidelines for data center spaces within telecommunications networks, and environmental requirements for the equipment intended for installation in those spaces. These criteria were developed jointly by Telcordia and industry representatives. They may be applied to data center spaces housing data processing or information technology (IT) equipment. The equipment may be used to: operate and manage a carrier's telecommunications network; provide data-center-based applications directly to the carrier's customers; provide hosted applications for a third party to provide services to their customers; or provide a combination of these and similar data center applications.

Effective data center operation requires a balanced investment in both the facility and the housed equipment. The first step is to establish a baseline facility environment suitable for equipment installation. Standardization and modularity can yield savings and efficiencies in the design and construction of telecommunications data centers.

Standardization means integrated building and equipment engineering. Modularity has the benefits of scalability and easier growth, even when planning forecasts are less than optimal. For these reasons, telecommunications data centers should be planned in repetitive building blocks of equipment, and associated power and support (conditioning) equipment when practical. The use of dedicated centralized systems requires more accurate forecasts of future needs to prevent expensive over-construction, or perhaps worse, under-construction that fails to meet future needs.

A "lights-out" data center, also known as a darkened or dark data center, is a data center that, ideally, has all but eliminated the need for direct access by personnel, except under extraordinary circumstances. Because staff do not need to enter the data center, it can be operated without lighting. All of the devices are accessed and managed by remote systems, with automation programs used to perform unattended operations. In addition to the energy savings, reduced staffing costs and the ability to locate the site further from population centers, implementing a lights-out data center reduces the threat of malicious attacks upon the infrastructure.
There is a trend to modernize data centers in order to take advantage of the performance and energy efficiency of newer IT equipment and capabilities, such as cloud computing. This process is known as data center transformation.

Organizations are experiencing rapid IT growth, but their data centers are aging. Industry research company International Data Corporation (IDC) puts the average age of a data center at nine years. Gartner, another research company, says data centers older than seven years are obsolete.

In May 2011, data center research organization Uptime Institute reported that 36 percent of the large companies it surveyed expected to exhaust IT capacity within the next 18 months.

Data center transformation takes a step-by-step approach through integrated projects carried out over time. This differs from the traditional method of data center upgrades, which takes a serial and siloed approach. The typical projects within a data center transformation initiative include standardization/consolidation, virtualization, automation and security.

Standardization/consolidation: The purpose of this project is to reduce the number of data centers a large organization may have. It also helps reduce the number of hardware and software platforms, tools and processes within a data center. Organizations replace aging data center equipment with newer equipment that provides increased capacity and performance, and standardize their computing, networking and management platforms so they are easier to manage.

Virtualization: There is a trend to use IT virtualization technologies to replace or consolidate multiple data center devices, such as servers. Virtualization helps to lower capital and operational expenses and reduce energy consumption. Virtualization technologies are also used to create virtual desktops, which can then be hosted in data centers and rented out on a subscription basis. Data released by the investment bank Lazard Capital Markets reported that 48 percent of enterprise operations would be virtualized by 2012. Gartner views virtualization as a catalyst for modernization.

Automation: Data center automation involves automating tasks such as provisioning, configuration, patching, release management and compliance. As enterprises suffer from a shortage of skilled IT workers, automating tasks makes data centers run more efficiently.

Security: In modern data centers, data security requires the integration of virtual security systems with the existing physical security infrastructure. The security of a modern data center must take into account physical security, network security, and data and user integrity.