Advocating The Use Of Flash Storage

You have likely used flash storage at some point in your life, whether browsing the web on a laptop, listening to music on an MP3 player or texting someone on your smartphone. In fact, any device with flash chips that can serve as a storage repository is loosely termed flash storage. It can be anything from a USB device to an all-flash storage array. Flash storage was initially used only to augment traditional media, complementing hard disk drives as a cache, but since then flash technology has become faster and more reliable. Solid-state drives (SSDs), fully integrated circuit boards containing multiple flash chips, are now designed to replace spinning media for hosting the data sets of key applications. With enterprise data storage needs continuing to rise, it may be time to revisit the flash versus disk-based storage debate. Let us take a serious look at the benefits of an all-flash infrastructure for data centers.

What Is Flash Memory?

A traditional hard drive uses rotating platters and heads to read data from a magnetic medium, whereas flash storage uses a type of non-volatile semiconductor memory to store data. This eliminates the rotational delay and seek time of spinning disks, improving performance. Toshiba and Intel first produced flash memory devices back in the 1980s, based on electrically erasable programmable read-only memory (EEPROM) technology, which allowed data to be written and deleted electronically in a flash.

Flash uses a trapped charge on non-volatile memory chips to store data, so the data is retained even when power is removed. The primary technologies used for flash storage are single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), 3D flash and variants on each of them. Flash memory capacity has doubled roughly every year since its introduction, and by some estimates the cost per bit has fallen by a factor of 200,000, which is why enterprises are now looking at flash as a complete replacement for hard drive arrays.

Choosing An All-Flash Array

An all-flash array is a storage system built wholly around solid-state drives or flash memory with very high I/O capability, instead of traditional hard disk drives (HDDs). Flash provides fast, reliable and consistent performance with low latency and high data transfer rates. Flash storage devices are now available with capabilities of 500,000 to one million input/output operations per second (IOPS) in a single footprint, with latency of one millisecond or less. They can stand up to the demands of high-performance applications for instant access and low latency.

Adding SSDs to existing hardware platforms may outperform a hard drive-based array, but it does not make best use of flash storage's capabilities. Traditional storage arrays optimize hard drive performance through caching and intelligent algorithms that work around the physics of accessing spinning media. All-flash arrays, by contrast, are designed from the ground up for the unique characteristics of flash media: they provide enough back-end bandwidth to keep up with solid-state devices, plan for wear leveling across all devices and build in multi-layer redundancy in case of failure.

Operational Advantages Of All-Flash Storage For The Enterprise

All-flash arrays may not be less expensive than hybrid or traditional hard disk arrays on a dollar-per-gigabyte basis ($/GB), but a better cost comparison is dollars per input/output operation per second ($/IOPS). The total cost of ownership and acquisition cost of all-flash solutions can be lowered significantly by intelligent use of storage optimization technologies such as data de-duplication, thin provisioning and compression. According to a study by Wikibon, the total cost of ownership for flash is expected to continue to fall through 2020, with the cost per terabyte dropping from roughly $151 in 2016 to $9 by 2020. All-flash arrays can provide ultra-high-speed performance, enterprise-class availability, reliability and storage efficiencies, with built-in preventive maintenance for cost savings.
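To make the $/GB versus $/IOPS distinction concrete, here is a minimal sketch comparing two hypothetical arrays. All prices, capacities and IOPS figures below are invented for illustration, not drawn from any vendor:

```python
def cost_metrics(price_usd, capacity_gb, iops):
    """Return (dollars per GB, dollars per IOPS) for a storage array."""
    return price_usd / capacity_gb, price_usd / iops

# Hypothetical all-flash array: smaller capacity, far more IOPS.
flash_per_gb, flash_per_iops = cost_metrics(
    price_usd=100_000, capacity_gb=50_000, iops=500_000)

# Hypothetical HDD array: bigger, cheaper per gigabyte, far fewer IOPS.
hdd_per_gb, hdd_per_iops = cost_metrics(
    price_usd=60_000, capacity_gb=200_000, iops=10_000)

print(f"Flash: ${flash_per_gb:.2f}/GB, ${flash_per_iops:.4f}/IOPS")  # $2.00/GB, $0.2000/IOPS
print(f"HDD:   ${hdd_per_gb:.2f}/GB, ${hdd_per_iops:.4f}/IOPS")      # $0.30/GB, $6.0000/IOPS
```

The HDD array wins on $/GB while the flash array wins on $/IOPS by a wide margin, which is exactly why the metric you choose drives the purchase decision.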

All-flash solutions are better suited to high-performance applications, such as transactional databases or virtual server and desktop infrastructures, that can often become bottlenecks. SSD arrays fit environments where low latency and consistently high throughput are required because they can deliver millions of IOPS and sub-millisecond latency in a tiny footprint. Flash memory can speed up complex database operations and provide consistently robust performance with built-in redundancies, while remaining scalable and reliable.

Flash storage can allow a company to do things that were simply not possible with traditional storage, like speeding up its business analytics and potentially creating new business opportunities. SSDs remove bottlenecks by accelerating existing processes and providing quick access to stored data, so businesses can do more in the same amount of time. A move to flash storage can considerably reduce backup windows and decision-making time, while increasing productivity and supporting revenue growth.

Solid-state drives are more durable because they have no moving parts, whereas hard drives depend on spinning disks, so flash is far less prone to mechanical damage and the data loss that can follow. This is a valuable attribute where data protection is concerned. For the same reason, SSDs draw less power than traditional storage solutions, which can translate into substantial savings in energy costs.

Transition To Flash

Your organization should first evaluate its data sets, application behavior and workloads to determine whether its data center would benefit from an all-flash storage system. Storage management of massive disk arrays can be challenging, and more enterprises are transitioning to all-flash storage for time-sensitive, performance-critical applications. Falling SSD prices, along with more quantifiable reports from organizations about the strategic business advantages of all-flash storage, should result in an uptick in all-flash rollouts.

How to Disrupt Customer Thinking For Greater Consideration and Higher Conversions

What would you say if you knew that your potential customers were nearly 60% of the way through the purchase process before they ever engage with sales? A study from the CEB Marketing Leadership Council indicates just that. Customers start their due diligence by looking at third-party websites, consulting colleagues or other customers, and using sources other than the vendor's website. This raises the question: how do you disrupt customers who are already 60% of the way to their purchase decision in order to become an option in their consideration set?

There are really two ways to approach this quandary.  You could pull out all the bells and whistles when the buyer finally decides to engage with you and/or you can find a way to get out in front of them much earlier in the process before they are fully entrenched in their thinking. 

Bells and Whistles
Let’s first look at the bells and whistles.  When your potential customer finally decides to visit your website it will be more important than ever that your website is set up for conversion optimization and that it leads them quickly to the valuable information they seek. 

To accomplish this, you will need strong search engine optimization and to make effective use of your website analytics to better understand visitor behavior on your website. Specifically, you will use tools such as conversion funnels to develop ways to move people powerfully through the buying process on your website, page by page.  Navigation and aesthetics also play a key role in making sure that your website provides the best user experience for your visitors.  These functionalities help cut down on bounce rates and unobtrusively guide potential customers through the purchase process. 

Get Out In Front of It
Spending time optimizing your website for success may not be enough to earn that coveted spot in your target market's consideration set. It's a defensive response to a problem you may not have been aware of. Getting out in front of it is an aggressive, offensive move that puts you in front of the market without them realizing you're proactively seeking them out.

Search engine marketing techniques allow your message to appear throughout the internet on websites with topics relevant to your solution. These are the third-party websites that the CEB Marketing Leadership Council identifies as resources buyers rely on as alternatives to vendor websites. Content marketing also gives you the opportunity to appear on these third-party websites through tactics like content partnerships and reviews. The overarching goal of any content marketing strategy is to influence buyers without directly and overtly selling to them. That makes it a superior tool when you're trying to be in front of the right people at the right time.

Buyers are referring more frequently to colleagues before making purchase decisions, and social media is a great host for customer reviews and testimonials. YouTube is the second-largest search engine and the third most visited website in the world, behind Google and Facebook. Using this knowledge to put the right message in front of your target market gets you into their consideration set earlier in the buying process. These are just a couple of examples of the surprising number of ways to effectively get in front of prospective customers, to disrupt and engage before they are too far along the purchase path.

Stay on Top
It used to be that to get information on a business you had to interact, in some way, shape or form, with a sales rep. Then came websites, which allowed users to get coveted information about their needs from something other than a person. Fast forward to social media and other information aggregation sites joining the buying process, and the result is a steady loss of control for many businesses.

The truth is that as times change, businesses need to stay on top of the changes to the buying process.  Buyers are savvy.  They know that there are a lot of resources at their disposal.  Businesses need to be equally as savvy and constantly be finding new ways to educate their target audience in order to stay front of mind.

We, at GlobalDirective, can help you regain lost footing and get back ahead of your prospects, disrupting their process long enough for you to ‘wow’ them. And in that moment, earn your place in their consideration set and increase your conversion rate.

This is a guest blog written by Michelle Keyser, Director Content and Social Media Marketing, of GlobalDirective.  She is a strategist and blog contributor for the digital marketing agency. 

High Availability and the Increased Need for Uptime

Systems and network uptime have never been more important for organizations. With a global economy that relies heavily on nonstop e-commerce, the appetite for downtime is dropping. Customers as well as business partners and employees expect continuous access to information and applications.

Because of these factors, ensuring high availability is — or certainly should be — a high priority for corporate IT.

“Downtime is no longer acceptable in many cases; it’s not an option,” says Robert Bready, research director at IT research firm Aberdeen Group.

Availability is proving to be critical as greater proportions of a company’s relationships with its customers depend on IT systems and services, says Eric Hanselman, chief analyst at 451 Research. “The toll-free hotline is going the way of the dodo, as mobile apps increase convenience and create tighter bindings with customers.”

The Importance of Being Available to Customers

As the financial services industry and other businesses have seen in the last year, availability is not just a matter of keeping applications running, Hanselman says. “The path to customers now needs to be resilient to denial-of-service attacks, adding a much larger element to the availability calculus.”

While availability is critical to customer perception, many enterprises have a difficult time attaching value to infrastructure investments to improve it, according to Hanselman. Only 13 percent of enterprises listed availability as one of their top two criteria for service providers in 451 Research’s most recent TheInfoPro study.

“This is a contradiction that organizations will have to address to meet customer expectations,” Hanselman says. Availability has to be a key component of application design, “and this extends from data-source management all the way out to service-provider selection.”

“While Active-Active application components and distributed databases are leading technology efforts, basic steps like having and practicing recovery plans are things that many enterprises have yet to accomplish. Capabilities don’t have to be technically sophisticated to be effective.”

What Downtime Costs Organizations

The cost of downtime has been steadily rising. Aberdeen Group surveyed 208 IT professionals in May 2013 and found that the average cost of downtime across organizations of all sizes was more than $163,000 per hour. Large companies experienced hourly losses of more than $600,000, medium companies more than $215,000 and small businesses more than $8,000.
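As a rough sketch of how such figures add up, the snippet below converts an availability target into allowed annual downtime and multiplies by the survey's $163,000-per-hour average. The availability targets themselves are illustrative assumptions, not figures from the survey:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def annual_downtime_hours(availability):
    """Hours of downtime per year permitted by an availability fraction."""
    return (1 - availability) * HOURS_PER_YEAR

AVG_COST_PER_HOUR = 163_000  # survey average cited above

for label, availability in [("99%", 0.99), ("99.9%", 0.999), ("99.99%", 0.9999)]:
    hours = annual_downtime_hours(availability)
    print(f"{label} uptime -> {hours:6.2f} h/yr -> ~${hours * AVG_COST_PER_HOUR:,.0f}/yr")
```

Each extra "nine" of availability cuts the allowed downtime, and the implied cost, by a factor of ten, which is why availability targets are usually discussed in those terms.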

These figures account for losses due to a drop in employee productivity and revenue, Bready says, but they don’t reflect factors such as tarnished corporate brand, reputation and customer attrition.

It’s not just a matter of systems being available; organizations need to ensure that users have fast, high-quality access to critical business applications. As a result, network latency has become increasingly intolerable for companies.

“User expectations of application performance have grown dramatically, even as more applications are delivered over challenging mobile environments,” Hanselman says. “Part of this is due to improvements in network infrastructure, but a growing portion is due to the increasing volume of multimedia traffic, where latency’s effects are much more visible.”

If high-speed access to critical applications is not available, “you may have 5, 10, 100 or 1,000 employees sitting around for an hour or a day,” Bready says. “If your applications are slow you’re not even going to know whether your customers are with you anymore or have left for a competitor. The appetite for latency is nonexistent.”

Strategies to Minimize the Damaging Effects of Downtime

In the event of downtime, how can a company minimize the loss of business and revenue?

“Businesses can minimize the effects of any outage through reasonable planning,” Hanselman says. “Ensuring that error messages give users not only an understanding of the problem but also a way to resolve it is a simple and crucial step in application design. Having a fallback plan will ensure that companies know how to react when failures occur.”

Having alternate, reduced-function websites and alternate providers can keep customers informed and connected, Hanselman says. “Companies should expect failures and practice responding to them.”

Organizations need to have comprehensive plans for both business continuity and disaster recovery, Bready says. And yet, a surprisingly high number of companies do not.

“We find that even though the cost of downtime is so high, and that one of the top pressures organizations are facing is the risk of business interruption, a lot of companies still don’t have a business-continuity plan in place,” Bready says. “We found that only 55 percent of all survey respondents have a disaster-recovery document in place.”

Disaster Recovery and Business Continuity Are One and the Same

No longer can companies just look at traditional data backup as a way to ensure uptime, Bready says. “In the old days, you would [perform] backups daily, weekly or monthly, and if something went down, you would rebuild the server, reload the OS, applications and data and then resume operations.”

That doesn’t work in the current business environment, because it assumes downtime. “People have to think of disaster recovery and business continuity as one; they need to do both,” Bready says. “You need both practices to be in the forefront.”

Key components of any recovery and continuity strategy should include the ability to replicate physical and virtual servers for failover, Bready says. “If you’re involved in e-commerce, you need to replicate [systems] to other areas. Virtualization allows you to do that much more easily than with traditional servers.”
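The failover idea behind that replication can be sketched in a few lines. The replica names and the health check below are hypothetical stand-ins for whatever replication and monitoring a real deployment uses:

```python
def choose_server(replicas, is_healthy):
    """Return the first replica passing the health check, else None."""
    for server in replicas:
        if is_healthy(server):
            return server
    return None  # every replica failed: fall back to the recovery plan

# Example: the primary is down, so traffic fails over to the standby.
replicas = ["primary.example.com", "standby.example.com"]
down = {"primary.example.com"}
print(choose_server(replicas, lambda s: s not in down))  # standby.example.com
```

Real failover systems add health-check timeouts, quorum and state replication, but the core decision, route around the failed node, is this simple.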

Continuous uptime is especially important for large companies in industries such as financial services and e-commerce. But even for smaller companies, in any business, being down for 24 hours can be costly.

“Financial services and online retailers get a lot of the focus in discussions on availability, but the need for greater resilience is just as important in a broad range of other industries,” Hanselman says. “Healthcare and manufacturing have clear requirements, as do any number of enterprises [that] need to maintain availability to provide value. Expectations from customers have grown, and astute businesses are looking to deliver value anytime that they want it.”

[image: Melpomenem/iStock/ThinkStockPhotos]

Don’t Forget The Basics When It Comes To Your Website

How many times have you come across a website that looks nice, yet annoys you with quirks, glitches, or broken links? If you have a website (and who doesn't these days?), then you'll want to do your best to make sure all of your visitors have a pleasant experience. You'll want your site to function smoothly, without errors and annoyances, and to render properly regardless of which browser your visitor might be using. On top of that, a little search engine optimization couldn't hurt either.

Addressing any underlying HTML, CSS, SEO, spelling, and broken-link problems your website may have is a crucial first step toward a first-class, trouble-free website. A good automated website checker can easily find many of these problems, and even offer suggestions and solutions for addressing them. Tools that can check many different aspects of a website with a single click are especially helpful and time-saving. Only after you've checked the underlying HTML and CSS should you move on to checking your site in different browsers, and then to user testing.
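As a toy illustration of what an automated checker does under the hood, the sketch below extracts every link from a page using Python's standard html.parser module; a real checker would then request each URL and flag the ones that fail (and cover far more than links):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<p><a href="/about">About</a> <a href="http://example.com">Ext</a></p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', 'http://example.com']
```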

Also, be sure not to forget about accessibility checking. You'll not only help visitors with vision disabilities; the techniques involved often improve the experience for everyone who accesses your site. Another benefit of an accessible site can be improved search engine optimization. For example, using keywords in appropriate image 'alt' text is another opportunity to let search engines know what searches your website is targeting.
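One such accessibility check is easy to sketch: flag any <img> tag with no alt attribute. This toy example, again using Python's standard html.parser, only illustrates the idea; a full accessibility checker covers many more cases:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt.append(dict(attrs).get("src", "(no src)"))

    # Treat self-closing <img ... /> tags the same way.
    handle_startendtag = handle_starttag

checker = AltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing_alt)  # ['chart.png']
```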

CSE HTML Validator is a tool that can thoroughly check a web page or website for multiple problems. There's a free online version at http://www.OnlineWebCheck.com. Just give it your website's URL and it will check basic HTML and CSS syntax, showing only errors and warnings. There's also a free (for personal and educational use) desktop version at http://www.freehtmlvalidator.com/, which offers basic HTML checking only, and then there's the paid version at http://www.htmlvalidator.com, which offers the most thorough checking, including CSS, SEO, link, and accessibility checking. You don't even have to pay for the professional version if you take advantage of the fully functional free public beta at http://www.htmlvalidator.com/freebeta, which you can use at no cost until January 31, 2013. For business websites, the paid version (or free beta) is highly recommended.

And, for a limited time, save $25 when you buy CSE HTML Validator here, making the standard edition only $44. It’s guaranteed to help you improve your site or your money back (30-day guarantee). Please note that CSE HTML Validator is for Windows only (except for the online version).


What Is Backlinking and Why Is It Important?

Most business owners are familiar with the term search engine optimization (SEO), and they know that ranking in Google, Yahoo! or Bing for non-branded target keywords is the best way for their website to be found online. However, most owners do not understand the technical components that influence search engine results and affect how their website ranks.

There are several factors search engines use in their algorithms when determining search results for your target keywords. One of the most important factors is the number of “backlinks” a website has. Considered a “vote” from another website, backlinks are incoming links to a site or page from another source. Also known as inbound links and part of an “off-page SEO program”, a backlink is essentially a hyperlink pointing from one website to another.

The number of backlinks a website has is one key indicator a search engine's algorithm uses to gauge the importance of a page, while helping that page become associated with your target keywords: the fundamentals of SEO. The strategy used to obtain backlinks is particularly important; done incorrectly, it could actually harm your rankings in the search engines. To build safe links for your site, it's important to understand the four basic components of a successful backlinking campaign:

  1. Link Popularity
    This rule of popularity is based on how many other websites have links on their pages that point to yours. Because popular sites with well-written, unique content naturally attract incoming links, this is a factor the search engines take into account when ranking a website. A page with more quality inbound links will more than likely have a stronger influence on the search engines than a page with no backlinks.
  2. Link Reputation
    Not all incoming links are created equal, and it's vital to obtain backlinks from quality sources. Link reputation determines the quality of your links, based on relevance: how closely related the sites that link to one another are. For example, a general sports website may link to pages on lacrosse or football. These are considered relevant links and will likely have a more positive impact in the search engines' algorithms. However, if the same sports page linked to a site for pets or household goods, the search engines wouldn't consider it relevant and would give that link less weight. After all, Google considers the backlinks to your page the “company” you keep. Make sure you'd be comfortable with a site representing you to the search engines before linking there!
  3. Anchor Text
    Search engines also gauge relevance (and ranking) through anchor text: the clickable words or phrases that link to your web page. Because both human visitors and search engines see this text, anchor text describes the page it links to and therefore should be relevant to your website. To succeed in the search engines, businesses should use a varied set of keywords that compels others to click through to the page. Before visiting the website, people should know what it is about from relevant, descriptive text.
  4. Google PageRank
    Google PageRank is not the “end all, be all” measurement (after all, Larry Page named it after himself); however, it is an effective tool for marketers and webmasters. PageRank is a link analysis algorithm that Google uses to measure the generalized importance of a page. Ranging from zero to ten, it combines link popularity and reputation to decide where a site falls on that scale, zero being the lowest and ten the highest. Websites with a higher PageRank are likely to have more relevant, authoritative inbound links. While Google reads an incoming link from website #1 to website #2 as a “vote” from the first URL for the second, it also takes a close look at the website providing the vote. A website that is considered “popular” provides a stronger vote, making it an ideal place for less popular websites to get backlinks from. This form of endorsement can help lesser-known websites become better known and rank higher.
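To make the “vote” metaphor concrete, here is a toy PageRank computation over a hypothetical three-site link graph, using the classic power-iteration formulation with a 0.85 damping factor. Real rankings combine many more signals; this sketch shows only the link-analysis idea:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            # Each page splits its current rank among the pages it links to.
            share = rank[page] / len(outgoing) if outgoing else 0
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

# "popular" is linked to by both other sites, so it earns the top score.
graph = {"popular": ["niche"], "niche": ["popular"], "newsite": ["popular"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # popular
```

The page receiving the most votes ends up with the highest score, and a vote from a high-ranked page is worth more than one from an obscure page, exactly the endorsement effect described above.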

While there are several elements to backlinking, perhaps the most important is to build incoming links naturally, for human visitors. Before placing a backlink, ask whether it would still be worthwhile if Google didn't exist. The search engines are trying to create a user-friendly experience and usually give more credit to links that are natural and beneficial. Obtaining backlinks this way is one of the best ways to help your website rank high in the search engines.

[image: ThinkStockPhotos]

Google AdWords Express: The Benefits and How It Differs from Traditional Google AdWords Pay-Per-Click

Google AdWords Express is the do-it-yourself version of AdWords. It was designed for the brick-and-mortar small business that doesn’t have a lot of time to invest in PPC, but sees the value of advertising on Google.

How Does AdWords Express Work?

Traditional AdWords Pay-per-Click is almost infinitely customizable. Users can adjust keywords, targeting, bid sizes and make use of the robust analytics. Needless to say, this can be overkill for many small business owners.

With that in mind, Google has designed the simpler, stripped-down AdWords Express service. Once a business has a Google+ Local Page (formerly called a Google Places Page), they can go in and design a campaign. Google even suggests keywords based on the type of business. Because it’s integrated with Google+ Local, targeting is built in. It doesn’t matter if a searcher Googles “skateboard shop in Richmond,” or happens to be in Richmond when Googling “skateboard shops,” Google treats the queries the same, as far as AdWords Express is concerned.

  • Unlike traditional AdWords, there’s no bidding on keywords. A business simply selects their monthly budget, and Google will do the rest.
  • Don’t forget to set up Google Analytics to properly track metrics and conversions.
  • Make sure your customers post plenty of positive reviews, as these are one of the first things users see when your ad appears.

The Bottom Line

AdWords Express certainly does what it says it will in terms of simplifying the learning curve for small business owners, and is very effective at leveraging local search. However, the level of customization is extremely limited, and sophisticated users will probably move to traditional AdWords. That said, if you are a small business owner with limited time to manage online marketing, AdWords Express might be a great marketing investment.

Over the next few months, Google is making a lot of changes to improve its local listings experience and bring the community, identity, sharing and relationships of Google+ to local businesses. To stay up to date you should join their newsletter here, and to learn more about Google AdWords or Google AdWords Express PPC you can contact our search engine marketing partner, Conversion Pipeline.

How to Use Facebook for Business – An Introductory Guide

At this point, you have heard of social media and inbound marketing. Maybe you’ve experimented with Twitter and checked out your kids’ Facebook profiles, and you can see the value for college students who want to make sure they’re all at the same bar on Saturday night. But why does any of this matter to you or your business?

Social media and inbound marketing are increasingly important assets for businesses to get found by and engage with potential buyers on the web. Think about the way you find information about products and services – are you watching TV ads? Going through your junk mail? Or are you consulting a search engine or a friend? People have gotten better and better at ignoring marketing messages with DVRs, caller ID, and spam filters. Instead they visit Google and social networks for answers to their questions. The question for you is, will you be there to answer it?

Facebook is not an evil time-waster, a community just for younger generations, nor is it irrelevant for marketers – even B2B folks. Rather, Facebook is a tool for connecting people with those around them. And, as with any social media tool, marketers have an opportunity to use Facebook to expand their online footprint and directly engage with customers and prospects.

Download eBook Now!


[image: ThinkStockPhotos]

Is Google becoming an ISP (Internet Service Provider)?

On July 26th, Google rolled out Google Fiber, a new broadband service in Kansas City that boasts speeds of 1,000 Mbps, or 1 Gbps. If speeds like that boggle your mind, they should. That's fast enough to download an average movie in a few minutes, and about 100 times faster than the average American currently enjoys.

To make this offer even more appealing, the associated costs appear pretty low. The Google Fiber website offers three plans, the least expensive of which costs zero dollars for roughly what you're probably paying through the nose for right now. This plan will be available for seven years; Google calls it “future-proofing” homes, because there's more content available online every day, and it's betting that eventually consumers will want to upgrade. They're probably right. This free internet plan includes up to 5 Mbps download and 1 Mbps upload speeds, no data caps, the network box and free service guaranteed for at least seven years.

The $70-per-month plan offers gigabit internet, which includes near-instantaneous viewing and file sharing, high-powered Wi-Fi and four Gigabit Ethernet ports. For just $50 more ($120 per month) you get all that plus a wide selection of channels, including HD, a free 16GB Nexus tablet to use as a remote, and a 2TB storage box (in addition to the 1TB accessible through Google Drive, Google's file-sharing service).

Google is starting to look a lot like an internet service provider, the ISP to rule all ISPs. They're taking a playful approach to all of this: people can sign up for the service with only a $10 deposit, and locations called “fiberhoods” will be built out based on the number of signups in each area. Cute names like “fiberhoods” and a psychedelic-colored bunny mascot are fun, but this is a serious threat to companies like Comcast, Verizon and Cox, especially since all of Google's services are uncapped, which will strike fear into the hearts of the competition.

Google Fiber is still a pilot project for now, and Google has been quiet about expansion plans in other cities; however, we'd guess it's only a matter of time, since the target is to have half the subscribers in Kansas City up and running by late 2013. As they expand, cynics' voices will grow louder; many claim Google Fiber is a ploy to gain further information about users: where we go online, what kinds of things we buy, where we take our Android phones. Some people don't have a problem with this, and, like Google, feel it helps the search and advertising giant better serve them. Others suggest it's merely a way to flex their muscle in Washington, which has vowed to improve broadband penetration but so far has shown little progress.

Ars Technica quotes Forrester analyst James McQuivey saying: “No one will redesign a global business because the economics of it have changed in one city. Unless Google has recently announced plans to roll out in 10 cities, this is just a really neat thing for Kansas City that every industry will closely watch in case it has the potential to spread.”

Let’s hope it spreads like wildfire.

[image: ThinkStockPhotos]

World IPv6 Launch: What Is IPv6 and How Will It Affect You?

You may have heard in recent months the panicked cry from Internet Service Providers and Web hosting companies, “We’re running out of IP addresses!”, and like most of us you probably thought “oh well” and went back to surfing the web, not paying much attention. However, when explained a different way, such as, “The Internet is full and closed to new business”, that got our attention.

What’s All This IP Stuff About?

IP stands for Internet Protocol, and the previous version, IPv4, provided the world with around 4 billion IP addresses. That would seem like plenty, wouldn't it? However, when every node of an Internet Protocol (IP) network, such as your computer, a router or a network printer, is assigned its own IP address, all those numbers get used up fairly quickly. Even with ISPs using techniques such as Network Address Translation (NAT) to put hundreds of devices behind a single public IP address, we still managed to run through them all.
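The NAT trick mentioned above boils down to a translation table in the router: many private addresses share one public address, distinguished by port number. Here is a minimal, hypothetical Python sketch of that table (real NAT lives in router firmware and handles far more state; the addresses and port range are illustrative):

```python
import ipaddress

# A documentation-range address standing in for the router's one public IP.
PUBLIC_IP = ipaddress.IPv4Address("203.0.113.5")

class NatTable:
    """Toy port-based NAT: maps (private IP, port) pairs to public ports."""

    def __init__(self):
        self._next_port = 40000          # arbitrary starting public port
        self._map = {}                   # (private_ip, private_port) -> public_port

    def translate_out(self, private_ip, private_port):
        """Return the (public IP, public port) used for this private endpoint."""
        key = (private_ip, private_port)
        if key not in self._map:         # first time we see this host:port
            self._map[key] = self._next_port
            self._next_port += 1
        return (str(PUBLIC_IP), self._map[key])

nat = NatTable()
print(nat.translate_out("192.168.1.10", 51000))  # ('203.0.113.5', 40000)
print(nat.translate_out("192.168.1.11", 51000))  # ('203.0.113.5', 40001)
```

Two different private hosts end up sharing one public address, which is exactly how ISPs stretched the IPv4 pool for as long as they did.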

Enter IPV6.

Clearly the Internet needs more IP addresses. How many more, exactly? Well, how about 340 trillion trillion trillion (or 340,000,000,000,000,000,000,000,000,000,000,000,000)? That's how many addresses the Internet's new "piping," IPv6, can handle. That's a number big enough to give everyone on Earth their own list of billions of IP addresses. Big enough, in other words, to offer the Internet virtually infinite room to grow, from now into the foreseeable future.
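The arithmetic behind those numbers is simple: IPv4 addresses are 32 bits long and IPv6 addresses are 128 bits, so the totals are 2^32 and 2^128. You can check the scale yourself in a couple of lines of Python:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128

print(ipv4_total)   # 4294967296 -> roughly 4.3 billion
print(ipv6_total)   # 340282366920938463463374607431768211456 -> about 3.4e38

# Addresses available per person, assuming a world population near 8 billion:
print(ipv6_total // 8_000_000_000)
```

That last figure works out to tens of billions of billions of billions of addresses per person, which is where the "everyone gets billions of addresses" claim comes from.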

So what is the problem then, you ask? As of last year, only about 12% of Internet-connected networks supported IPv6. To get IPv6 adoption moving, the Internet Society created the World IPv6 Day and World IPv6 Launch initiatives.

World IPv6 Launch

World IPv6 Day was held June 8, 2011, and represented a massive test of IPv6. Facebook reportedly called the results "encouraging" and left IPv6 enabled permanently. Over the next year, all the participants worked to iron out the kinks in preparation for the World IPv6 Launch.

World IPv6 Launch was held on June 6, 2012, and represented a commitment on the part of hundreds of Internet companies, including Facebook, Google, Yahoo, Cisco, Comcast and various ISPs, to rolling out IPv6 permanently, allowing for full integration by websites and users.

By all accounts it seems to have been a rousing success, in that the Internet didn't break and our web surfing went on as normal. The Internet's address space grew enormously, and the transition was seamless.

How Will This Affect Me?

This won't affect you much, and IPv4 services will continue to operate as usual. There aren't any upgrades to install or new hardware to buy; we can sit back and continue surfing the web like we always do, blissfully unaware of how close we came to running out of IP addresses.

In fact, you may be using IPv6 already; visit ipv6test.google.com to find out. Many devices you own already support IPv6; however, the websites you visit and your Internet Service Provider must both enable IPv6 before you can use it.
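If you'd rather poke at this from the command line than a website, Python's standard `socket` module can tell you whether your machine is IPv6-capable and whether a hostname publishes an IPv6 (AAAA) record. This is only a quick local sketch; sites like the test page above do a fuller end-to-end check:

```python
import socket

# Was this Python build / operating system compiled with IPv6 support?
print("IPv6 support compiled in:", socket.has_ipv6)

def has_ipv6_address(hostname):
    """Return True if the hostname resolves to at least one IPv6 address.

    Requires working DNS; returns False when resolution fails.
    """
    try:
        infos = socket.getaddrinfo(hostname, None, socket.AF_INET6)
        return len(infos) > 0
    except socket.gaierror:
        return False

# Example (the result depends on your resolver and the site you test):
# print(has_ipv6_address("ipv6.google.com"))
```

Note that a `True` result only means the name resolves to an IPv6 address; actually reaching it over IPv6 still depends on your ISP having enabled it.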

Is Lunarpages Internet Solutions IPv6 Ready?

Yes we are. In fact we were ready for this change in October 2011.

“Lunarpages Internet Solutions is an award-winning and technologically driven hosting company with a commitment to customer service at its core. It makes sense, then, that Lunarpages is already fully IPv6 ready and has made the technological commitment to remain ahead of the game for the benefit of all our customers – current and future.”