
PCI DSS and the End of SSL

  • Business Talk
  • Security

PCI DSS

  • What is PCI DSS?
  • Requirements for Compliance
  • SSL Now Unacceptable Security for PCI DSS
  • Your Solid Compliance Partner

What is PCI DSS?

The Payment Card Industry Data Security Standard (PCI DSS) is the most widely recognized standard for organizations that take payments online via credit and debit cards. It was developed collaboratively in 2004 by American Express, Discover, MasterCard, and Visa. The real incentive for those companies was to prevent online fraud that could threaten people’s ability to use payment cards safely on the Internet, but PCI has the added benefit of making identity theft less likely for consumers.

Requirements for Compliance

There are six basic intentions of the PCI standards, all of which seek to strike a balance between strong security and convenience – with primary focus on the former.

  1. The network should be completely secure.

Firewalls should be deployed. In the case of wireless LANs, specially designed firewalls should be used, since those networks are at high risk of spying and breaches by cybercriminals.

“In addition, authentication data such as personal identification numbers (PINs) and passwords must not involve defaults supplied by the vendors,” explained tech writer Margaret Rouse. “Customers should be able to conveniently and frequently change such data.”

  2. Payment data should be safeguarded in storage.

All data that relates to individuals should be safeguarded against unauthorized access in storage. This type of data includes birthdates, Social Security numbers, answers to password retrieval questions, ZIP Codes, and phone numbers. It should be encrypted using established industry standards prior to transmission.
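
The encryption requirement is easier to picture with a small example. Below is a minimal sketch, assuming Python and the third-party cryptography package (any vetted library implementing an established standard such as AES would do); the sample record and in-script key are purely illustrative, and a real deployment would pull keys from a hardware security module or a managed key store.

```python
# Minimal sketch: encrypt a sensitive record before writing it to storage.
# Assumes the "cryptography" package (pip install cryptography) is available.
from cryptography.fernet import Fernet

# Illustrative only: production keys must come from a secure key store,
# never be generated and kept inside application code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"4111111111111111|2026-01|Jane Doe"   # Visa test number, sample data
token = cipher.encrypt(record)                  # ciphertext safe to persist
assert cipher.decrypt(token) == record          # readable only with the key
```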

  3. Anti-malware definitions should be kept current.

Anti-malware applications, such as programs defending against viruses and spyware, should be updated whenever new patches become available. In that manner, the software can be kept safe from bugs and weaknesses that could lead to compromise.

  4. Access to the data should be limited.

Companies should only request information from consumers on a “need to know” basis – in other words, when it’s necessary to conduct the transaction or verify identity. Each user of the system should have their own ID name or account number, and both hardcopy and digital protection should be maintained.

“Examples include the use of document shredders, avoidance of unnecessary paper document duplication, and locks and chains on dumpsters to discourage criminals who would otherwise rummage through the trash,” said Rouse.

  5. Networks should be monitored and reviewed.

Monitoring of the network should be continual, with periodic vulnerability assessments to confirm that security mechanisms are working adequately. No security software should ever be outdated. Anti-malware software must check incoming and outgoing information, applications, memory, and storage devices often – persistently if possible, to meet the persistent threat of malicious hackers.
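
As one small, hedged illustration of what a periodic check can look like, the sketch below (plain Python, standard library only) flags listening ports on a host that are not on an approved list; the host, the candidate ports, and the allowed set are assumptions made for the example, not a full vulnerability scan.

```python
# Minimal sketch of one slice of a periodic check: flag listening ports
# that are not on an approved list. Host, candidate ports, and the allowed
# set are illustrative assumptions.
import socket

ALLOWED = {443}                          # e.g., only HTTPS should be exposed
CANDIDATES = [21, 22, 23, 80, 443, 3306]

def open_ports(host, ports, timeout=0.5):
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                found.append(port)
    return found

unexpected = [p for p in open_ports("127.0.0.1", CANDIDATES) if p not in ALLOWED]
if unexpected:
    print("Unexpected open ports to review:", unexpected)
```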

  6. A policy for data security should be developed.

Organizations must strictly enforce the policy, with sanctions when its parameters are ignored.

SSL Now Unacceptable Security for PCI DSS

PCI DSS 3.0 was retired on June 30, 2015. That’s major news because of what’s replacing it.

PCI DSS 3.1 was released in April after multiple devastating vulnerabilities were found in open source SSL/TLS implementations such as OpenSSL. In the new version, Secure Sockets Layer (SSL) and early versions of Transport Layer Security (TLS) are no longer acceptable for safeguarding payment information.

SSL is now considered unacceptable for new deployments. For systems that are currently deployed, companies have until June 30, 2016, to upgrade.

“The update means online merchants will have to switch off SSL in web servers and support the latest version of the Transport Layer Security protocol,” said info security journalist Phil Muncaster. “Bricks and mortar stores will also need to pay attention, especially if they have any payment apps using SSL that may need updating.”

The National Institute of Standards and Technology (NIST), the nonregulatory body that establishes standards for the US government, has instructed all agencies to switch to TLS 1.2.
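
For services you control directly, raising the floor can be as simple as pinning the minimum protocol version. The sketch below shows the idea in Python 3.7+; web servers such as Apache and nginx expose equivalent settings, and the certificate paths in the comment are placeholders.

```python
# Minimal sketch: refuse SSL and early TLS in a Python-based service.
# Requires Python 3.7+ for ssl.TLSVersion; certificate paths are placeholders.
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2   # rejects anything below TLS 1.2
# context.load_cert_chain("server.crt", "server.key")  # supply a real cert/key in practice
```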

The rapid-fire release of this new version of PCI DSS, just months after version 3.0 became mandatory on January 1, shows how grave the exploit potential within SSL currently is.

It is now possible with outdated, less sophisticated security technologies such as SSL for attackers to access interactions between client and server, according to Venafi security VP Kevin Bocek. “[O]rganizations must identify use of SSL/TLS, plan a remediation strategy and move to the secure protocols, encrypt data before transmission, or apply additional layers of transmission security that are not vulnerable, such as IPSEC,” he explained.

Bocek additionally commented that both IT vendors and general businesses should know that this quick adaptation is the new normal. They should be prepared to adapt rapidly when any further exploits are discovered in the months and years ahead.

Your Solid Compliance Partner

The security landscape continues to evolve, and recent high-profile hacks (such as the multiple-month invasions of Anthem, Sony Pictures, and the US State Department) reveal the complexity of the advanced persistent threat. You want a cloud provider that goes above and beyond PCI DSS standards.

Furthermore, we all want more than just security: we want cloud performance to meet strict standards as well. That’s why all our cloud servers are Passmark-rated.

Do you want a cloud VM that typically delivers four times the performance of virtual servers with similar specs from AWS and SoftLayer? Then make your cloud Superb.

By Kent Roberts

How Data Scientists Can Turn Your Big Data into Marketing Magic

  • General
  • Technology

Big Data

  • Data Rapidly Takes Over the Earth
  • The Science and Art of Data
  • The Customer is Sooooo Right
  • Listening with Apps and People
  • Big Data Magic Tricks

Data Rapidly Takes Over the Earth

These days, the data is large and in charge. It grows and grows, and it can certainly give us extraordinarily valuable insights. Big data is allowing for more intelligent choices, better efficiency, stronger interaction with customers, disease prevention, and stock market predictive models. Some day big data analytics could replace us behind the steering wheel, and it can already trump us in the Daily Double.

Business now has an extraordinary amount of information. Data is so prevalent that it is almost overwhelming. Look at it this way: the rate at which data will be created in 2020 is expected to be 44 times what it was in 2009. How do we turn all that data into meaningful insights?

The Science and Art of Data

Analytic tools are becoming more sophisticated, but businesses that want skilled data scientists should open their safes and get ready to start handing over some dough. Smart businesses will invest in both technology and human capital so they can access the right data, gathered and sorted for maximum impact. You don’t just need insights; you need the capacity to explain what you found succinctly and engagingly. What’s your data story? You actually want your data analyst to be both a scientist and an artist. Combine the two, and you have marketing magic.

McKinsey forecasts that in just three years, “the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.”

Is your office looking for a data scientist? The Oval Office already has one. Early in 2015, the Obama administration welcomed the newly appointed US Chief Data Scientist. Also this year, the University of Rochester created its own Institute for Data Science. A poll by recruitment agency Burtch Works confirms the obvious: more organizations want data scientists than ever before, and they’re willing to pay big bucks to get them.

The first place that you want a data scientist to specialize is marketing, argues Brian Kardon, CMO of marketing analytics company Lattice Engines: “In marketing,” he says, “effectively using data to understand customers and predict buying behavior can make the difference between a winning customer experience and a failing one that lives forever on social channels.”

The Customer is Sooooo Right

Okay, customers are often wrong. However, increasingly sophisticated digital environments mean that they are more empowered than ever before to find solutions that match their expectations:

  • 57% of buying is individual research
  • 70% of the sales funnel precedes the salesperson

It’s all about the content, and it’s critical that the content is customized (as research has proven repeatedly). Data analytics will allow you to find your ideal prospects and customize for them in the most effective ways.

“Big data is perhaps the most important way to create a user experience that treats customers in the way that they want to be treated – like individuals,” says Distilled outreach director Adria Saracino. “[I]f you listen properly, your data will tell you vital information about your customers and clients.”

Listening with Apps and People

Clearly you want to leverage automation with predictive apps. Software helps to filter your leads and gives you immediate intelligence. That intelligence is meaningless without expertise on staff, though.

A well-trained and thoughtful data scientist can straddle the line between IT and business. They should be grounded in math, of course, so that they are aware of all elements of the analytic algorithms. Their background should help them tweak equations to fine-tune campaigns and build sales.

Data scientists live in the land of intent data, behavioral data, and fit data:

  • Intent data – keywords and website tracking
  • Behavioral data – content and emails users access
  • Fit data – characteristics of company and credit score

Integrating these three elements creates incredible customer profiles.
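
As a toy illustration of that integration, the sketch below merges the three data types into a single profile keyed on a shared account identifier; the field names and values are invented for the example and do not reflect any particular vendor’s schema.

```python
# Minimal sketch: fold intent, behavioral, and fit data into one profile
# per account. All field names and values are illustrative assumptions.
from collections import defaultdict

intent = {"acme.com": {"keywords": ["cloud hosting", "pci compliance"]}}
behavioral = {"acme.com": {"emails_opened": 4, "whitepapers_downloaded": 2}}
fit = {"acme.com": {"employees": 250, "industry": "healthcare"}}

profiles = defaultdict(dict)
for source in (intent, behavioral, fit):
    for account, attributes in source.items():
        profiles[account].update(attributes)

print(profiles["acme.com"])
# {'keywords': [...], 'emails_opened': 4, 'whitepapers_downloaded': 2,
#  'employees': 250, 'industry': 'healthcare'}
```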

Data scientists can also devise and interpret strong split testing for better results.
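
To show what “interpreting” a split test can involve, here is a minimal sketch of a two-proportion z-test using only the Python standard library; the visitor and conversion counts are made up for the example.

```python
# Minimal sketch: judge whether variant B really beats variant A with a
# two-proportion z-test. The counts below are invented example numbers.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")   # a small p-value suggests a real lift
```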

Again, according to Kardon, you need both the software and the human touch to create marketing magic. “Without technology, we would lack the clouds of data that inform outreach decisions,” he says. “Without data science, we would lack the insights that escape algorithmic automation, and the ability to translate data insights into effective decisions.”

Big Data Magic Tricks

As big data becomes more fundamental to business success, you need to hire someone with a cape and a magic wand: a data scientist (data magician?). You also need the technology to set the stage for that individual to perform.

Our cloud stage offers major performance and reliability (true 100% HA) advantages over others that use mainframe-era centralized storage and inferior, Ethernet-based networking technology.

By Kent Roberts

How HIPAA Law Applies to Mental Health Records

  • General

Health Care IT

  • HIPAA Guidelines on Mental Health
  • Congressional Bills Currently Under Discussion
  • Maintaining HIPAA Compliant Technology

HIPAA compliance is fundamental for healthcare companies and the business associates who handle their patient information, but mental health is a special case.

Specific considerations for mental health include:

  • Determining what is acceptable in terms of informing caregivers and family
  • Figuring out how records can be transferred to other practices
  • Understanding whether sharing between providers is allowable

HIPAA Guidelines on Mental Health

HIPAA discusses the requirements of covered entities related to mental health records. This information is often especially sensitive compared to other health data.

The federal regulations are designed to maintain the security and privacy of information while also allowing transparency when appropriate. The HIPAA Privacy Rule states that covered entities can discuss mental health issues directly with loved ones and professionals caring for the patient.

According to the mental-health guidance on the HIPAA site, mental health facilities can ask the patient if it is okay to communicate particular details to people involved in their daily care. Doctors can also inform the patient that they are going to make disclosures, allowing them a chance to say that it’s unacceptable. Another option is that they can “infer from the circumstances, using professional judgment, that the patient does not object,” states the HHS guidelines. “A common example of the latter would be situations in which a family member or friend is invited by the patient and present in the treatment room with the patient and the provider when a disclosure is made.”

It’s also permissible for a covered entity to reveal mental-health details when the patient is absent or unconscious – as long as the patient’s best interests remain the top priority. Also, it’s only acceptable to relay information that specifically applies to the third party’s involvement in billing or the care continuum.

Now, for the most part, mental health and physical health are treated the same under the law. But notes from mental therapy sessions are given specific focus in the regulations.

Notes taken during psychotherapy sessions are considered to be in a different category from the rest of mental health records for two reasons, according to the HHS: “both because they contain particularly sensitive information and because they are the personal notes of the therapist that typically are not required or useful for treatment, payment, or health care operations purposes, other than by the mental health professional who created the notes.”

Healthcare organizations have to get authorization signed for any disclosure of these notes – except for cases in which other laws demand their disclosure, as when the patient admits to harming someone or makes threatening statements. The specific rule is that healthcare providers generally can’t give any information to relatives or caregivers that the patient does not want them to have, but they can do so if there is an imminent threat that third parties might help to mitigate.

Similarly, it’s okay for covered entities to listen to the perspectives of family and others to better design treatment plans. Notes related to those additional perspectives are protected within the law as well.

“[A]ny information disclosed to the provider by another person who is not a health care provider that was given under a promise of confidentiality … may be withheld from the patient if the disclosure would be reasonably likely to reveal the source of the information,” states the HHS.

Congressional Bills Currently Under Discussion

As of June 2015, two bills being discussed in the US Congress could change how mental health information is handled. First, the Including Families in Mental Health Recovery Act is intended to clarify when it is acceptable to share information and provides $30 million in funding to train relevant parties on what sharing HIPAA allows.

Second, the proposed objective of the Helping Families in Mental Health Crisis Act is to clarify that it’s okay to share protected health information when specific parameters are met:

  • The information that is disclosed is related specifically to diagnosis; courses of treatment; setting dates for visits; or pharmaceuticals and directions for their use. No mental therapy notes can be included.
  • Sharing the information is critical to safeguard the patient or others.
  • Sharing will be helpful in the care the patient receives for additional illnesses.
  • It will improve consistency and quality throughout the care continuum.
  • Failing to disclose the details could lead to additional health problems.
  • Due to the mental condition, the patient is unable to grasp the doctor’s instructions or could suffer disability if the treatment isn’t completed.

Keep in mind that, as of this writing, both of the above are still moving their way through the congressional process.

“Until any changes are made final,” explained HIT writer Elizabeth Snell, “healthcare providers must keep themselves educated on how to maintain HIPAA compliance while caring for patients being treated for mental health conditions.”

Maintaining HIPAA Compliant Technology

Obviously much of the concern about mental health, as discussed above, has to do with whether it’s acceptable to communicate information to additional parties. Determining which parties have access to information is just one element of the law. The core concern with the information itself is the security of IT systems.

That’s our specialty. Our HIPAA compliance-ready solutions provide secure cloud and data center hosting practices to help healthcare providers achieve HIPAA compliance.

By Kent Roberts

IT Analyst: The Cloud May Not Exist

  • Cloud
  • General

Cloud Question

What is “the cloud”? Well, it’s actually thousands of technological infrastructures, each designed and managed by individual companies. One IT analyst thinks that the notion of a single cloud is ill-conceived.

  • “The Cloud,” or a Cheap Imitation?
  • Cloud Snowflake Syndrome: Every Cloud is Very, Very Special
  • Careful Review of Outsourcing & Cloud
  • Cloud Choice is Essential

“The Cloud,” or a Cheap Imitation?

What’s wrong with “the cloud”? It’s easy to see the benefits in terms of expense and speed. However, the marketing nonsense, where everything is called “the cloud” when it’s actually an individual cloud that is being referenced, gets old fast. In fact, in some cases, “the cloud” is hardly a cloud at all. Take, for instance, AWS – is that “the cloud”? If it is, then why is a guy who built a $200 million cloud startup calling AWS’s monolithic cloud “irrelevant” and “weak”?

IT analyst Stanton Jones of benchmarking consultancy ISG says that we should start saying “a cloud” instead. That shift would repair the damage done by letting every corporation with big marketing dollars claim it is selling us the cloud, when the product is often a cheap imitation that doesn’t deserve the name.

The problem is that this difficult-to-pinpoint technology has become a cure-all. What will solve all legacy IT problems, including cost, complexity, and speed? The cloud will. Right, that answer seems fine until you look at how vast the cloud computing industry is and how unstandardized it is.

“Depending on the cloud that buyers choose, the services they’ll receive are a combination of outsourcing and insourcing, with some traditional software procurement mixed in,” added Jones. “Either way, it will likely look very different from traditional on-premises IT or traditional ITO.”

Cloud Snowflake Syndrome: Every Cloud is Very, Very Special

In the current ecosystem, there are pluses and minuses. Due to the general lack of standardization, we have a bunch of different clouds, and it’s hard to tell what’s what. The positive is that each cloud tries to stand out through innovation and differentiation.

Who knows, though. We could have an interoperable, open-cloud future. There are two ways this can be achieved. One is through cloud standardization that would allow workloads to pass seamlessly from one cloud to another.

The other approach is “attacking this problem from an application point of view,” explained Jones. “Docker, an open-source initiative that creates ‘containers’ for applications, hopes to solve this problem by creating application portability across environments.”
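
As a hedged illustration of that portability idea, the sketch below uses the Docker SDK for Python (assuming it is installed and a Docker engine is running locally) to run a small public image; the same image would run unchanged on any other host with Docker.

```python
# Minimal sketch of container portability: the same image runs unchanged
# wherever a Docker engine is available. Assumes the "docker" SDK for Python
# is installed and a local Docker daemon is running; "alpine" is a small
# public image chosen purely for illustration.
import docker

client = docker.from_env()
output = client.containers.run("alpine", ["echo", "same app, any cloud"], remove=True)
print(output.decode().strip())
```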

Although it sounds as if cloud technology is moving in the right direction in some ways, portability becomes less of a priority as you work your way up the cloud stack from infrastructure to platform to software. Overall, SaaS pulls in much more revenue than IaaS does. Cloud software companies aren’t as focused on interoperability, instead creating customized mobile and analytic functionalities that only work within their own systems.

Cloud Snowflake Syndrome is confusing for those buying ITO. Many buyers just assume it’s all the same. If it were all the same, you might as well go with the lowest-priced option that is easiest to adopt and has the strongest support staff. But it’s not all the same.

Careful Review of Outsourcing & Cloud

That’s only part of the story. If you want to approach the situation intelligently, also consider these three differences between companies that provide outsourcing and companies that provide cloud.

#1 – Cost

Choosing a particular cloud for your infrastructure does not necessarily mean you will spend less money.

“For public cloud IaaS, cost savings depend heavily on designing an appropriate infrastructure configuration for the application and the usage profile of this configuration,” said Jones. “The potential for savings is unique to each provider and how the buyer chooses to use the platform.”

#2 – Standards

Outsourcing providers typically base their systems on universally accepted industry standards. Although there are certain standards that cloud providers use, such as SSAE 16 to verify the security of their data centers, it’s really much more of a jungle (hence the importance of our Passmark-rated servers, as discussed below, which blow AWS out of the water). It’s typically difficult not to overpay when shopping for cloud infrastructure. Similarly, software clouds typically can’t go below a certain number of users once they are signed on. Cloud choice is paramount.

#3 – Management

Third, the routine management provided by an extensive staff at outsourcing providers is no longer necessary with the sophistication of automation allowed by cloud architectures. APIs talk with one another and with clients. Clearly support still matters, so it’s a factor to consider in your cloud choice, but there is also significant disparity among the APIs that are used.

“The robustness and maturity of these APIs varies wildly from cloud provider to cloud provider,” commented Jones. “Again, each cloud is unique, therefore, the way that buyers interact with each cloud provider will be unique as well.”

Cloud Choice is Essential

As established above, calling cloud technology “the cloud” creates confusion and allows cheap imitations to be taken seriously in the marketplace.

You want standards? Our systems meet numerous international standards and compliance guidelines, and all of our cloud servers are Passmark-rated.

Plus, our cloud is simply better than the competition, with performance that is typically 4 times higher than Amazon and SoftLayer.

By Kent Roberts

 

How Open Government Data and Cloud Computing Create Value

  • Cloud
  • What's New?

Saving Money

  • Innovations in Power and Information
  • Better Insights
  • The Human Side of Technology
  • Continuing the Push
  • The Right Cloud

Innovations in Power and Information

In recent years, the federal government has moved to adopt open data and cloud technology. Open data makes life easier for government offices: data access is more affordable, and publishing information for the public is simple. Cloud computing renders the costs of IT infrastructure more manageable and creates an environment within which big data analytics can allow agencies to technologically address complex issues.

“Cloud computing and open data take two previously costly inputs—computing power and information—and make them dramatically cheaper,” explained Center for Data Innovation analyst Joshua New. “Government agencies invest large amounts of capital and time to build and manage their own data centers and IT infrastructure.”

In other words, the cloud makes planning for the future dramatically more flexible. When only traditional IT was available, it was necessary to establish capacity by predicting how many resources would be needed in the coming months and years. Changing the capacity was complicated, so organizations ended up setting up systems with extra resources so that they would not run into a wall.

The cloud makes it possible for the federal government to adjust as it goes, scaling up and down in tune with demand, which is both more energy-efficient and more cost-effective. And since government offices now have to make data available on the Internet in machine-readable formats, the public sector has drastically cut the time it spends moving data out and taking it in: outgoing data no longer has to be transferred case by case, and incoming data no longer has to be requested, because it is immediately accessible.
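
The capacity-planning contrast can be made concrete with a toy sketch: instead of provisioning for a worst-case forecast, an elastic system picks its size from observed load. The thresholds and numbers below are illustrative assumptions, not any agency’s actual policy.

```python
# Toy sketch of the scale-with-demand idea: derive the instance count from
# observed load rather than a long-range forecast. Numbers are illustrative.
import math

def desired_instances(requests_per_sec, capacity_per_instance=200.0, minimum=2):
    return max(minimum, math.ceil(requests_per_sec / capacity_per_instance))

for load in (150, 900, 4200):
    print(f"{load} req/s -> {desired_instances(load)} instances")
```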

Better Insights

Since both open data and cloud computing enhance the government’s information-sharing capabilities, these two technologies have made it easier to work with the data and learn from it.

One great example is the Consumer Sentinel Network. The network, managed by the Federal Trade Commission, is a coalition of dozens of governmental agencies at all levels. They share information from individual complaints about companies, covering issues such as fraudulent telemarketing offers, do-not-call violations, and credit scams.

Cloud systems don’t just make intergovernmental sharing easier; they also make it easier for the public sector to share information with businesses and American citizens.

One example of a major cloud migration is being conducted by the National Oceanic and Atmospheric Administration (NOAA). The agency expects its total data storage to increase by 90,000 TB annually beginning in 2020, and that data could not be publicly available without the cloud.

“NOAA expects the scalability and ease of deployment of these cloud solutions will help reduce the bottleneck effect that limited government IT infrastructure can have on organizations and businesses that rely on government data,” said New.

The Human Side of Technology

With the increase in data capacity, governmental offices can improve their position in the competition for tech talent. After all, private industry often outpaces the public sector because business has historically been able to pay more than the government for individuals who are highly skilled with data. There are relatively few people who are experts in data science, and every organization wants the top people so that it can benefit from processing its data in meaningful ways.

With cloud computing and open data, the public sector is better able to compete. In 2014, the General Services Administration created 18F, an office charged with enhancing federal IT services.

“18F hosts the competitive Presidential Innovation Fellows program,” New commented, “which attracts highly skilled technologists to improve government services with open data, such as making education [more] accessible and improving opportunities for private sector entrepreneurs.”

The US government has also been trying to connect better with top talent through conferences and events, such as Health Datapalooza and hacking events geared toward finding solutions for public challenges. The 2014 National Day of Civic Hacking addressed more than three dozen national and international problems by leveraging open data.

Continuing the Push

As you can see, both open data and cloud have much to offer the public sector and the American people. Many federal systems have been migrated at this point.

“The US Government is spending a considerable amount of its budget on cloud services,” said technology journalist David Hamilton. “US agencies are expected to spend about US$3 billion on cloud projects in fiscal 2014 (which began October 1, 2013), which is around $800 million more than officials predicted in 2013.”

However, the transition to cloud is still far from complete. Many agencies have only transferred their email and storage systems at this point, for instance. Two initiatives, Cloud First and the Open Government Directive, have proven incredibly beneficial, but the benefits will multiply as these technologies continue to see broader use.

The Right Cloud

Cloud computing has many advantages for government and business. However, it’s important to remember that cloud technology is not uniform. You want a cloud service provider that offers Passmark-rated performance. Passmark is the only objective comparison you can use to determine actual CPU performance (since gigahertz and other specs aren’t comparable across CPU generations).

Spin up your Passmark-rated cloud VM today.

By Kent Roberts

Retooling the IT Resume: When to Throw Out the Rulebook

  • Business Talk

Job Interview Prep

Everyone knows that when you write a resume, you want everything to move back in time from the present. Is that always the way it should be done, though? No, as one client of resume revamp specialist Donald Burns demonstrates.

  • Tossing the Rulebook
  • Past Successes
  • Excess Complexity
  • The Upside-Down Resume
  • Partnerships You Can Trust

Michael Wallace shares an important characteristic with huge tech icons: a big success straight out of the gates. Like Zuckerberg and Gates, Wallace has spent his career leveraging the success he saw up front.

With this atypical history, a standard format proved useless when career consultant Donald Burns started rewriting Wallace’s resume.

Tossing the Rulebook

Burns says that his objective as a resume writer is to take two hours of conversation with a job candidate and turn it into a document that is easily digestible, so that a recruiter can get the basic gist in six seconds.

“I’ve revamped over 1,200 resumes, but I had the worst time organizing Michael’s story, because his chronology is backwards,” said Burns. “The vast majority of successful people break through midway through their careers, or towards the end of their careers, but that’s not Michael’s story.”

A standard resume, as we all know, is organized to promote the most recent activity to show that the person has been steadily growing, since a person usually has more impressive credentials and successes as they move along.

However, resume organization is not etched into stone. Sometimes it makes sense to throw out the rulebook so that whoever is looking at your resume will be impressed with the opening highlights and be drawn into your story.

Past Successes

After getting an engineering degree in 1992, Wallace joined Display Products Technology, a startup founded by three executives who used to work for IBM. The company had come up with a way to fix broken liquid crystal displays (LCDs). Wallace came up with a much cheaper way to perform the same task in his first six months on the job. His own breakthrough ideas made it possible for LCD recycling to grow into a multi-billion-dollar market.

“Over the years, DPT grew fast,” Burns explained. “It reorganized and changed its name five times in 20 years — but Michael stayed on as the technical linchpin that kept the operation growing.”

The heyday of DPT came and went. Just last year, the company closed down. Especially because of the different names, such as Incline Global Technology Services, his resume made his career seem much less impressive than it actually was. It looked as if he had been jumping around from one company to another, when he had instead been loyal to the same company that had continually rebranded itself.

Excess Complexity

While those in sales and marketing often have an easier time with their resumes since they are constantly thinking about how to present themselves, many IT folks struggle with readability, as was the case with Wallace.

Specifically, before Burns revamped it, the resume looked like a bunch of puzzle pieces scattered on a table, an image that Burns often evokes with his clients. Rather than forming a whole picture, it was broken up into disparate headings and bullet items that weren’t integrated.

“When I first saw it, the resume consisted of about 80 bullets and sub-bullets. These bullets are like pieces of a jigsaw puzzle – I could see the pieces, but no completed picture,” said Burns. “If you present readers with a box of puzzle pieces, they have no idea what they’re looking at.”

Burns himself didn’t understand Wallace’s story until Wallace told him about conceiving the new repair approach. The method caused DPT, a UK company, to grow astronomically. Buckingham Palace even acknowledged the business’s success.

The Upside-Down Resume

Once Burns knew all the facts, he decided the best approach was to simply turn the work history upside-down so that he was telling the career story as a straightforward timeline rather than moving back from the present.

According to Burns, the original resume “made [Wallace] look like a low-level LCD repair technician, [while] the new version makes him appear as an engineering Superman.” He added, “[A]ll we had to do was break some rules!”

Partnerships You Can Trust

When you look for a new IT job, you want to know you aren’t selling yourself short, as Wallace was before he worked with Burns.

You also want to find organizations that you can trust with your talent and your experience. That’s especially the case if you are frustrated with anything about your current employer.

In the same manner, when you look for a cloud provider, you want to know the best approaches are being used. For instance, you need a distributed architecture supported by InfiniBand technology if you are to achieve full 100% high availability. Unfortunately, many CSPs use Ethernet and mainframe-era centralized storage instead.

Partner with Superb Internet to experience performance that is generally 300% stronger than AWS and SoftLayer for cloud-hosted virtual machines with similar specs.

By Kent Roberts