
May 29, 2015



Avoiding the Pitfalls of Server Virtualization in SMB Environments

In Part 1 of our post on virtualization at small-to-midsize businesses (SMBs), we discussed how virtualization is helping SMBs do more with less while streamlining operations and enhancing productivity. We also looked at how SMBs are seeing a real impact on their bottom lines rather quickly. Not surprisingly, this is leading more and more SMBs to explore server virtualization in their IT environments.


According to research from Techaisle, 60 percent of servers in SMB organizations have been virtualized, and those organizations expect 70 percent of their servers to be virtualized this year. That’s the good news. The bad news is that SMBs continue to struggle with virtualization adoption due to the complexity involved.


What are the most common pitfalls that prevent virtualization deployment success? Techaisle identified these as the top five:


  • Cost of licenses for the virtualization solution
  • Failure to achieve projected cost savings
  • Challenges associated with managing virtual servers
  • Cost of software licenses for applications in the virtual environment
  • Budget overruns for the project as a whole


Just as the benefits of virtualization are magnified for SMBs, so are the long-term costs and impact of a poorly conceived and managed virtualization deployment. As with any corporate initiative, SMBs should focus on the following factors when embarking on server virtualization:


Be sure of your budget. Most of the reasons for virtualization deployment failure have to do with cost overruns. SMBs need to have a clear picture of the capital and operational costs involved and enter the project with a concrete budget.


Do your homework. What do you expect to achieve from your virtualization deployment? You can’t implement an effective solution until you’ve clearly defined areas for improvement and what a virtualization solution should allow you to accomplish.


Is the virtualization platform you’ve chosen scalable and aligned with your business processes? Does it follow an industry-standard approach that will simplify management, maintenance, licensing and support? How will it be managed and monitored? How can you leverage your existing infrastructure? Have you checked to make sure that you won’t run into compatibility issues with legacy applications? What type of training is involved for administrators? How can you ensure optimal performance and security?


These are just a handful of the questions that need firm answers before you begin your virtualization deployment. Doing this homework in advance will save you money and make for a much smoother deployment.


Make sure you understand virtualization. You don’t need to understand all technical aspects of virtualization backward and forward. However, your technology solution provider should be able to answer all of your questions and make sure you have the foundational knowledge necessary to evaluate your options and make sound decisions.


You certainly don’t want to miss out on the benefits of virtualization because you feel overwhelmed, but you also shouldn’t rush into any decisions that could cost you money because you don’t understand what you’re buying.


Don’t focus solely on cost. You will inevitably encounter a solution provider that tries to win your business based on price. Cheap hardware and software tend to be difficult to manage and scale, making them more expensive in the long run. Make sure your hardware and software investments will deliver the performance and capacity you need now and five years into the future.


ICG can help you decide if virtualization is right for your business. If virtualization does make good business sense, we can design and deploy a solution that helps your organization become more efficient and productive while reducing operational costs.

May 20, 2015



Why SMBs Are Jumping on the Virtualization Bandwagon

Research firm Techaisle just released the results of its 2015 study of server virtualization adoption among small to midsize businesses (SMBs). The study shows that 54 percent of SMBs have adopted server virtualization, up from 41 percent two years ago. The adoption rate is higher among midmarket businesses — server virtualization in that segment has reached 88 percent and is expected to grow to 95 percent within one year.


Virtualization enables one physical server to be divided into multiple virtual servers. Hypervisor software allows virtual servers to remain isolated and unaware of each other, and allocates the resources of the physical server to the virtual servers.


Each of these virtual servers operates as a unique device capable of running different operating systems and handling different workloads and applications. This enables organizations to consolidate servers and fully leverage existing hardware and software.
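The consolidation payoff can be illustrated with a little arithmetic. The sketch below uses a simple first-fit placement by memory alone; the VM sizes and the 32 GB host capacity are hypothetical examples, not sizing guidance:

```python
# Rough first-fit estimate of server consolidation: how many physical
# hosts are needed to run a set of virtual servers, packing by RAM only.
# All numbers below are illustrative, not real-world sizing advice.

def hosts_needed(vm_ram_gb, host_ram_gb):
    """Place each VM on the first host with enough free RAM,
    provisioning a new host when none fits; return the host count."""
    hosts = []  # free RAM remaining on each host
    for vm in sorted(vm_ram_gb, reverse=True):  # place largest VMs first
        for i, free in enumerate(hosts):
            if free >= vm:
                hosts[i] -= vm
                break
        else:
            hosts.append(host_ram_gb - vm)  # provision a new host
    return len(hosts)

# Ten workloads that might once have occupied ten physical servers...
vms = [4, 4, 8, 2, 2, 16, 4, 8, 2, 4]
# ...fit on just two 32 GB virtualization hosts in this toy example.
print(hosts_needed(vms, 32))
```

Real capacity planning also has to account for CPU, storage and failover headroom, but the basic consolidation math works the same way.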


Server virtualization was once thought of as an expensive, complicated endeavor that was reserved for the largest corporations. This may have been the case at one time, but not anymore. The latest technologies make server virtualization cost-effective and relatively straightforward to implement, enabling SMBs to take advantage of a host of business benefits:


Reduced capital expenses. A simplified physical IT infrastructure means less new hardware needs to be purchased, installed and maintained, while server consolidation drastically reduces the cost of adding new applications and services. The upfront savings often equal the cost of virtualization implementation.


Reduced complexity and operational costs. A simplified physical architecture is easier and less expensive to manage, maintain and secure when the right tools are deployed. You can focus less on hardware and more on the services and applications that improve business operations.


Centralized management. Both physical and virtual servers can be centrally managed, monitored and controlled from a single console. Existing virtual machines (VMs) can be moved from server to server, allowing for resource sharing and workload balancing.


Improved disaster recovery and business continuity. Copies of virtual servers can be saved and archived as files or snapshots at a remote site for disaster recovery. This creates redundancy without additional hardware. Live migration preserves business continuity by transferring live VMs between physical servers without downtime.


Availability of legacy apps. Instead of maintaining outdated server hardware to run legacy apps, a virtual version of existing hardware can be created on modern servers. The legacy apps perform the same way on the new server, giving you time to update the apps if necessary.


Simple testing of software updates and security patches. Virtualization enables an organization to test software and security solutions on a virtual copy of its IT infrastructure. This allows you to work out as many bugs as possible before deploying new software on your live system.


Support for internal services. If you want to set up a platform, such as a company intranet, for internal use, virtualization allows you to do so on a VM instead of purchasing new hardware.


Virtualization can provide significant cost savings and operational benefits to your small business, allowing you to take full advantage of a streamlined IT infrastructure. Contact ICG to learn more about how virtualization can be a game changer for your business.

May 15, 2015



What Merchants Need to Know about PCI 3.0

In our previous post, we discussed the latest update from the Payment Card Industry (PCI) Security Standards Council (SSC) to the PCI Data Security Standard (DSS) and Payment Application Data Security Standard (PA-DSS) – also known as PCI 3.0. The main driver behind PCI 3.0 is a desire to make payment security a business-as-usual activity and shared responsibility across entire organizations rather than an annual compliance report.


This shift in thinking is driven by the lack of PCI compliance among merchants. According to a Tripwire survey, only 41 percent of retail companies are using penetration testing to pinpoint security vulnerabilities and just 44 percent have implemented a process for file integrity monitoring.
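File integrity monitoring, the practice the survey asks about, boils down to comparing cryptographic fingerprints of critical files against a known-good baseline. A minimal Python sketch of the idea (the workflow comment is illustrative, not a full product):

```python
import hashlib

def fingerprint(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(baseline):
    """Compare monitored files against a {path: digest} baseline
    recorded while the system was in a known-good state."""
    return [path for path, digest in baseline.items()
            if fingerprint(path) != digest]

# In practice the baseline is captured once and stored securely, e.g.
#   baseline = {p: fingerprint(p) for p in monitored_paths}
# then changed_files(baseline) runs on a schedule, and any non-empty
# result triggers an alert for investigation.
```

Commercial file integrity monitoring tools add tamper-resistant baselines, scheduling and reporting, but the detection principle is the same.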


While the PCI SSC has provided a summary of changes and evolving requirements in PCI 3.0, there are certain updates to the standard that are likely to have the greatest impact on merchants.


Stricter Penetration Testing Mandates. An ongoing concern has been whether cardholder data is adequately segmented from other networks, which is why organizations must conduct penetration tests and vulnerability assessments to determine if a security breach is possible. With PCI 3.0, penetration testing must now follow an industry-accepted methodology.


Those organizations that don’t have in-house personnel with the expertise to conduct such a test will need to hire a service provider who adheres to a formalized methodology that validates segmentation.

System Components Inventories. System components include any hardware or software used in the cardholder data environment. Merchants must maintain an inventory of system components and explain what each piece of technology does and for what purpose. Organizations that have many locations and those that utilize virtualization may struggle to manage the inventory of these ever-changing system components.
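Even a simple structured record per component goes a long way toward meeting this requirement. The fields and entries below are illustrative examples, not a format mandated by the standard:

```python
# Illustrative system-component inventory for a cardholder data
# environment. PCI DSS asks for a list of in-scope components with a
# description of each one's function or use; these entries are made up.

inventory = [
    {"component": "pos-terminal-01", "type": "hardware",
     "location": "Store 12, register 1",
     "purpose": "Captures card payments at checkout"},
    {"component": "payment-gateway-vm", "type": "virtual server",
     "location": "Main office hypervisor host",
     "purpose": "Forwards transactions to the card processor"},
    {"component": "edge-firewall", "type": "hardware",
     "location": "Main office network perimeter",
     "purpose": "Segments the cardholder data environment"},
]

def undocumented(items):
    """Flag any component that lacks a documented purpose."""
    return [c["component"] for c in items if not c.get("purpose")]

print(undocumented(inventory))  # an empty list means every entry is documented
```

For organizations with many locations or a changing virtual estate, keeping such records current is exactly where the struggle described above comes in, which is why automated discovery tools are often paired with the inventory.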


Increased PoS System Inspections and Access Controls. Point-of-Sale (PoS) devices that capture cardholder data must be inventoried and periodically inspected to ensure they haven’t been altered or replaced by different devices. Because card skimming is a prevalent problem, employees must be able to identify signs of tampering or suspicious behavior, which is likely to require additional security training for anyone who works at the point of sale. Physical access to PoS devices must be controlled and authorized by the merchant, and if an employee leaves, access must be revoked immediately.


Additional Service Provider and Vendor Requirements. In addition to using unique authentication credentials for each customer environment, PCI 3.0 requires service providers to provide comprehensive written details of compliance-related services, roles and responsibilities. For example, service providers are required to take responsibility for cardholder data that they possess. Documentation should clarify which PCI compliance requirements are the responsibility of the merchant and which are the responsibility of the vendor or service provider. Agreeing to the scope of each party’s responsibilities in writing will add accountability and avoid confusion during compliance assessments.


Stronger Antimalware Systems. Previously, antimalware systems needed to be working, remain current and produce report logs. Under PCI 3.0, merchants are required to “identify and evaluate evolving malware threats” and have a process in place that alerts the organization of new malware. The antimalware system must also be configured to prevent users from disabling or altering the system without authorization from management.


ICG understands the latest PCI compliance requirements and can help you make cardholder data protection part of your everyday business processes.

May 12, 2015



Why Payment Card Security Must Be an Everyday Business Practice

In our previous post, we explained that 2015 is being dubbed the “year of mobile payments,” as more and more retailers begin to accept smartphone-based payment options. 2014 has been given a less-auspicious name: the “year of the data breach.” Hundreds of millions of credit card numbers were stolen last year, affecting as many as 60 percent of American consumers.


Preventing 2015 from being a repeat (or worse) requires a new approach to credit card security. That is the aim of version 3.0 of the Payment Card Industry (PCI) Data Security Standard (DSS) and Payment Application Data Security Standard (PA-DSS). The latest set of security compliance requirements for organizations that accept credit and debit card payments, PCI 3.0 went into effect on January 1, 2014, and became mandatory on January 1st of this year.


PCI 3.0 represents a significant update of the standard. While version 2.0 contained only two different requirements compared to version 1.2.1, version 3.0 has 20 different requirements compared to version 2.0. Most of the changes involve clarification of existing requirements as opposed to new ones, but there is also a change in mindset.


The central message conveyed by the new standard is that payment security must be an everyday business process, a shared responsibility across the entire organization to protect cardholder data. In many cases, organizations have been putting compliance on the back burner until it needs to be assessed and validated. Moving forward, the PCI SSC expects payment security to become a business-as-usual discipline. As part of this shift in approach, organizations will be required to self-validate their own processes, services and technology to identify and correct compliance issues.


PCI 3.0 also includes best practices for ensuring PCI-DSS compliance on a regular basis. These best practices include:


  • Ongoing monitoring of security software and protocols to make sure they’re operating properly.
  • Implementing processes to quickly detect and address security control failures.
  • Evaluating how planned modifications to the environment, such as changing system and network configurations or adding new systems, will affect the PCI-DSS scope, and then adjusting security controls accordingly.
  • Determining how mergers, acquisitions and other organizational changes affect the PCI-DSS scope and whether or not existing technology will be supported by their vendors.
  • Assigning and separating responsibilities for security and operations to ensure a system of checks and balances.

The updates in PCI 3.0 are intended to shine a new light on the importance of cardholder data security and safety throughout organizations. They require that merchants follow best practices that ensure consumer trust in the payment card system.


For example, vendors will now be required to use separate passwords for each customer environment. This rule comes as the result of a security breach in which a hacker gained access to a single account and used the same password to infiltrate every other account for that particular vendor. While modern threats receive the most attention, this case shows the need to address the basic best practices, which can be accomplished in part by increasing awareness and education.


In our next post, we’ll discuss in greater detail some of the specific requirements of PCI 3.0 from a technology perspective.