
Posts Tagged ‘dedicated servers’

Infrastructure Requirements for HIPAA Compliance

Thursday, December 1st, 2022

If you are building a new environment that must comply with HIPAA, you may be surprised to find that the infrastructure requirements for HIPAA compliance do not require the use of any specific technology. This provides a lot of flexibility for developers and architects but can also introduce risk if you are unfamiliar with the requirements. This article outlines a few considerations to keep in mind as you build a HIPAA-compliant infrastructure or application.


Dedicated Servers and Data Isolation

Reliability and data security are two of the most important considerations when building a healthcare application. Building an infrastructure in a dedicated server environment is the best way to achieve these aims. Let’s look at both.

Reliability

Hosting your application in a dedicated environment means you never have to share server resources with anyone else, and it can be configured to meet your needs exactly. This may also include high-availability configurations to ensure you never have to deal with unexpected downtime. For many healthcare applications, unexpected downtime can have serious consequences. 

Security

A dedicated environment isolates your data from others, providing an added security layer. Segmentation and isolation are crucial components of the Zero Trust security stance, and using a dedicated environment helps keep bad actors out. Hosting your application in a public cloud could put sensitive data at risk if another customer falls victim to a cyberattack or suffers a security incident.

HIPAA does not require the use of dedicated servers. Still, any host you choose must follow the HIPAA requirements associated with access controls, documentation, physical security, backups and archival, and encryption. Review our checklist for more details about HIPAA’s security requirements.

Encryption

It’s worth spending a minute on encryption because it is an often misunderstood topic. Encryption is listed as an “Addressable” standard under HIPAA. Because it is not labeled “Required,” many assume it is optional. However, the Security Rule states: “Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity or business associate creates, receives, maintains, or transmits.” So, while HIPAA does not state that covered entities must use encryption, it does require them to ensure the confidentiality of any ePHI that is created, received, maintained, or transmitted.

The confusion arises because HIPAA is technology-neutral and does not specify how exactly to protect ePHI. Encryption is unnecessary if your organization can devise another way to protect sensitive data. However, practically speaking, there aren’t many alternatives other than not storing or transmitting the data at all. Encryption is the easiest and most secure way to protect electronic data in transmission and at rest.

At-Rest Encryption

HIPAA does not require at-rest encryption, though it is recommended to decrease risk and potential liability in some situations. If your risk assessment determines that storage encryption is necessary, you must ensure that all collected and stored protected health information is encrypted and can only be accessed and decrypted by people with the appropriate keys. This keeps backups secure, protects data from unauthorized access, and generally protects the data no matter what happens (unless the keys are stolen). Storage encryption is essential in any scenario where the data may be backed up or placed in locations out of your control.
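As an illustration of the "only people with the appropriate keys" point, an application can encrypt records before writing them to storage using a symmetric key held outside the data store. The sketch below uses the third-party Python `cryptography` library (assumed installed); the key handling is deliberately simplified, and a real deployment would load keys from a key-management system rather than generating them inline:

```python
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# In practice, load this key from a key-management system, never from disk
# alongside the data it protects.
key = Fernet.generate_key()
f = Fernet(key)

record = b"record-id:1234;status:ok"   # stand-in for an ePHI record
stored = f.encrypt(record)             # this ciphertext is what lands on disk and in backups
recovered = f.decrypt(stored)          # only possible with the key
```

Because backups contain only `stored`, they remain protected wherever they are copied, which is exactly the property at-rest encryption is meant to provide.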

Transmission Encryption

If protected health information is transmitted outside of the database or application, encryption must also be used to protect the data in transmission. At a minimum, TLS encryption (with the appropriate ciphers) is secure enough to meet HIPAA guidelines. However, TLS alone may not be appropriate for your use cases.

  • Consider using a portal pickup method, PGP, or S/MIME encryption when transmitting highly sensitive information to end users.
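As a minimal sketch of the TLS baseline, using Python's standard `ssl` module, a client-side context that enforces TLS 1.2 or newer with certificate and hostname verification might look like this; cipher selection beyond the library's secure defaults is omitted:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Client-side TLS context with a strict baseline: TLS 1.2+,
    certificate validation, and hostname checking all enforced."""
    ctx = ssl.create_default_context()            # secure defaults: CERT_REQUIRED, check_hostname
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1
    return ctx

ctx = strict_tls_context()
```

A context like this can be passed to `smtplib`, `http.client`, or similar clients so that connections refusing the baseline simply fail rather than silently downgrading.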

Backup Infrastructure Requirements for HIPAA Compliance

Backups and archival are often an afterthought regarding HIPAA compliance, but they are essential. HIPAA requires that organizations “Create a retrievable, exact copy of electronic protected health information, when needed, before movement of equipment.” You must be sure that all ePHI stored or collected by your application is backed up and can be recovered in case of an emergency or accidental deletion. If your application sends information elsewhere (for example, via email), those messages must also be backed up or archived. HIPAA-compliant backups are robust, available, and accessible only by authorized people.
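The "retrievable, exact copy" wording implies verifying backups, not just taking them. A minimal Python sketch compares cryptographic digests of the original data and the restored copy; real backup tooling automates this per file or per object:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest used to confirm a backup is an exact copy."""
    return hashlib.sha256(data).hexdigest()

original = b"ePHI export 2022-11-30"
restored = b"ePHI export 2022-11-30"   # e.g., read back from the backup system

backup_is_exact = digest(original) == digest(restored)
```

Any bit-level difference between the original and the restored copy changes the digest, so this check catches silent corruption as well as incomplete restores.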

Under HIPAA Omnibus, organizations must keep electronic records of PHI disclosures for up to three years. Some states and company policies may require a longer record of disclosures; some states require up to ten years. When building a HIPAA-compliant infrastructure from scratch, it’s essential to plan for backups and archival retention from the start.

Conclusion

If it is your first time dealing with infrastructure requirements for HIPAA compliance, be sure to ask the right questions and work only with vendors who thoroughly understand the risks involved. It can be overwhelming, but by selecting the right partners, you can achieve your goals without violating the law. 

Implementing Zero Trust Architecture

Tuesday, March 8th, 2022

The US Government has released its zero trust strategy to help government agencies implement zero trust architectures. It requires federal agencies to meet certain standards before the end of the 2024 fiscal year.


The zero trust strategy aims to improve the nation’s security posture and reduce the potential harms from cyber attacks. It assumes that attackers cannot be kept outside of network perimeters and that sensitive data must be protected at all times.

The move toward zero trust architecture is a significant undertaking for the federal government, and this strategy aims to outline a common path for agencies to take, as well as limit uncertainty about transitioning.

It will require agency heads to partner with IT leadership in a joint commitment to overhaul the current security architecture and move toward a zero trust model. The strategy encourages agencies to assist each other as they work to implement zero trust architecture, exchanging information and even staff where necessary. Ultimately, the zero trust strategy aims to make the federal agencies stronger and more resilient against cyber attacks.

What Does The Zero Trust Architecture Strategy Include?

The Cybersecurity and Infrastructure Security Agency (CISA) created a zero trust maturity model to guide the strategy. The model is built on five pillars:

  • Identity
  • Devices
  • Networks
  • Applications and Workloads
  • Data

There are also three themes that cut through each of these areas:

  • Visibility and Analytics
  • Automation and Orchestration
  • Governance

Identity

First, the strategy includes a number of identity-related goals. Federal agencies must establish centralized identity-management systems for their employees. These systems must integrate with common platforms and applications.

Another core goal is for agencies to use strong multi-factor authentication throughout the organization. However, it must be enforced at the application layer rather than at the network layer. Password policies must no longer require special characters or frequent password changes.

The new strategy also requires that user authorization incorporate at least one device-level signal. This could include confirming that the device is authorized to access the application and has up-to-date security patches.
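To illustrate what combining authentication with a device-level signal might look like, here is a hypothetical authorization check in Python. The field names and policy are assumptions for the sketch, not part of the federal strategy:

```python
from dataclasses import dataclass

@dataclass
class DeviceSignal:
    enrolled: bool   # device appears in the agency's asset inventory
    patched: bool    # security patches are up to date

def authorize(mfa_passed: bool, device: DeviceSignal) -> bool:
    """Hypothetical policy: grant access only when MFA succeeded AND
    the device-level signals check out."""
    return mfa_passed and device.enrolled and device.patched
```

The key idea is that a valid credential alone is never sufficient; the decision also depends on the state of the device making the request.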

Devices

Under the Devices pillar, federal agencies must participate in CISA’s Continuous Diagnostics and Mitigation (CDM) program. This allows them to create reliable asset inventories. The other major goal is for each agency’s Endpoint Detection and Response (EDR) tools to be deployed widely and to meet CISA’s technical requirements.

Networks

Among the network-related measures, agencies need to use encrypted DNS to resolve DNS queries wherever it is technically supported. They must also force HTTPS for all web and API traffic. On top of this, agencies also need to submit a zero trust architecture plan that includes their approach to environmental isolation to the Office of Management and Budget.
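To make the "force HTTPS" requirement concrete, here is a framework-free sketch in Python: a WSGI middleware (illustrative only, not any agency's actual implementation) that redirects plain-HTTP requests and adds an HSTS header to HTTPS responses:

```python
def force_https(app):
    """WSGI middleware: 301-redirect HTTP to HTTPS, add HSTS over HTTPS."""
    def wrapper(environ, start_response):
        if environ.get("wsgi.url_scheme") != "https":
            url = "https://" + environ.get("HTTP_HOST", "") + environ.get("PATH_INFO", "/")
            start_response("301 Moved Permanently", [("Location", url)])
            return [b""]
        def with_hsts(status, headers, exc_info=None):
            headers = list(headers) + [
                ("Strict-Transport-Security", "max-age=31536000; includeSubDomains")
            ]
            return start_response(status, headers, exc_info)
        return app(environ, with_hsts)
    return wrapper

# Demo with a trivial app and a recording start_response:
captured = {}
def _start(status, headers, exc_info=None):
    captured["status"], captured["headers"] = status, list(headers)

def _app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

wrapped = force_https(_app)
wrapped({"wsgi.url_scheme": "http", "HTTP_HOST": "example.com", "PATH_INFO": "/"}, _start)
http_status = captured["status"]
wrapped({"wsgi.url_scheme": "https"}, _start)
hsts_present = any(h[0] == "Strict-Transport-Security" for h in captured["headers"])
```

In production this logic usually lives at the load balancer or web server, with proxy headers taken into account, but the effect is the same: no request is ever served over plain HTTP.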

Applications and Workloads

In addition, there are a number of application and workload-related goals for agencies, including:

  • Operating dedicated application security testing programs.
  • Undergoing third-party application security evaluations.
  • Running a public vulnerability disclosure program.
  • Working toward deploying services that employ immutable workloads.

Data

When it comes to data, agencies must follow a zero trust data security guide created by a joint committee made up of Federal Chief Data Officers and Chief Information Security Officers. Agencies must also automate data categorization and security responses, with a focus on tagging and managing access to sensitive documents. They must also audit any access to encrypted data in commercial cloud services. Another goal is for agencies to work alongside CISA to implement logging and information sharing capabilities.

Zero Trust Architecture and the Future

The federal government isn’t just pushing toward a zero trust architecture model as a fun new hobby. Instead, it is a response to the increasing sophistication of cyber attacks, especially those originating from nation-state level groups.

These complex and well-resourced cyber attacks aren’t only a threat to government agencies. Other organizations face similar threats in the ever-changing threat landscape. The reality is that businesses also need to move toward the zero trust model in order to effectively defend themselves in the future.

LuxSci can help your organization make the change through services such as our zero trust email options, or our zero trust dedicated servers. Contact our team to find out how LuxSci can help your organization prepare for a zero trust future.

Zero Trust and Dedicated Servers

Tuesday, July 6th, 2021

We continue our series on Zero Trust, this time discussing Zero Trust and dedicated servers. As a quick recap, the Biden Administration ordered all federal agencies to develop a plan to adopt Zero Trust Architecture. This is a security model that begins with the assumption that even an organization’s own network may be insecure.

It accepts that bad actors may be able to penetrate the network, therefore a network designed under the Zero Trust model is built to make security perimeters as small as possible. Zero Trust Architecture also involves constantly evaluating those who are inside the network for potential threats.

One of the core aspects of Zero Trust Architecture is the concept of trust zones. Once an entity is granted access to a trust zone, they also gain access to other items in the trust zone. The idea is to keep these trust zones as small as possible to minimize what an attacker would be able to access if there is a breach.

Dedicated servers are a critical component of trust zones and Zero Trust Architecture as a whole.


The Role of Dedicated Servers in Zero Trust Architecture

Dedicated servers are an important part of Zero Trust Architecture. LuxSci customers can host their services on their own dedicated servers or server clusters, instead of sharing a server with other clients who may introduce additional threats. This isolates an organization’s data and resources from other entities, creating a small trust zone.

LuxSci also uses micro-segmentation to protect each customer’s server cluster. Our solution is host-based, and the endpoints are protected by firewalls. Each customer’s server (or cluster of servers) is dynamically configured in a micro-segment using server-level firewalls. This means that each customer is separated from others, and there is no privileged access between customers.

As a dynamic host-based micro-segmentation solution, this setup adapts fluidly to software modifications, service alterations, customer changes, and new developments in the threat landscape (as detected by automated systems).
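The idea behind dynamic, host-based micro-segmentation can be sketched in a few lines. This is a hypothetical illustration in Python, not LuxSci's actual tooling: an automation layer regenerates per-host firewall rules (default-deny, intra-cluster allow, admin access) whenever cluster membership changes:

```python
def cluster_rules(cluster_ips, admin_net):
    """Generate illustrative host-firewall rules for one customer cluster:
    default-deny inbound, allow intra-cluster traffic, allow admin SSH."""
    rules = ["iptables -P INPUT DROP"]  # default-deny: nothing is trusted by default
    for ip in cluster_ips:
        rules.append(f"iptables -A INPUT -s {ip} -j ACCEPT")  # intra-cluster traffic
    rules.append(f"iptables -A INPUT -s {admin_net} -p tcp --dport 22 -j ACCEPT")  # admin SSH
    return rules

rules = cluster_rules(["10.0.5.10", "10.0.5.11"], "192.0.2.0/24")
```

Because the rules are generated rather than hand-maintained, adding or removing a server in the cluster automatically tightens or extends the micro-segment, which is what makes the approach "dynamic."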

Our customers can also choose to place a static traditional network firewall in front of their assets. This acts as an additional line of defense. With this traditional firewall on top, both customer assets and the dynamic micro-segment are placed in a well-defined network segment with added ingress and egress rules.

Access Controls

LuxSci’s dynamic host-based micro-segmentation solution is complemented by our flexible and highly configurable access controls. These include:

  • Two-factor authentication
  • Time-based logins
  • IP-based access controls
  • APIs that can be restricted to the minimum needed functionality
  • Application-specific passwords

These configuration options allow your organization to tailor access to your systems on a more granular level, limiting unauthorized access while still making resources available where necessary.

Limiting access and verifying user identities are important aspects of Zero Trust Architecture. These access controls fit hand-in-hand with our micro-segmentation setup for protecting server clusters.
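Several of the controls listed above can be combined into a single authorization decision. The following Python sketch is illustrative (the parameter names and policy are assumptions, not LuxSci's API), checking MFA, an IP allowlist, and a time-based login window together:

```python
from datetime import time
from ipaddress import ip_address, ip_network

def access_allowed(mfa_ok, src_ip, login_time, allowed_nets, window):
    """Grant access only when MFA passed, the source IP is on the
    allowlist, AND the login falls inside the permitted time window."""
    in_network = any(ip_address(src_ip) in ip_network(net) for net in allowed_nets)
    start, end = window
    in_window = start <= login_time <= end
    return mfa_ok and in_network and in_window

ok = access_allowed(True, "203.0.113.5", time(10, 30),
                    ["203.0.113.0/24"], (time(9, 0), time(17, 0)))
```

Requiring every condition to hold, rather than any one of them, is what keeps the trust zone small: failing a single check denies access.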

Zero Trust: Dedicated Servers vs Shared Cloud Systems

A shared cloud system is not suited to the Zero Trust model, because the data and computations for different customers are managed in a shared environment. This means that segmentation isn’t possible, so the potential threats from other customers on shared resources can’t be eliminated. The risks of using a shared cloud server have been well-documented elsewhere. The industry’s shift to Zero Trust Architecture only reinforces the importance of using dedicated server environments.

Compared to cloud environments, dedicated servers are better aligned with Zero Trust Architecture. LuxSci’s dynamic customer micro-segmentation isolates customers from each other, protecting your organization from these additional threats. A second layer of network firewalls only serves to reinforce the separation, making the defenses even more formidable.

Contact our team if you want to learn more about how dedicated servers and Zero Trust Architecture can help to protect your organization from advanced threats.

Dedicated Servers: How They Improve Security And Reliability

Tuesday, December 8th, 2020

What’s best for your organization, shared or dedicated servers? If your company is looking for website hosting, an email provider, or hosting for other online services, this question may not be high up on its list of priorities. The differences between shared and dedicated servers may not seem particularly important at first. However, this choice could have significant security and reliability ramifications.

Many providers will steer you toward shared servers, or only provide a “shared cloud,” even though these may not be in your company’s best interest.


Why?

It’s more efficient and cost-effective for them to lump a bunch of their customers onto the same server. This makes it easier to manage and reduces the provider’s overhead expenses. Your provider’s cost-savings and ease-of-administration probably aren’t your organization’s greatest concerns. Instead, you should be more worried about the additional risks and complications that shared servers can bring to your business.

While dedicated servers can be a more expensive option, the security and reliability benefits they provide make it worthwhile.

Security of Shared vs Dedicated Servers

Let’s say your website is hosted on a shared server, along with a bunch of other websites. For the sake of this example, let’s also presume that you are exceptionally diligent. All of your software and plugins are always updated as soon as possible, you have strong passwords, two-factor authentication, and suitable access control policies. You have regular security audits, and any issues that pop up are immediately rectified. Your site is essentially Fort Knox and meets compliance requirements.

But what about the sites that you share your server with?

You have no control over them, and can’t enforce the same security precautions that your organization does.

Well, that’s their problem, right?

It is, but it could very easily become your problem as well. There are several situations in which their weaknesses could harm your organization.

Security Risks on Shared Servers

  • One or more of the other websites may be highly vulnerable. Whether through neglecting their updates or other poor security practices, they may be easy to infiltrate. If hackers can compromise one site on the server, it can give them a window into the others. This means that sharing a server significantly increases your own risk of data breaches. Cybercriminals may even target your site deliberately by looking up others that share the same IP address.
  • Malicious actors may set up their own sites on your shared server, with the sole intention of using this access to penetrate the other sites. This can also result in your organization’s website and database being breached.
  • You may share the server with a high-value target, such as a political activist or journalist. If they raise the ire of others, they could fall victim to a DDoS attack. Not only could this prevent legitimate visitors from accessing their website, but it could use up the shared resources, and prevent others from visiting your site.

These examples around web hosting also apply to other services such as email hosting, video conferencing, payment processing, online chat, etc. It is always a better and more secure choice to isolate your services and data from others to the maximum degree possible.

While shared servers can be the more economical option, particularly for organizations with limited needs, you need to weigh those savings against the potential fallout from any of these attacks. Is the possibility of suffering an attack, with its costs, reputational damage, and stress, worth the slight reduction in price?

If you use a shared server, it takes control out of your organization’s hands.

Reliability of Shared vs Dedicated Servers

If your organization uses a shared service, it shares resources with other organizations provisioned on the same shared server(s). The disk space, disk throughput, memory, network capacity, and processing power are all split or shared between the various parties.

This isn’t necessarily a problem—unless one of the other customers starts consuming all of the resources. If you share a server with one of these bad neighbors, the strain can cause your services to slow down, or even become unavailable. In practical terms, this could result in your website being down, your email being inaccessible, your employees being unable to send messages, or your video teleconferencing system malfunctioning.

If a bad neighbor sends out email spam, that activity can also get the whole shared server or shared IP space blacklisted. This can result in your company’s emails going straight to spam, even when it has done nothing wrong. The reliability of your email sending and the successful inbox delivery of your messages depend on others when using shared resources.
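The blacklisting mechanics are worth seeing concretely. DNS-based blocklists (DNSBLs) are queried by reversing an IP address's octets and prepending them to the blocklist zone; if that name resolves, the IP is listed. A small Python sketch (the zone shown, `zen.spamhaus.org`, is a real, commonly used DNSBL, and note that the whole shared IP is checked, not any individual customer):

```python
def dnsbl_query_name(ipv4: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNS name used to check an IPv4 address against a DNSBL:
    reverse the octets and append the blocklist zone."""
    return ".".join(reversed(ipv4.split("."))) + "." + zone

name = dnsbl_query_name("203.0.113.7")
```

An actual check would then resolve `name` with a DNS query; any answer means the shared IP, and therefore every sender behind it, is being penalized.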

LuxSci’s Dedicated Server Options

You can avoid facing these security and reliability issues through segmentation and isolation. LuxSci provides a range of options to suit a variety of different needs. These include giving clients:

  • Their own dedicated server(s) that are firewalled off from other customers.
  • Their own network segment with dedicated physical or network firewalls. These can be customized according to an organization’s needs.
  • Their own dedicated physical hardware, which means that even virtualization hypervisors aren’t shared between customers.

These options give our clients the flexibility they need to meet their organization’s unique requirements. Pursuing one of the above options will mean that your organization won’t have to worry about the threats or reliability problems that sharing a server can bring. LuxSci is HITRUST CSF certified and specializes in building custom, highly secure web environments designed to meet our customers’ needs.

Learn more about LuxSci’s dedicated server options: Schedule a Consultation