SoftNAS™ Expands to Oracle Cloud, Achieves VMware Partner Ready Certification, Releases SoftNAS 5.5

Buurst continues to broaden the reach of its enterprise-grade, high-performance SoftNAS virtual storage product

HOUSTON, TEXAS, UNITED STATES, February 14, 2024 — Buurst today announced the expansion of its flagship product, SoftNAS, to the Oracle Cloud Infrastructure (OCI) VMware solution. This move broadens its reach and delivers its high-performance, scalable storage solution to a wider range of businesses. Additionally, Buurst has re-validated SoftNAS’s VMware Partner Ready status as a member of the Technology Alliance Program (TAP), further solidifying its commitment to seamless integration with leading virtualization platforms, and has released SoftNAS version 5.5.

“We are excited to bring SoftNAS to the Oracle Cloud and to re-validate our VMware Partner Ready status,” said Arlene Ogden, Head of Engineering at Buurst.

Key Benefits of SoftNAS on Oracle Cloud:

• High-performance storage: SoftNAS delivers outstanding performance and scalability, making it ideal for demanding workloads
• Seamless integration: SoftNAS integrates seamlessly with OCI services, via VMware, simplifying deployment and management
• Cost-effectiveness: SoftNAS offers a flexible pricing model that helps businesses optimize their cloud spending
• Enhanced security: SoftNAS leverages Oracle’s robust security features to protect valuable data

VMware Partner Ready validation:

SoftNAS has re-validated its VMware Partner Ready status, signifying its compatibility and interoperability with VMware vSphere®. This certification ensures seamless integration with existing VMware environments, allowing users to leverage SoftNAS’s advanced features without complexity.

“We continually make advancements to demonstrate our commitment to providing customers with the flexibility and performance they need to thrive in today’s dynamic enterprise NAS landscape,” said Vic Mahadevan, CEO of Buurst. “Our Engineering and Product teams continue to make great improvements to our flagship product.”

About SoftNAS 5.5, powered by Buurst:

Buurst has released SoftNAS version 5.5, providing enhancements beyond the OCI and VMware announcements, including Offline Activation, a Storage Pool Expansion Flow, and infrastructure upgrades to ZFS, DeltaSync, and other components that continue to strengthen the performance of SoftNAS.

Who is Buurst?

Buurst is a leading provider of software-defined storage solutions. Its SoftNAS virtual Network Attached Storage (NAS) appliance delivers high-performance, scalable, and cost-effective storage for businesses of all sizes. Buurst is committed to providing customers with innovative solutions that help them achieve their enterprise storage goals.

Andy Bowden
Buurst, Inc
+1 346-410-0643
pr@buurst.com

Data Retention: Tough Choices Ahead

As the cost per byte of storage has declined, it has become a habit to simply store data “just in case.” At a time when the overwhelming majority of data was generated by human beings, nobody thought much of it. Data was summarized, information extracted from it, and the raw data points were still kept should they be needed later. Later seldom came.

Cisco tells us that as of 2008 there were more things connected to the internet than people, so we can use that as the point when the amount of data being generated and stored had its hockey-stick moment. Now we have more sensors in more places monitoring more and more activity and generating more and more data points. In 2010, then-Google CEO Eric Schmidt observed that we were generating and storing as much data every two days as we had from the dawn of civilization up to 2003.

That’s a lot of data.

Running Out Of Room, Or …

The natural reaction is to feel, instinctively, that at some point we’re going to run out of storage capacity. If Moore’s Law holds, that won’t happen; we’ll just keep inventing new, denser storage technologies.

But what we are running out of is time.

Long ago, the last thing anyone in the data center did at the end of the day was make sure the daily backups were running. They would run into the night all by themselves. Then they would run through the night. Then they were still running when everyone came into the office in the morning.

Fortunately, we’re clever and adaptable, so we came up with incremental backups. Instead of recopying and recopying data we had already copied, we only copied data that had changed since the last backup. Then we moved to faster backup media. Now we’re backing up the data as we’re saving it in primary storage. Ultimately, the restore time objective becomes impossible to achieve in the time available to us.
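
To make the incremental idea concrete, here is a minimal sketch assuming a POSIX-style filesystem, using file modification times to decide what changed; the paths are illustrative:

```python
import os
import shutil
import time

def incremental_backup(src_root, dest_root, last_backup_time):
    """Copy only files modified since the previous backup pass."""
    copied = 0
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getmtime(src) <= last_backup_time:
                continue  # unchanged since the last run, so skip it
            rel = os.path.relpath(src, src_root)
            dest = os.path.join(dest_root, rel)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves timestamps
            copied += 1
    return copied

# Nightly job: copy only what changed in the last 24 hours.
changed = incremental_backup("/data", "/backups/nightly", time.time() - 86400)
```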

Making Tough Choices

Now we have to make a difficult choice. Once we’ve processed the data and created valuable information, do we keep the original raw data as it was collected, or do we discard it?

Or do we have to choose to save some of the raw data and not other parts of it? What are the criteria upon which that choice can be made? How do we anticipate in our planning which data points need to be stored and which will be discarded?

Now Add Machine Learning

This problem is exacerbated by the introduction of machine learning and artificial intelligence technologies to data analytics. When a machine is performing much of the data collation, selection, and processing, how are we to know which data points the machine will want to retrieve to complete its analysis? What if we choose incorrectly?

Other Possible Strategies

Being more pragmatic about this challenge, we need to think about data reduction. First of all, when and where does it occur?

Many of us take a physical relocation from one place to another as an opportunity to discard belongings that we no longer need. Some perform this discarding as they are packing to move. Others, often in a rush to make the move, simply pack everything and promise to do the discarding when they arrive at the new location. Many of us have boxes upon boxes that have yet to be unpacked since we moved in many years ago.

In the classic framework, we can choose to perform data reduction at the core of the network, in the server processors that will perform all the analytics. Or we can choose to perform data reduction at the edge where the data is being collected, so the load on the servers and storage is reduced.

The ultimate solution will likely be a combination of both, depending on the workload and the processing required.

Begin With The End In Mind

There has been much discussion about data science — how it’s the art of extracting useful information from data and turning it into knowledge that facilitates superior decision-making.

As we continue to see the internet of things produce data at Schmidt’s estimated rate of five exabytes every two days, data science must expand its scope to include the development of an end-to-end data strategy. This must begin with careful planning and consideration surrounding the collection of data, layers of summarization and reduction, preprocessing, and, finally, deciding which data points get stored and which are discarded.

As is always the case with data storage issues, this will be a volume-velocity-value process based on the business use case involved and at what point data gains value. The science is nascent, but the opportunity is immense.

Winning the Data Intelligence Game

A case can be made that every company is now a data company. But it is the effective use of data, not the amassing of stockpiles of archived data, that counts. The advent of big data inspired many businesses to ride the data intelligence wave. While businesses across verticals bought into the idea of big data, what many were actually doing was hoarding massive sets of raw data. The hype had their attention, but they were not fully prepared. Their real challenge today is how to unlock meaningful insights from this data haystack.

It’s about time businesses moved from gathering massive amounts of data to using that data to arrive at better business decisions. That starts with curbing data hoarding and then learning to leverage data intelligence wisely.

Cross-industry surveys from the likes of McKinsey show that only a few companies have managed, right out of the gate, to achieve significant business impact from their investments in big data analytics. What really distinguishes one competitive business from the next is how it uses its own data.

YOU ALREADY OWN A GOLDMINE. HERE’S HOW YOU UNLOCK IT.

If your business struggles with data collation and isn’t getting the desired ROI from your investment and efforts, there is no need to panic and make short-sighted decisions, unless you want your competition to benefit from your impatience.

Now is the time to take advantage of inexpensive data storage, IoT data streams, and the availability of affordable, simple-to-deploy artificial intelligence (AI), machine learning, and cloud analytics tools. These technologies make it easier for your business to capture the right data at the right time and extract more meaningful insights, driving the right business outcomes.

There is strong merit in leveraging these new opportunities, as doing so clearly adds to your competitive advantage. Evidence suggests that data-driven businesses outperform the competition: they are two to three times as likely to achieve greater sales performance, deeper customer engagement, higher employee productivity, better marketing effectiveness, and other desired organizational performance metrics.

ADOPT ORGANIZED DATA MANAGEMENT AND HARNESS DISRUPTIVE TECH TO MAKE BETTER BUSINESS DECISIONS

A recently published article in The Economist rightly says that “the world’s most valuable resource is no longer oil, but data.” But that holds only when data is organized and easy to access. Turning data into corporate value requires a solid data management strategy. Today’s modern business deals with plenty of distributed data that resides in, and emanates from, multiple silos. These include, but are not limited to, data centers, global branch offices, CRMs, public clouds, private clouds, IoT sensors, and SaaS platforms.

With some initial expert help and cloud-based AI or analytics services, your business can organize and process such siloed data from disparate sources. Deploying affordable and flexible data analytics tools helps you figure out what’s valuable while also making it more accessible. The possibilities go beyond making the data usable: companies and entrepreneurs can also benefit from powerful yet affordable predictive analytics tools that let you build frameworks to extract greater ROI.

For example, a leading retail store was in the news for creatively targeting pregnant women even before they became mothers. By mining nuggets of useful data from their shopping patterns, this retail store managed to strengthen engagement and increase value capture with expectant mothers. Such data mining and predictive analytics tools let you identify and tap into many opportunities that may otherwise go unnoticed.

Building data intelligence into your organization starts with data organization and assessment. Organized data can provide predictive insights, a deeper understanding of your customers, risks, and opportunities, and also helps arrive at better business decisions.

The 5 pillars of cloud data management

As the lifeblood of your business, data must be easily available in the cloud to boost your agility and ability to innovate, but easy accessibility must be balanced with protection to ensure maximum business value.

As more and more businesses adopt cloud services, seizing on the latest software tools and development methodologies, the lines between them are blurring. What really distinguishes one business from the next is its data.

Much of the intrinsic value of a business resides in its data, but we’re not just talking about customer and product data; there’s also supply chain data, competitor data, and many other types of information that might fall under the big data umbrella. Beyond that, there are a multitude of smaller pieces of data, from employee records to HVAC system logins, that are rarely considered but are necessary for the smooth running of any organization. And don’t forget about source code: your developers are using cloud-based repositories for version control of application code, and that code needs to be protected too.

In the past, companies would typically try to centralize their data and lock it safely away in an impenetrable vault, but hoarding data doesn’t allow you to extract value from it. Data gains business value when it’s transported from place to place as needed and available to be leveraged, not locked away in some dark place. People need swift, easy access to data and real-time analysis to make innovative leaps, achieve operational excellence and gain that all-important competitive edge.

Cloud Data Management

As the importance of data has grown clearer, many businesses have been stockpiling as much of it as they can get their hands on, with the idea that the value will come along later. Businesses grow organically, so new systems and software are adopted, mergers and acquisitions prompt integrations and migrations, and new devices and endpoints are added to networks all the time. Even the most organized of businesses inevitably ends up with a complex structure and globally distributed data.

Another layer that exacerbates this problem is people. Sometimes your employees will show poor judgment. They may unexpectedly wipe out critical data or accidentally delete configuration files. Disgruntled employees may even do these things deliberately. Then you must consider all the employees and contractors working for your partners and vendors, who often have access to your business-critical data.

To effectively manage your data without shuttering it and blocking legitimate requests for access, you need a solid cloud data management strategy and that begins with five important considerations.

1. Resting data

Most of the time data sits in storage. It’s often behind firewalls and other layers of security, which it should be, but it’s also vital to ensure that your data is encrypted. It should be encrypted all the time, even when you think it’s safely tucked up in your vault.

If you properly protect your data at rest by encrypting it, then anyone stealing it will end up with lines of garbled junk that they can’t decipher. You may think it’s unlikely a cybercriminal will breach your defenses, but what about a motivated insider with malicious intent or even a careless intern? Hackers’ most common point of penetration is actually your employees’ devices, through which they gain a foothold that can be leveraged to go deeper into your networks. Encrypt everything and take proper precautions to restrict access to the decryption key.
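
As a minimal sketch of what at-rest encryption can look like, using the Python cryptography package’s Fernet recipe (authenticated AES); the filenames are illustrative, and a real deployment would keep the key in a KMS or secrets manager:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and store it in a KMS or secrets manager,
# never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("customer_records.csv", "rb") as f:
    ciphertext = cipher.encrypt(f.read())  # authenticated encryption

with open("customer_records.csv.enc", "wb") as f:
    f.write(ciphertext)

# Without the key, the .enc file is garbled junk to whoever steals it;
# with it, the original bytes come back intact.
plaintext = cipher.decrypt(ciphertext)
```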

2. Accessing data

It’s very important that your employees can access the data they need to do their jobs whenever and wherever they want, but access must also be controlled. Start by analyzing which people need access to what data and create tailored access rights and controls that restrict unnecessary access. Any person requesting access to data must be authenticated and every data transaction should be recorded so you can audit later if necessary. Active Directory is the most common place to manage and control such access today.

Access control should also scan the requesting device to ensure it’s secure and doesn’t harbor any malware or viruses. Analyzing behavior to see if the user or device requesting access falls into normal patterns of use can also be a great way of highlighting nefarious activity.
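
As a toy sketch of that pattern, with hypothetical roles and datasets, here is how tailored access rights plus an audit record for every transaction might look:

```python
import logging
from datetime import datetime, timezone

# Tailored access rights: each role sees only what its job requires.
ACCESS_RIGHTS = {
    "analyst": {"sales_reports"},
    "hr_manager": {"employee_records"},
    "admin": {"sales_reports", "employee_records", "system_config"},
}

logging.basicConfig(filename="access_audit.log", level=logging.INFO)
audit = logging.getLogger("data_access_audit")

def request_access(user: str, role: str, dataset: str) -> bool:
    """Authorize a data request and record it for later audit."""
    allowed = dataset in ACCESS_RIGHTS.get(role, set())
    audit.info("%s user=%s role=%s dataset=%s allowed=%s",
               datetime.now(timezone.utc).isoformat(),
               user, role, dataset, allowed)
    return allowed

# This request is denied and logged, leaving a trail to review later.
request_access("jsmith", "analyst", "employee_records")
```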

3. Data in transit

It’s crucial to create a secure, authenticated, and encrypted tunnel between the authenticated user and device and the data they’re requesting. You want to make the data transfer as swift and painless as possible for the end user, but without compromising security. Make sure data remains encrypted in transit, so no interceptor can read it. Choosing the right firewalls and virtual private network (VPN) services is vital. You may also want to compartmentalize endpoints to keep data safely siloed or employ virtualization to ensure it doesn’t reside on insecure devices.

There’s no doubt that most companies focus their data protection efforts here and it is important, but don’t focus on data in transit to the detriment of other areas.
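
For illustration, a bare-bones sketch of an encrypted, certificate-verified connection using Python’s standard ssl module; the hostname is a placeholder:

```python
import socket
import ssl

# create_default_context() verifies the server's certificate chain
# and hostname and negotiates a modern TLS version.
context = ssl.create_default_context()

with socket.create_connection(("data.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="data.example.com") as tls:
        print("negotiated:", tls.version())  # e.g. TLSv1.3
        # Bytes written here are encrypted on the wire, so an
        # interceptor sees only ciphertext.
        tls.sendall(b"GET / HTTP/1.1\r\nHost: data.example.com\r\n\r\n")
        reply = tls.recv(4096)
```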

4. Arriving data

When the data arrives at its destination you want to be certain that it is authentic and hasn’t been tampered with. Can you prove data integrity? Do you have a clear audit trail? This is key to effectively managing data and reducing the risk of any breach or infection. Phishing attacks often show up in the inbox as genuine data to fool people into clicking somewhere they shouldn’t and downloading malware that bypasses your carefully constructed defenses.
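
One common way to prove integrity is a message authentication code. A minimal sketch using Python’s standard hmac module, assuming the sender and receiver share a key exchanged out of band:

```python
import hashlib
import hmac

SHARED_KEY = b"rotate-me-regularly"  # exchanged out of band, kept secret

def sign(payload: bytes) -> str:
    """Sender attaches this MAC to the outgoing data."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, received_mac: str) -> bool:
    """Receiver checks the MAC before trusting the data."""
    expected = sign(payload)
    return hmac.compare_digest(expected, received_mac)  # timing-safe compare

data = b'{"order_id": 1042, "amount": 99.50}'
mac = sign(data)
assert verify(data, mac)              # authentic, untampered payload
assert not verify(data + b"!", mac)   # any tampering breaks the MAC
```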

5. Defensible backup and recovery

Even with the first four pillars solidly implemented, things can and do go sideways from time to time when least expected. Most companies recognize the importance of proper backup hygiene today and have implemented backup and recovery processes. Be sure to periodically test and validate your ability to restore the backups and recover.

In the cloud, there’s another critical area to carefully consider. Be careful not to put all your data eggs in one basket. Do not store your backups in the same cloud account where your production data resides. That’s a formula for disaster you may not recover from should a hacker somehow gain access to your network and delete everything.

That is, leverage multiple cloud accounts to segregate your backup data from your production data. Be certain to back up your cloud infrastructure configuration information as well, in case you ever need to rebuild it for any reason.

In the unlikely event your production environment should somehow become compromised, it’s critical that a copy of all backups and cloud configuration be stored separately and secured from tampering and deletion. One way to do this is to create a separate backup account (on the same cloud or a different cloud) with a “write only” policy that allows backup and archival data to be written and read, but not deleted. This protects your business by ensuring your DR systems and backups will always be available should you need them to recover.
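
As one illustration of this idea on AWS, here is a sketch using boto3 to attach a policy that denies deletes on a hypothetical backup bucket; the bucket name is made up, writes and reads remain governed by normal IAM permissions, and S3 Object Lock or versioning can harden this further:

```python
import json
import boto3  # pip install boto3

BACKUP_BUCKET = "example-backup-vault"  # lives in a separate account

# Deny object deletion to everyone, including compromised credentials
# in the production account; writes and reads remain possible.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyDeletes",
        "Effect": "Deny",
        "Principal": "*",
        "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
        "Resource": f"arn:aws:s3:::{BACKUP_BUCKET}/*",
    }],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BACKUP_BUCKET, Policy=json.dumps(policy))
```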

By crafting a plan to cover data storage, data access, data in transit, data arrival, and defensible data backup/recovery, you’ve erected five pillars that will be strong enough to bear the load of your company data and withstand the forces which are trying to break in. But there are still many cloud data management pitfalls to avoid. Ensure that you can quickly recover from the most common issues that arise from operating in cloud environments.

You can have the best products and employees in the world, but without data they are powerless, so take steps to ensure it flows freely and safely. Smart data management will empower your staff to leverage the latest cloud technologies, innovate new products and services and differentiate your organization from the competition.

When will your company ditch its data centers?

It’s important to weigh up the costs and limitations of traditional data centers and consider transitioning your business to the cloud. By modernizing your infrastructure, you can focus on gaining a competitive edge in your core business.

Agility and speed are of paramount importance for most organizations as they try to innovate and differentiate themselves from the competition. The need for flexibility and rapid scalability is driving more and more companies into the cloud, as traditional data centers are no longer proving to be competitive, agile or robust enough.

It should come as no surprise that Cisco predicts 94 percent of workloads and compute instances will be processed by cloud data centers by 2021. But deciding when to take the leap, weighing the costs and risks, and developing a successful strategy is easier said than done.

Here’s why companies are ditching those data centers, and how they can make the transition as smooth as possible.

The push of traditional data center costs

Traditional data centers are enormously expensive to maintain. To set one up you need to find a suitable space and then fit it out with everything from uninterruptible power systems (UPS) to cooling HVAC units that keep servers from overheating, not to mention extensive investments in storage and networking equipment.

All of that comes before you consider the cost of hiring data center personnel with the right expertise to keep things running. These are employees outside your core competency, required just to keep your infrastructure working. Then there’s the ongoing energy costs of maintaining the data center and dealing with maintenance.

Economies of scale are important across the board in data centers, but they make a huge difference when it comes to operating and energy costs. A Ponemon Institute report found that the average annual cost per kW for a data center that’s 50,000 square feet or larger was $5,467, compared to an annual cost of $26,495 per kW for data centers between 500 and 5,000 square feet in size.

It’s not easy to scale up and down quickly, so when you’re not using full capacity, cash is being burned. When you push beyond capacity you’re faced with the prospect of expensive expansion or outsourcing. Even outsourcing the physical data center to a colo facility leaves one effectively in the data center hardware and IT infrastructure business.

The pull of the cloud

As many as 81 percent of enterprises now operate multi-cloud strategies, according to RightScale’s State of the Cloud Report. Business units want to be free to adopt the very latest tools and technologies. They want to be able to pivot and pounce where they see an opportunity, innovating through machine learning and AI, the automation of software development pipelines, and greater depth in agile data analysis. And they want to do all this unencumbered by an internal IT department and bureaucracy.

Shadow IT is a reality, whether you accept it or not, and that genie is not going back into the bottle. While Gartner studies have found that shadow IT accounts for 30 to 40 percent of IT spending in large enterprises, the Everest Group suggests it’s closer to 50 percent or more. It’s time to embrace the cloud: maintaining a data center and a layer of IT infrastructure to support it is fast becoming untenable.

No wonder Gartner names cloud system infrastructure services as the fastest growing segment in a fast-growing market. It’s possible to get better service and greater value by leveraging this competitive space.

While cost is perceived as a big barrier, migration costs can be offset by the savings you’ll make by shutting down the data center or reducing capacity. Security is evolving from a big concern into a driver for cloud adoption: major cloud providers protect data by encrypting and segmenting it across several locations, and they hire the finest talent available because security is their core business.

The shift to the cloud won’t happen overnight, but make no mistake, it is happening. The big question then becomes – how do you plan for and make the transition successfully?

Creating a roadmap

Most organizations have huge data centers and colos with lots of entanglements based on decades of legacy and acquisitions. There’s no way to unwind all of this quickly. For the health of the business, disruption must be managed carefully, which means assessing existing infrastructure and business assets and developing a plan to move them incrementally. Look at how to start developing new apps in the cloud and divest yourself of legacy infrastructure step-by-step.

What is required to mitigate the risk and enable organizations to transition is some sort of connective tissue that bridges the gaps between your data centers, applications and legacy systems and your chosen cloud environments. Make sure you formulate a strategy that allows you to integrate and access your data without sacrificing control.

There are lots of data management pitfalls to watch out for as you migrate to the cloud, but with the right kind of cloud fabric, you can move things along at your own pace based on your available resources, budget, and business priorities.

Ultimately, leveraging the cloud isn’t about one data center hosting strategy vs. another. It’s about modernizing the infrastructure required to run the business, focusing on your core business instead of IT infrastructure and hardware management, and connecting your business with cloud services to differentiate and compete more effectively.

SoftNAS V4 Cuts Time and Complexities of Cloud Migration Projects “by Up to 90%”

SoftNAS delivers cloud storage cost savings by leveraging durable, low-cost cloud object storage. It cuts the time and complexities of cloud data migration projects by up to 90%, turning months into weeks and simplifying live data migrations to the cloud. It enables customers to connect any kind of data to the cloud anywhere in the world.

It addresses the impediments that block real-world cloud adoption, revolving around common cloud storage use cases: primary and secondary cloud file storage, workload and application cloud migration, hybrid cloud, and synthetic cloud backups in partnership with Veeam Software, Inc.

SoftNAS accelerates high-speed global data transfers up to twenty times faster with its patented UltraFast technology, enabling customers to connect remote offices, branch offices, factories and IoT at the edge with the cloud.

The company conducted customer interviews and identified 18 discrete cloud adoption barriers that SoftNAS 4 solves.

“Based upon our multi-cloud research we uncovered a number of challenges customers are struggling with to take advantage of public cloud infrastructure. SoftNAS 4 addresses many of the customer data management and control challenges head on with its unique combination of cloud NAS, bulk data transfer acceleration and data integration capabilities. SoftNAS is delivering a robust cloud data fabric companies can use as a strategy to more quickly adopt the cloud and save time and money in the process,” said Jeff Kato, senior analyst, The Taneja Group.

“Today we are delivering on our company mission and the vision to become the fabric for business data in the cloud. After three years of relentless company focus, strategic input from our cloud platform partners and customers, we have produced what I believe is the gold standard in cloud data control and management software,” said Rick Braddy, CEO/CTO and founder, SoftNAS. “When I founded SoftNAS six years ago as a disgruntled traditional storage customer, my goals were to reduce the high costs of storage, make it easier to connect applications with the cloud and keep IT in control of its destiny in the cloud. SoftNAS 4 is the realization of those goals and I invite customers and partners to start benefiting from it today,” Braddy said.

SoftNAS 4 features include:

  • UltraFast is the company’s patented technology that saves on time and costs by accelerating global bulk data movement up to twenty times faster than standard TCP/IP protocols at one tenth the cost of alternative bulk data transfer solutions. UltraFast accelerates transfers of data into, out of and across private, public, hybrid clouds and multi-clouds.
  • UltraFast overcomes up to 4,000ms of latency and up to 10% packet loss due to network congestion, connecting remote locations with the cloud over any kind of network conditions.
  • Lift and Shift with continuous sync enables live migration of production data and workloads and keeps content up-to-date when moving data to the cloud, between datacenters and/or distributing it to remote locations. Automatic restart and suspend/resume ensures bulk data transfer jobs run reliably while bandwidth schedules enable customers to open or throttle network usage. Lift and Shift works with UltraFast, so migrations can take place over any kind of network anywhere in the world.

  • FlexFiles integrates and transforms 24 types of data by leveraging Apache NiFi and a set of pre-built data integration processors. With FlexFiles, customers can tackle massively complex data integration projects combining file systems, Hadoop, Redshift, HTTP(S), Web Services, SQL/noSQL, XML, S3/Blob Objects and Custom Data Integrations.

  • FlexFiles makes it easy to connect a customer’s business with cloud and SaaS services through a drag-and-drop interface, with no programming required. It also leverages UltraFast, enabling customers to use FlexFiles and Apache NiFi over any network conditions.

  • ObjFast makes cloud object storage run at near-block-level performance while still taking advantage of object storage pricing, which results in up to 67% cost savings vs. block storage alone. It throttles data transfer to cloud object storage so it’s as fast as possible without exceeding AWS or Azure object storage R/W capabilities.

  • ObjFast’s patent-pending acceleration technology has been optimized, tested, and certified for Veeam Synthetic Full Backups, so Veeam cloud backup, copy jobs, and recovery run at near-block-level performance with the public cloud.

  • SmartTiers is a patent-pending automated storage tiering feature that moves aging data from more expensive, high-performance block storage to less expensive object storage according to customer-set policies, reducing public cloud storage costs by up to 67%. It is available for beta testing.
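
Tiering policies of this kind generally work by comparing data age against a customer-set threshold. Below is a toy sketch of the selection step only, not SoftNAS’s implementation; the path and upload helper are hypothetical:

```python
import time
from pathlib import Path

AGE_THRESHOLD_DAYS = 30  # customer-set policy: demote data older than this

def select_for_demotion(block_tier_root):
    """Find files whose last access predates the policy threshold."""
    cutoff = time.time() - AGE_THRESHOLD_DAYS * 86400
    return [p for p in Path(block_tier_root).rglob("*")
            if p.is_file() and p.stat().st_atime < cutoff]

# A tiering pass would then move each candidate from block storage
# to cheaper object storage, e.g.:
# for path in select_for_demotion("/mnt/block_tier"):
#     upload_to_object_storage(path)  # hypothetical helper
```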

SoftNAS 4 is available in three product editions that run on AWS, Microsoft Azure, and vSphere. Customers will be able to launch all of the company’s cloud products on demand, directly from the AWS and Azure Marketplaces, to spin up cloud data solutions in minutes with no prior purchasing approvals.

Visit StorageNewsletter.com.