6 Basic Security Concerns for SQL Databases – Part 2

In our first article, we looked at some of the basics of securing databases, examined several examples of major data breaches, and reviewed best practices companies use to prevent hackers from accessing sensitive information. In this article, we are going to look at some advanced database security issues and recommend steps you can take to address them.

With the explosion of big data and cloud computing in recent years, database security is more critical than ever. Sites like Facebook run massive, unstructured databases that manage millions of data points for user bases of more than one billion people. Clearly, these types of computing environments present serious security challenges.

Advanced Database Security Considerations

According to Imperva, a California-based cyber-security company, there are several major security threats to organizational databases. They include:

  • Excessive privileges or unused privileges

  • Abuse of privileges

  • Input injection (formerly SQL injection)

  • Malware

  • Poor audit trails

  • Exposure of critical storage media

  • Vulnerable and misconfigured databases

  • Insecure sensitive data

  • Distributed denial of service attacks (DDoS)

  • Low levels of security education and knowledge of proper procedures

Tackling Big Data Database Security

These challenges are magnified in advanced databases. Although similarities exist between traditional data security and big data security, the differences include:

  • The amount of data collected and analyzed in big data applications. The sheer variety and volume of big data increase exponentially the challenge of maintaining security. Data repositories are sprinkled across the enterprise, and every source has its own permission levels and security detail. Research, governance, compliance and other data may be in different data sets. The data transfer rates and workflows might be different for each data source. Each of these variables presents another potential attack point for hackers.

  • The technology used for both unstructured and structured big data. One of the major challenges of securing modern databases is that database tools such as Hadoop never had much security baked into them in the first place. By their very nature, they create vulnerabilities that are less prevalent in traditional databases.

  • How big data is stored. Picture a single database server environment in comparison to the distributed environment found in big data applications. By design, these databases can spread out across a number of data environments and server clusters in multiple locations. The distributed infrastructure increases the potential for attacks.

Recommended Security Controls

To meet these challenges, the SANS Institute, an organization focused on security research and education, has developed a list of recommended security controls that increase cyber-defense for advanced database configurations. They include:

  • Account monitoring. Eliminate any inactive accounts, require users to implement strong passwords, and establish maximums for failed login attempts. Close control of database access reduces the chance of a hacker doing damage from the inside (a minimal sketch follows this list).

  • Application security. Implement secure editions of open source software such as Apache Accumulo.

  • Inventory of devices. Monitor every hardware device on your network so that any unauthorized device can be quickly located and blocked from gaining access.

  • Inventory of software. Similar to device inventory, every application that accesses the network must be authorized. Block installation or execution of unauthorized and unapproved software.

  • Procedures and tools. Rather than building security guidelines from scratch each time you add an application or new piece of software, develop checklists, benchmarks, and guidelines that apply to every application. Two things that can help you get started are the Center for Internet Security Benchmarks Program and the NIST National Checklist Program.

  • Vulnerability assessment. On an ongoing basis, assess and evaluate new information to identify potential vulnerabilities in your database, and implement procedures to minimize damage. Remember that hackers are constantly on the attack and always trying to take advantage of new knowledge in the marketplace.

  • Protect browsers and email. Browsers and email software are popular access vectors for hackers to try to reach your system. Maintaining solid email and browser security minimizes the attacks on your database through these channels.
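To make the account-monitoring control concrete, here is a minimal T-SQL sketch; the login names are hypothetical, and the password policy itself (complexity, expiration, lockout after repeated failed attempts) is enforced by the host's Windows policy:

-- Disable a login that is no longer in use
ALTER LOGIN [retired_analyst] DISABLE;

-- Require new logins to follow the Windows password policy,
-- including lockout after repeated failed attempts
CREATE LOGIN [new_analyst]
WITH PASSWORD = 'CorrectHorse!Battery#42',
CHECK_POLICY = ON,
CHECK_EXPIRATION = ON;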

High-Profile Data Breaches

As powerful as modern technology is, sometimes it is hard to believe that computers that process millions of points of data on a daily basis can be crippled so easily. Yet, data breaches on a massive scale are regular items in the news. Here are some examples:

  • Ashley Madison. A group of hackers dubbed “The Impact Team” announced they would release Ashley Madison customer information if the site did not shut down operations. Ashley Madison facilitated extramarital affairs, and its owners apparently did not believe they were vulnerable, so they took no action against the threat. However, The Impact Team made good on its promise and, in July 2015, released more than 37 million customer records, including names and passwords. The result was devastating and ongoing, as many customers continue to deal with the fallout of their names being released.

  • Internal Revenue Service. Hackers compromised computer systems at the Internal Revenue Service and manipulated tax records for more than 300,000 taxpayers. Using stolen credentials, they garnered millions of dollars in bogus refunds. They were discovered only when the IRS noticed an inordinate number of requests for past tax returns.

  • CareFirst/BlueCross BlueShield. Health records are some of the most personal pieces of data stored in corporate databases. Yet the health industry continues to experience significant data breaches. In May 2015, CareFirst determined that hackers had gained access to more than one million members’ names, email addresses and birth dates. One good note: the thieves did not get members’ employment information, Social Security numbers or financial data, because the passwords protecting that data were encrypted.

  • Kaspersky Lab. Is it possible for a security vendor to experience a significant cyber attack? The answer is yes: in June 2015, Moscow-based security company Kaspersky Lab was infiltrated by hackers, who were able to compromise data on the company’s products that deal with fraud prevention and secure networks.

  • Harvard University. A July 2015 compromise of the security systems at Harvard University was the latest in a string of breaches at institutes of higher learning across the nation. Although experts are not sure what data the hackers gained, the infiltration resembled attacks at other institutions, such as a Penn State University breach in the spring. That strike affected the records of more than 18,000 people.

Working With Cloud Providers to Ensure Security

If you are working with cloud vendors, is your data safe? Database security concerns in the cloud are much the same as for on-premises databases: data breaches have been happening at an alarming pace in both environments. However, placing data in the cloud means it is not housed at your organization's own site, which adds another dimension of risk.

One of the sales points cloud providers extol is that they employ specialists who are experts in their fields and therefore have advanced knowledge your organization may not possess. That claim has some validity, but it is not true in every case.

In addition, involving more people means more of the human element, which remains a greater risk to data security than any other factor. Even though cloud computing presents an idyllic picture of data being secured somewhere “up there,” in truth it is located in a data center much like the one at your own site.

You should be asking cloud providers questions such as:

  • Where is our data stored?

  • Who manages it?

  • Is it always stored in the same place, or is it moved around to different countries?

  • Do any outside personnel have access to my information?

  • Do you encrypt my data, and if so, how do you do it?

  • Other than your firm, what other organizations have access to the encryption key?

Database security concerns for traditional databases and for the high-speed, high-volume distributed databases of big data and cloud computing are similar. However, the sheer size, speed and complexity of databases managing huge amounts of information mean a bigger attack surface, more points of vulnerability and greater physical-environment concerns.

Effective Security Implementation

The best practices, strategies and tactics for effective security implementation remain the same for both environments: Keep track of hardware devices on the system, closely monitor all applications on the network, come up with solid guidelines and benchmarks that you apply to every program, consistently evaluate potential vulnerabilities in your system and come up with a plan of remediation, and constantly encourage end users and company personnel to maintain good security habits.

In Summary

This wraps up the second article in our two-part database security series. In the first article, we looked at basic database security procedures that can be implemented by database administrators, especially those who may be new to the position. We recommended straightforward procedures like strengthening network security, limiting access to the server, cutting out unneeded applications, applying patches immediately, encrypting sensitive data and documenting baseline configurations.

In this article, we looked at the bigger picture of advanced database security by examining today’s world of cloud computing, big data, and unstructured databases. We discovered that, while the scope and size of these environments differ greatly from a localized, traditional database, the security concerns are the same. Implement these ideas, and you will have taken major steps toward preventing a critical data breach at your organization.

 

6 Basic Security Concerns for SQL Databases – Part 1

Consider these scenarios: A low-level IT systems engineer spills soda, which takes down a bank of servers; a warehouse fire burns all of the patient records of a well-regarded medical firm; a government division’s entire website vanishes without a trace.

Data breaches and failures are not isolated incidents. According to the 2014 Verizon Data Breach Investigations Report, databases are one of the most critical vulnerability points in corporate data assets. Databases are targeted because their information is so valuable, and many organizations are not taking the proper steps to ensure data protection.

  • Only 5 percent of billions of dollars allocated to security products is used for security in data centers, according to a report from International Data Corporation (IDC).

  • In a July 2011 survey of employees at organizations with multiple computers connected to the Internet, almost half said they had lost or deleted data by accident.

  • According to Fortune magazine, corporate CEOs are not making data security a priority, seemingly deciding that they will handle a data problem if it actually happens.

You might think CEOs would be more concerned, even if it is just for their own survival. A 2013 data breach at Target was widely considered an important contributing factor in the ouster of Gregg Steinhafel, then company president, CEO and chairman of the board. The Target breach affected more than 40 million debit and credit card accounts at the retailing giant. Stolen data included customers’ names, card numbers, security codes and expiration dates.

Although the threats to corporate database security have never been more sophisticated and organized, taking necessary steps and implementing accepted best practices will decrease the chances of a data breach, or other database security crisis, taking place at your organization.

6 Basic Security Concerns

If you are new to database administration, you may not be familiar with the basic steps you can take to improve database security. Here are the first moves you should make:

  1. The physical environment. One of the most often overlooked steps in increasing database security is locking down the physical environment. While most security threats are, in fact, at the network level, the physical environment presents opportunities for bad actors to compromise physical devices. Unhappy employees can abscond with company records, health information or credit data. To protect the physical environment, start by implementing and maintaining strict security measures that are detailed and updated on a regular basis. Severely limit access to physical devices to a short list of employees who must have access as part of their jobs. Strive to educate employees and systems technicians about maintaining good security habits while operating company laptops, hard drives and desktop computers. Lackadaisical security habits can make employees an easy target.

  2. Network security. Database administrators should assess weak points in the network and how company databases connect to it. Updated antivirus software running on the network is a fundamental requirement. Also, ensure that secure firewalls are implemented on every server. Consider changing TCP/IP ports from the defaults, as the standard ports are well-known access points for hackers and Trojan horses.

  3. Server environment. Information in a database can appear in other areas, such as log files, depending on the nature of the operating system and database application. Because the data can surface in different areas of the server environment, you should check that every folder and file on the system is protected. Limit access as much as possible, allowing only the people who absolutely need permission to get at that information. This applies to the physical machine as well. Do not provide users with elevated access when they only need lower-level permissions (see the first sketch after this list).

  4. Avoid over-deployment of features. Modern databases and related software include services designed to make the database faster, more efficient and more secure. At the same time, software companies are in a very competitive field, essentially a mini arms race to provide better functionality every year. The result is that you may have deployed more services and features than you will realistically use. Review each feature you have in place, and turn off any service that is not really needed (see the second sketch after this list). Doing so cuts down the number of areas or “fronts” where hackers can attack your database.

  5. Patch the system. Just like a personal computer operating system, databases must be updated on a continuing basis. Vendors constantly release patches, service packs and security updates, but these are only good if you implement them right away. Here is a cautionary tale: in 2003, a computer worm called SQL Slammer penetrated tens of thousands of servers within minutes of its release. The worm exploited a buffer-overflow vulnerability in Microsoft SQL Server and its Desktop Engine. A patch fixing the weakness had been released the previous summer, but many of the companies that became infected had never patched their servers.

  6. Encrypt sensitive data. Although back-end databases might seem more secure than components that interface with end users, their data must still be accessed through the network, which increases risk. Encryption cannot stop malicious hackers from attempting to access data, but it does provide another layer of security for sensitive information such as credit card numbers (see the third sketch after this list).
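First, a minimal sketch of the least-privilege access described in point 3, assuming a hypothetical reporting login and table:

-- Map the login to a database user and grant read-only access
CREATE USER [report_reader] FOR LOGIN [report_reader];
GRANT SELECT ON dbo.SalesSummary TO [report_reader];

-- Explicitly deny writes rather than relying on the absence of a grant
DENY INSERT, UPDATE, DELETE ON dbo.SalesSummary TO [report_reader];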
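Second, a sketch of turning off an unneeded service, as point 4 recommends; xp_cmdshell is a commonly cited example of a SQL Server feature to leave disabled unless there is a clear need for it:

-- Surface the advanced options, then make sure xp_cmdshell is off
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;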
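Third, a sketch of the column-level encryption described in point 6, using SQL Server's ENCRYPTBYPASSPHRASE as one simple option. The table, column and passphrase are hypothetical; a production system would normally use certificate- or key-based encryption rather than a passphrase hard-coded in a script:

-- Encrypt a card number on write (card_number_enc is a varbinary column)
INSERT INTO dbo.Payments (customer_id, card_number_enc)
VALUES (42, ENCRYPTBYPASSPHRASE('example-passphrase', '4111111111111111'));

-- Decrypt on read; returns NULL if the passphrase is wrong
SELECT customer_id,
       CONVERT(varchar(32), DECRYPTBYPASSPHRASE('example-passphrase', card_number_enc)) AS card_number
FROM dbo.Payments;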

Famous Data Breaches

Is all this overblown? Maybe stories of catastrophic database breaches are ghost stories, conjured up by senior IT managers to force implementation of inconvenient security procedures. Sadly, data breaches happen on a regular basis to small and large organizations alike. Here are some examples:

  • TJX Companies. In December 2006, TJX Companies, Inc. discovered that it had failed to protect its IT systems with a proper firewall. A group led by high-profile hacker Albert Gonzalez had gained access to more than 90 million credit card numbers. Gonzalez was convicted of the crime and sentenced to a lengthy prison term, and eleven other people were arrested in relation to the breach.

  • Department of Veterans Affairs. An unencrypted database containing names, dates of birth, types of disability and Social Security numbers of more than 26 million veterans was stolen from the Department of Veterans Affairs. Leaders in the organization estimated that it would cost between $100 million and $500 million to cover damages resulting from the theft. This is an excellent example of human error being the softest point in the security profile: an external hard drive and laptop were stolen from the home of an analyst who worked at the department. Although the theft was reported to local police promptly, the head of the department was not notified until two weeks later. He informed federal authorities right away, but the department did not make any public statement until several days had gone by. Incredibly, an unidentified person returned the stolen data in late June 2006.

  • Sony PlayStation Network. In April 2011, more than 75 million PlayStation Network accounts were compromised. The popular service was down for weeks, and industry experts estimate the company lost millions of dollars. It is still considered by many the worst breach of a multiplayer gaming network in history, and to this day the company says it has not determined who the attackers were. The hackers were able to get gamers’ names, email addresses, passwords, buying histories, addresses and credit card numbers. Because Sony is a technology company, the breach was even more surprising and concerning. Consumers began to wonder: if it could happen to Sony, was their data safe at other big companies?

  • Gawker Media. Hackers breached Gawker Media, parent company of the popular gossip site Gawker.com, in December 2010. The passwords and email addresses of more than one million users of Gawker Media properties like Gawker, Gizmodo, and Lifehacker, were compromised. The company made basic security mistakes, including storing passwords in a format hackers could easily crack.

Take These Steps

In summary, basic database security is not especially difficult but requires constant vigilance and consistent effort. Here is a snapshot review:

  • Secure the physical environment.

  • Strengthen network security.

  • Limit access to the server.

  • Cut back or eliminate unneeded features.

  • Apply patches and updates immediately.

  • Encrypt sensitive data such as credit cards, bank statements, and passwords.

  • Document baseline configurations, and ensure all database administrators follow the policies.

  • Encrypt all communications between the database and applications, especially Web-based programs.

  • Match internal patch cycles to vendor release patterns.

  • Make consistent backups of critical data, and protect the backup files with database encryption (a sketch follows this list).

  • Create an action plan to implement if data is lost or stolen. In the current computing environment, it is better to think in terms of when this could happen, not if it will happen.
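As one illustration of the backup recommendation above, here is a hedged T-SQL sketch of an encrypted backup. It assumes a database master key and a server certificate named BackupCert already exist; the database name and file path are hypothetical:

BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB.bak'
WITH COMPRESSION,
     ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = BackupCert);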

Basic database security seems logical and obvious. However, the repeated occurrences of major and minor data breaches in organizations of all sizes indicate that company leadership, IT personnel, and database administrators are not doing all they can to implement consistent database security principles.

The cost to do otherwise is too great. Increasingly, corporate America is turning to cloud-based enterprise software. Many of today’s popular applications like Facebook, Google and Amazon rely on advanced databases and high-level computer languages to handle millions of customers accessing their information at the same time. In our next article, we take a closer look at advanced database security methods that these companies and other forward-thinking organizations use to protect their data and prevent hackers, crackers, and thieves from making off with millions of dollars worth of information. 

 

5 Tricky SQL Database Performance Challenges

There’s a good chance you’ve come across various performance challenges while writing queries and working within SQL if you use it on a regular basis. You might be dealing with slow-moving or poorly-written queries, inefficient subqueries, indexes with tons of writes to them but little or no reads, or different CPU and memory issues. This article will explain how to find and address multiple SQL performance challenges.

Definition Bank

Let’s look at some definitions before we begin:

  • Wait States or Queue States: A period of waiting that a task incurs while queries execute or resources load. While SQL Server is executing one or more queries or pulling resources, a certain amount of time must be spent scanning storage and data and performing the calculation or task at hand. Two common wait types are latch waits, such as PAGEIOLATCH_EX, which occurs when a task is waiting on a latch for a buffer that is in an I/O request, and CXPACKET waits, a typical problem related to high server CPU usage caused by poorly-written parallel queries (queries designed to run concurrently). A third common wait type is the WRITELOG wait, which occurs while a SQL session writes the contents of the log cache to the disk where the log is stored. (A query for surveying waits follows these definitions.)

  • Locking: In SQL, there are lock resources and lock modes. Lock resources refer to the places where SQL can place locks, and lock modes refer to the locks that can be placed on resources so they can be accessed by concurrent tasks and transactions. There are several resources where locks can be placed, such as a row in a table or a lock on each row within an index. There are also several lock mode types, such as shared locks and exclusive locks. Some locks are completely fine, but others can be detrimental to performance.

  • Disk and Network I/O: SQL data and transactions funneling in and out of the disk, cache or through the network. The more there are, the worse the performance can be. However, fine-tuning your queries and indexing can significantly reduce the input and output on the physical and logical disks and network.

  • Contention: A term typically related to locking. Locking in SQL helps ensure consistency when performing read or write tasks in the database, but contention over locks can occur, for example, when processes try to perform updates concurrently on the same page.

  • High CPU Usage: High server CPU usage as it relates to SQL is directly connected to the SQL processes being run, poor query execution, system tasks and excessive compilation and recompilation of queries. The CPU can also be strained if there are bad indexes in place.
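To see which wait types dominate on a given instance, a query along these lines against the sys.dm_os_wait_stats view is a reasonable starting point; note that the counters are cumulative since the last server restart:

-- Top waits by accumulated wait time
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;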

1. Poorly-Written SQL

Because SQL is declarative, you can write the same query in many ways to receive the same results. These differences in how queries are written can affect performance negatively. There are two ways you can rewrite queries to improve performance:

  • Rewrite SQL to minimize the query footprint and make queries run faster.

  • Rewrite SQL to remove the need for subqueries.

Rewriting Queries

Let’s look at an example of rewriting a query to improve performance. Say you want to find all ranges that overlap a given interval. You can do this in a few different ways, but let’s consider these two:

SELECT *
FROM range
WHERE end_time >= @start
  AND start_time <= @end

However, this query would obtain the same results, only faster:

SELECT *
FROM range
WHERE (start_time > @start AND start_time <= @end)
   OR (@start BETWEEN start_time AND end_time)

The latter is typically faster because each branch of the OR places explicit bounds on start_time, giving the optimizer sargable predicates it can satisfy with an index seek on start_time rather than a scan of the whole table.
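As a hedged illustration, an index along these lines is what lets the optimizer seek on start_time; the index name is hypothetical, and the right key and included columns depend on your actual workload:

-- Supports seeks on start_time while covering end_time
CREATE INDEX IX_range_start_time
ON range (start_time)
INCLUDE (end_time);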

Removing Subqueries

A subquery is a query nested inside another query and dependent on it. Subqueries come with inherent performance problems: they can be slower, and constructs such as NOT IN behave unexpectedly when the inner query returns NULL values. However, subqueries can often be rewritten as joins to avoid such problems. Below is an example of a subquery rewritten as a join:

SELECT * FROM employeeTable WHERE id NOT IN (SELECT id FROM employeeTable2);

SELECT * FROM employeeTable WHERE NOT EXISTS (SELECT id FROM employeeTable2 WHERE employeeTable.id=employeeTable2.id);

Can be turned into:

SELECT employeeTable.*
FROM employeeTable LEFT JOIN employeeTable2
  ON employeeTable.id = employeeTable2.id
WHERE employeeTable2.id IS NULL;

The latter achieves the same results without a subquery, often improving performance and sidestepping the NULL pitfalls of NOT IN.

2. Bad Indexes

If the number of write operations against a table and its indexes greatly exceeds the number of reads, there is a good chance the underlying indexes are harming overall performance. Every time a write touches an indexed SQL column, a corresponding modification must be made to that column's indexes. If most of the activity is write activity, consider removing or altering the indexes involved; doing so would likely increase performance by reducing the overhead each write incurs. Find bad indexes by using Dynamic Management Views to analyze query execution statistics, then consider dropping the indexes that have many writes but zero or few reads.
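A query along these lines against sys.dm_db_index_usage_stats is one way to surface write-heavy, rarely-read indexes; the statistics reset when the instance restarts, so sample over a representative period:

-- Indexes with many writes and few or no reads are drop candidates
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name AS index_name,
       s.user_updates AS writes,
       s.user_seeks + s.user_scans + s.user_lookups AS reads
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
ORDER BY s.user_updates DESC;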

3. Locking Contention

Locking contention can occur when processes try to perform lock updates concurrently on the same lock resources. Furthermore, when two or more tasks block one another because each holds a lock on a resource the other is attempting to lock, a deadlock occurs, because neither task can proceed.

On a busy server, this plays out as multiple requests from multiple users, which increases the likelihood of conflicts over time because numerous processes can request access to the same database resources concurrently.

Here are some tips for detecting contention and deadlocks:

  1. Use a database information tool for detection. According to Microsoft, “…the Database Engine provides monitoring tools … trace flags, and the deadlock graph event in SQL Server Profiler.” (See the trace-flag sketch after these tips.)

  2. Ensure that all database resources are accessed in the same order every time.

  3. Make any required updates to the data before beginning a task or transaction, rather than mid-transaction.

  4. Don’t allow user interaction with the data while tasks and transactions are running.

  5. Limit or avoid the use of cursors when coding.

  6. Keep your transactions small.
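For tip 1, one concrete option is trace flag 1222, which writes detailed deadlock information to the SQL Server error log whenever a deadlock occurs; enabling it instance-wide looks like this:

-- Report deadlock details in the error log for the whole instance
DBCC TRACEON (1222, -1);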

4. Memory Problems

As a general rule, SQL servers are memory hogs, even when there is a good deal of memory installed in the server. To improve performance and reduce disk I/O, SQL Server automatically stores data in memory caches. Once data is read from the drive, it stays cached, since it will not change unless SQL Server updates it; if the server has enough memory to cache the entire database, SQL Server will cache the whole thing. SQL Server must also compensate for things like redundant code, NULL values, and queries with too many writes and not enough reads, all of which lead to more memory being used and a lull in performance.

Here is what you can do to check and manage memory usage:

  1. Open SQL Server Management Studio and connect to the correct server.

  2. Choose Windows Authentication from the Authentication menu.

  3. Choose View and then Object Explorer, then right-click on the name of the server and choose Properties. Click Memory on the left, and you will see different options, including “Use AWE to allocate memory,” “Minimum Server Memory” and “Maximum Server Memory.”

Check “Use AWE to allocate memory” only on a 32-bit server with more than 4GB of memory; leave it unchecked on a 64-bit server, where it has no effect. Check that the Minimum Server Memory is set to zero. Don’t change that number; if it has been changed, do some research to figure out why, because it is not recommended to change it or to set the minimum and maximum to the same amount. If you need to reduce the Maximum Server Memory, reduce the minimum as well. If you need to increase the maximum, keep the value of the minimum the same.

The last option, Maximum Server Memory, will have a default value. If you are trying to reduce the memory that SQL Server can use, you can set it to anything you want, but keep in mind that the lower you set it, the more performance may suffer. Administrators sometimes set this to a lower number, say 6GB, to free up memory for the OS to operate smoothly. You may want to test performance with different minimums and maximums to see what works best for your particular setup.
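The same setting can be changed in T-SQL instead of the SSMS dialog. A minimal sketch, where 6144 MB is just an illustrative cap:

-- Cap SQL Server's buffer pool at 6GB (value is illustrative)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;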

5. High CPU Usage

High CPU usage can occur for a number of reasons, but the following commonly cause it:

  • Poorly-written queries

  • High compilations and recompilations

  • Heavy usage of temporary tables

  • System threads causing high spikes in CPU

Because we’ve already gone over poorly-written queries and touched upon high compilations and recompilations, let’s look at the other possible issues:

  1. Heavy usage of temporary tables: As your queries become more complex, temporary tables tend to be used more. Reduce reliance on temporary tables where possible to boost CPU performance.

  2. System threads: Check for system thread issues by running SELECT * FROM sys.sysprocesses WHERE spid < 51 (session IDs below 51 are reserved for system processes).

Resource Monitor can also use high amounts of CPU power when virtual memory runs low. Specific troubleshooting information can be found on Microsoft’s support website.