<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: andreasneuman</title>
    <description>The latest articles on DEV Community by andreasneuman (@andreasneuman).</description>
    <link>https://dev.to/andreasneuman</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F234651%2F4795b706-a9f8-44b5-a397-cc671a48e772.jpg</url>
      <title>DEV Community: andreasneuman</title>
      <link>https://dev.to/andreasneuman</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/andreasneuman"/>
    <language>en</language>
    <item>
      <title>Introduction to Temporary Tables in SQL Server</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Wed, 26 May 2021 14:51:40 +0000</pubDate>
      <link>https://dev.to/andreasneuman/introduction-to-temporary-tables-in-sql-server-2fnf</link>
      <guid>https://dev.to/andreasneuman/introduction-to-temporary-tables-in-sql-server-2fnf</guid>
      <description>&lt;p&gt;A temporary table in SQL Server, as the name suggests, is a database table that exists temporarily on the database server. A temporary table stores a subset of data from a normal table for a certain period of time.&lt;/p&gt;

&lt;p&gt;Temporary tables are particularly useful when you have a large number of records in a table and you repeatedly need to interact with a small subset of those records. In such cases, instead of filtering the data again and again to fetch the subset, you can filter the data once and store it in a temporary table. You can then execute your queries against that temporary table. Temporary tables are stored inside “tempdb”, which is a system database. Let’s take a look at how you can use a temporary table in a simple scenario.&lt;/p&gt;
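
&lt;p&gt;As a rough sketch (the table and column names below are hypothetical), a local temporary table is created with a leading # in its name and filled from a regular table:&lt;/p&gt;

&lt;p&gt;-- Create and fill a local temporary table (note the # prefix)&lt;br&gt;
SELECT OrderID, CustomerID, Total&lt;br&gt;
INTO #RecentOrders&lt;br&gt;
FROM dbo.Orders&lt;br&gt;
WHERE OrderDate &amp;gt;= '2021-01-01';&lt;br&gt;&lt;br&gt;
-- Query the stored subset as many times as needed&lt;br&gt;
SELECT CustomerID, SUM(Total) AS Total&lt;br&gt;
FROM #RecentOrders&lt;br&gt;
GROUP BY CustomerID;&lt;br&gt;&lt;br&gt;
-- The table lives in tempdb and is dropped automatically when the session ends,&lt;br&gt;
-- but it can also be dropped explicitly&lt;br&gt;
DROP TABLE #RecentOrders;&lt;/p&gt;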

&lt;p&gt;More at &lt;a href="https://codingsight.com/introduction-to-temporary-tables-in-sql-server/"&gt;https://codingsight.com/introduction-to-temporary-tables-in-sql-server/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>sql</category>
      <category>database</category>
      <category>sqlserver</category>
      <category>devops</category>
    </item>
    <item>
      <title>New in Delphi DAC: Support for RAD Studio 10.4.2 and Multiple Performance Improvements</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Thu, 25 Mar 2021 18:39:12 +0000</pubDate>
      <link>https://dev.to/andreasneuman/new-in-delphi-dac-support-for-rad-studio-10-4-2-and-multiple-performance-improvements-1e22</link>
      <guid>https://dev.to/andreasneuman/new-in-delphi-dac-support-for-rad-studio-10-4-2-and-multiple-performance-improvements-1e22</guid>
      <description>&lt;p&gt;Following the release of RAD Studio 10.4.2 Sydney from last week, we are excited to announce support for the &lt;a href="https://blog.devart.com/delphi-dac-support-for-rad-studio-10-4-2.html"&gt;new versions of Delphi&lt;/a&gt; and C++ Builder IDEs in our data access components. DAC products are now also compatible with macOS Big Sur, iOS 14, and Android 11. Additionally, PostgreSQL 13 was supported in PgDAC.&lt;/p&gt;

&lt;p&gt;Data access speed with default settings was significantly increased in LiteDAC and the SQLite provider. The performance of batch operations was improved in all &lt;a href="https://www.devart.com/dac.html"&gt;DAC products&lt;/a&gt;. During batch processing, SQL statements are grouped into a single unit of work, known as a batch, and submitted to the database server in a single call, thereby reducing network overhead. We also reduced memory consumption in batch operations for InterBase and Firebird.&lt;/p&gt;

&lt;p&gt;The LOB read/write speed was improved for Oracle, SQL Server, DBF files, and ODBC drivers. The data fetch speed was also improved for Oracle and ODBC drivers. The PrefetchRows property, which allows you to set the number of rows to be prefetched during query execution, is now supported in the Direct mode (previously it was available only in the OCI mode). The FindFirst, FindNext, FindLast, and FindPrior methods, which search for records in a dataset using filters, now work much faster in all DACs.&lt;/p&gt;

&lt;p&gt;IBDAC now supports the Over-the-Wire (OTW) encryption feature of InterBase, allowing you to secure your data during transmission with SSL/TLS encryption. Also, automatic detection of computed fields when generating update statements was improved in IBDAC.&lt;/p&gt;

&lt;p&gt;In PgDAC, a single TPgConnection object can now be used in multiple threads, and a new property called MultipleConnections allows or denies creation of additional internal connections.&lt;/p&gt;

&lt;p&gt;Three new properties were added in LiteDAC: JournalMode, LockingMode, and Synchronous. The Unicode standard was supported in the TVirtualQuery component. The LastWarningCount property and the OnWarning event were added in MyDAC to get the number of warnings received from the MySQL server and define an event handler method.&lt;/p&gt;

&lt;p&gt;Devart Delphi Data Access Components are suites of components that provide direct access to common databases and cloud services and allow you to create multi-platform database applications in Embarcadero RAD Studio, Delphi, C++Builder, Lazarus, and Free Pascal for Windows, Linux, macOS, iOS, and Android, both 32-bit and 64-bit.&lt;/p&gt;

&lt;p&gt;To try out these features, you are welcome to download new versions of Devart data access components:&lt;br&gt;
&lt;a href="https://www.devart.com/unidac/"&gt;UniDAC 8.4&lt;/a&gt;&lt;a href="https://www.devart.com/unidac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/odac/download.html"&gt;ODAC 11.4&lt;/a&gt;&lt;a href="https://www.devart.com/odac/download.html"&gt;[ Download ]&lt;/a&gt; &lt;br&gt;
&lt;a href="https://www.devart.com/sdac/"&gt;SDAC 9.4&lt;/a&gt;&lt;a href="https://www.devart.com/sdac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/mydac/"&gt;MyDAC 10.4&lt;/a&gt;&lt;a href="https://www.devart.com/mydac/revision_history.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/ibdac/"&gt;IBDAC 7.4&lt;/a&gt;&lt;a href="https://www.devart.com/ibdac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/pgdac/"&gt;PgDAC 6.4&lt;/a&gt;&lt;a href="https://www.devart.com/pgdac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/litedac/"&gt;LiteDAC 4.4&lt;/a&gt;&lt;a href="https://www.devart.com/litedac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.devart.com/virtualdac/"&gt;VirtualDAC 11.4&lt;/a&gt;&lt;a href="https://www.devart.com/virtualdac/download.html"&gt;[ Download ]&lt;/a&gt;&lt;/p&gt;

</description>
      <category>delphi</category>
      <category>mysql</category>
      <category>sqlserver</category>
    </item>
    <item>
      <title>SQL Server DELETE – Removing One or More Rows from a Table</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Tue, 23 Feb 2021 16:03:25 +0000</pubDate>
      <link>https://dev.to/andreasneuman/sql-server-delete-removing-one-or-more-rows-from-a-table-44i9</link>
      <guid>https://dev.to/andreasneuman/sql-server-delete-removing-one-or-more-rows-from-a-table-44i9</guid>
      <description>&lt;p&gt;The right application of the DELETE statement for data removal is crucial, and it involves lots of issues. Still, there are standard practices of the DELETE statement usage that simplify all such tasks.&lt;/p&gt;

&lt;p&gt;This article will explore some real-life scenarios to equip you with the most helpful tips for using the DELETE statement correctly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;About T-SQL Delete&lt;/strong&gt;&lt;br&gt;
First of all, we should get familiar with the T-SQL Delete statement in the simplest manner possible.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is the T-SQL Delete Statement?&lt;/strong&gt;&lt;br&gt;
The Delete statement, as the name indicates, is a statement that helps us to delete data from the database table.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is a Table?&lt;/strong&gt;&lt;br&gt;
A table is a structure that you create in a database to store your data. For example, we can have a table of books to store records related to those books.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is a Database?&lt;/strong&gt;&lt;br&gt;
A database is an organized collection of data and the data structures to hold that data. In other words, data can be stored inside the database in the form of tables.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What kinds of data can be deleted?&lt;/strong&gt;&lt;br&gt;
We can delete one or more records (commonly known as rows) from a table using the delete statement.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9sjvnJEX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sfn2g5nxw1zr1w48yuc8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9sjvnJEX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sfn2g5nxw1zr1w48yuc8.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Simple Definition&lt;/strong&gt;&lt;br&gt;
The Delete statement removes some or all data (rows) from a table.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Microsoft Definition&lt;/strong&gt;&lt;br&gt;
According to Microsoft documentation, the Delete statement removes one or more rows from a table or view in SQL Server.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deciding whether to remove some or all rows&lt;/strong&gt;&lt;br&gt;
One might wonder how the statement defines whether to remove some or all of the data (rows) from a table. The answer lies in the criteria or conditions specifying what needs to be removed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Simple Syntax&lt;/strong&gt;&lt;br&gt;
The simplest syntax of the statement is as follows:&lt;br&gt;
DELETE FROM &amp;lt;table_name&amp;gt; WHERE &amp;lt;condition&amp;gt;&lt;/p&gt;

&lt;p&gt;You need to provide the table name and the criteria/condition for the data (rows) deletion from the table.&lt;/p&gt;

&lt;p&gt;Note: It is crucial to use the DELETE statement with a condition (WHERE clause), even though the condition is not syntactically required.&lt;/p&gt;

&lt;p&gt;If you execute the DELETE command without a WHERE condition, you will end up deleting all the rows (data) from the table. Hence, make it a habit to use a WHERE condition unless you really intend to delete all rows.&lt;/p&gt;
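
&lt;p&gt;As a minimal illustration (using the Book table that this article sets up below), the first statement removes only the matching rows, while the second removes every row in the table:&lt;/p&gt;

&lt;p&gt;-- Removes only the out-of-stock books&lt;br&gt;
DELETE FROM dbo.Book WHERE Stock = 0;&lt;br&gt;&lt;br&gt;
-- No WHERE clause: removes ALL rows from the table&lt;br&gt;
DELETE FROM dbo.Book;&lt;/p&gt;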

&lt;p&gt;&lt;strong&gt;Compatibility&lt;/strong&gt;&lt;br&gt;
This statement is compatible with many versions of SQL Server, including the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;SQL Server 2012 and later versions.&lt;/li&gt;
&lt;li&gt;Cloud-based SQL Server Database (Azure SQL Database).&lt;/li&gt;
&lt;li&gt;Cloud-based SQL Data Warehouse (Azure Synapse Analytics).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Simple Deletion Scenarios&lt;/strong&gt;&lt;br&gt;
Now we are going to explore the Delete statement usage with several practical scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Summary of Steps&lt;/strong&gt;&lt;br&gt;
- Set up a sample database.&lt;br&gt;
- View the data.&lt;br&gt;
- Delete the data.&lt;br&gt;
- Insert more data back into the table.&lt;br&gt;
- View the data before deletion.&lt;br&gt;
- Delete the data based on a condition.&lt;br&gt;
- View the data after deletion.&lt;br&gt;
- Insert more data back into the table.&lt;br&gt;
- View the data before deletion.&lt;br&gt;
- Delete the data based on another condition.&lt;br&gt;
- View the data after deletion.&lt;br&gt;
- Insert the data back into the table.&lt;br&gt;
- Delete the data based on two conditions this time.&lt;br&gt;
- View the data after deletion.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set Up the Sample Database (BooksSample)&lt;/strong&gt;&lt;br&gt;
We need a sample database to test and run the scripts. Setting it up covers the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a sample database.&lt;/li&gt;
&lt;li&gt;Create a table in the sample database.&lt;/li&gt;
&lt;li&gt;Insert the data (two rows) into the database table.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Open SQL Server Management Studio or &lt;a href="https://www.devart.com/dbforge/sql/studio/"&gt;dbForge Studio for SQL Server&lt;/a&gt; and execute the following script to set up the sample database:&lt;/p&gt;

&lt;p&gt;-- Connect to the 'master' database to run this snippet&lt;br&gt;
USE master&lt;br&gt;
GO&lt;/p&gt;

&lt;p&gt;-- Create a new database if it does not already exist&lt;br&gt;
IF NOT EXISTS (&lt;br&gt;
    SELECT [name]&lt;br&gt;
FROM sys.databases&lt;br&gt;
WHERE [name] = N'BooksSample'&lt;br&gt;
)&lt;br&gt;
CREATE DATABASE BooksSample&lt;br&gt;
GO&lt;/p&gt;

&lt;p&gt;USE BooksSample&lt;/p&gt;

&lt;p&gt;-- Create the table Book&lt;br&gt;
CREATE TABLE [dbo].[Book]&lt;br&gt;
(&lt;br&gt;
    [BookNumber] INT NOT NULL PRIMARY KEY, -- Primary key column&lt;br&gt;
    [Title] VARCHAR(150) NOT NULL,&lt;br&gt;
    [Stock] SMALLINT NOT NULL&lt;br&gt;
);&lt;br&gt;
GO&lt;/p&gt;

&lt;p&gt;-- Insert rows into table 'Book' &lt;br&gt;
INSERT INTO [dbo].[Book]&lt;br&gt;
( -- Columns to insert data into&lt;br&gt;
    [BookNumber], [Title], [Stock]&lt;br&gt;
)&lt;br&gt;
VALUES&lt;br&gt;
    ( -- First row: values for the columns in the list above&lt;br&gt;
        1, 'Learn SQL in 7 Days', 50&lt;br&gt;
),&lt;br&gt;
    ( -- Second row: values for the columns in the list above&lt;br&gt;
        2, 'Creating Databases in Minutes', 50&lt;br&gt;
);&lt;/p&gt;

&lt;p&gt;GO&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data (Book table)&lt;/strong&gt;&lt;br&gt;
Let us view the recently created and populated table in the database. Run the below script:&lt;/p&gt;

&lt;p&gt;-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The output is:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2DM4VS_w--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e1q426vj00u30d9sjxd2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2DM4VS_w--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e1q426vj00u30d9sjxd2.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
We can see the two rows of the table (Book). At the moment, these are all the data present in this table.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Delete the data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As we remember, there is a risk of deleting all rows in a table if we forget to mention the condition/criteria for the correct deletion.&lt;/p&gt;

&lt;p&gt;Always use the WHERE Clause with the Delete statement to avoid accidental data loss. The only exception should be the case when you need to delete all data intentionally.&lt;/p&gt;

&lt;p&gt;To delete all data (rows) from the sample database table, run the following script:&lt;/p&gt;

&lt;p&gt;-- Delete all the data (rows) from the table Book&lt;br&gt;
DELETE FROM dbo.Book&lt;/p&gt;

&lt;p&gt;The output is:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--68NMlK1V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uuqj2tbiqhybz5nb3c09.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--68NMlK1V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uuqj2tbiqhybz5nb3c09.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;View the data after the deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now, we need to check if all the rows have been deleted:&lt;br&gt;
-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The results are:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KUV9KN5X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3caewmblwmf8j1tfbfpu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KUV9KN5X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3caewmblwmf8j1tfbfpu.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This way, we have successfully deleted all the rows from the Book table. For that, we applied the DELETE statement without any deletion criteria/conditions.&lt;br&gt;
&lt;strong&gt;Insert data back into the table (with the same title)&lt;/strong&gt;&lt;br&gt;
We can insert the data (rows) back into the table and then apply the DELETE statement based on some conditions/criteria.&lt;/p&gt;

&lt;p&gt;This time, we choose to insert more rows, but deliberately having the same title:&lt;br&gt;
-- Insert rows into table 'Book' &lt;br&gt;
INSERT INTO [dbo].[Book]&lt;br&gt;
( -- Columns to insert data into&lt;br&gt;
    [BookNumber], [Title], [Stock]&lt;br&gt;
)&lt;br&gt;
VALUES&lt;br&gt;
    ( -- First row: values for the columns in the list above&lt;br&gt;
        1, 'Learn SQL in 7 Days', 50&lt;br&gt;
),&lt;br&gt;
    ( -- Second row: values for the columns in the list above&lt;br&gt;
        2, 'Creating Databases in Minutes', 50&lt;br&gt;
),&lt;br&gt;
    ( -- Third row: values for the columns in the list above&lt;br&gt;
        3, 'Creating Databases in Minutes', 50&lt;br&gt;
),&lt;br&gt;
    ( -- Fourth row: values for the columns in the list above&lt;br&gt;
        4, 'Creating Databases in Minutes', 50&lt;br&gt;
);&lt;/p&gt;

&lt;p&gt;GO&lt;/p&gt;

&lt;p&gt;The output is as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--krSX3kBH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hq0oh17w48k2pu2u34fr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--krSX3kBH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hq0oh17w48k2pu2u34fr.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note: To restore deleted or changed data, you can use dedicated software solutions. &lt;a href="https://www.devart.com/dbforge/sql/transaction-log/"&gt;dbForge Transaction Log&lt;/a&gt; allows you both to restore that data and to see who deleted or altered it, and when.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data before deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To view the data, execute the following script:&lt;br&gt;
-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The output is:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mnEiOMQG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lmbmedzb0wl8bgsqz42q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mnEiOMQG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lmbmedzb0wl8bgsqz42q.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
The output shows that we have mistakenly inserted three rows with the same book title. The simple solution is to delete the unnecessary rows by applying a condition that targets the duplicates.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Delete the data based on a condition (BookNumber)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Important: We may consider one of the following ways to solve this problem:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Delete by BookNumber&lt;/li&gt;
&lt;li&gt;Delete by Title&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this scenario, we choose not to delete by Title: if we delete by Title, we end up deleting all rows containing that title, including the one we need to keep. Hence, the recommended approach is to delete rows based on the BookNumber column.&lt;/p&gt;

&lt;p&gt;If we look at the result set, we can easily understand that BookNumber: 3 and BookNumber: 4 are duplicate rows. We must delete them to keep the database consistent.&lt;/p&gt;

&lt;p&gt;Again, the following options arise:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Delete where the book number (BookNumber) is greater than 2.&lt;/li&gt;
&lt;li&gt;Delete where BookNumber is 3 or 4.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let us choose the first option. However, remember that it is only valid if there are no rows after the duplicates we are aware of, that is, no other rows with BookNumber greater than 2.&lt;/p&gt;

&lt;p&gt;Execute the following script:&lt;br&gt;
-- Delete all the data (rows) from the table Book where BookNumber is greater than 2&lt;br&gt;
DELETE FROM dbo.Book&lt;br&gt;
WHERE BookNumber&amp;gt;2&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data after the deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s check the table after deleting the data:&lt;/p&gt;

&lt;p&gt;-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The output is:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w7z31YRB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gc5qbts5yn8kgf25m31f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w7z31YRB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gc5qbts5yn8kgf25m31f.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Insert more data into the table (more stock)&lt;/strong&gt;&lt;br&gt;
To add more rows with stock data, we use the script below:&lt;/p&gt;

&lt;p&gt;-- Insert rows into table 'Book' &lt;br&gt;
INSERT INTO [dbo].[Book]&lt;br&gt;
( -- Columns to insert data into&lt;br&gt;
    [BookNumber], [Title], [Stock]&lt;br&gt;
)&lt;br&gt;
VALUES&lt;br&gt;
    ( -- First row: values for the columns in the list above&lt;br&gt;
        3, 'Basic Data Structures', 60&lt;br&gt;
),&lt;br&gt;
    ( -- Second row: values for the columns in the list above&lt;br&gt;
        4, 'Advanced Data Structures', 0&lt;br&gt;
)&lt;br&gt;
GO&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View data before deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Have a look at the table:&lt;br&gt;
-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The output is:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AFcClSic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3rq1oxfs1ep278urxsf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AFcClSic--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3rq1oxfs1ep278urxsf2.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Delete the data based on a different condition&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Assume that we need to delete out-of-stock books to keep the information in the database accurate. To do this, we should look for the rows where Stock is 0.&lt;/p&gt;

&lt;p&gt;We can use the DELETE statement with the condition based on the Stock column value 0:&lt;/p&gt;

&lt;p&gt;-- Delete all the out of stock (Where Stock is 0) books (rows) from the table Book &lt;br&gt;
DELETE FROM dbo.Book&lt;br&gt;
WHERE Stock=0&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data after the deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p8nBHoax--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r9zv18dv5pb8u2kpwtfg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p8nBHoax--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r9zv18dv5pb8u2kpwtfg.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Insert more data into the table (more titles and stock)&lt;/strong&gt;&lt;br&gt;
We are adding two more rows to the table:&lt;/p&gt;

&lt;p&gt;-- Insert rows into table 'Book' &lt;br&gt;
INSERT INTO [dbo].[Book]&lt;br&gt;
( -- Columns to insert data into&lt;br&gt;
    [BookNumber], [Title], [Stock]&lt;br&gt;
)&lt;br&gt;
VALUES&lt;br&gt;
    ( -- First row: values for the columns in the list above&lt;br&gt;
        4, 'Learn Azure SQL Database in 10 Days', 0&lt;br&gt;
),&lt;br&gt;
    ( -- Second row: values for the columns in the list above&lt;br&gt;
        5, 'Azure SQL Database Concepts', 1&lt;br&gt;
)&lt;br&gt;
GO&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data before deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Check the rows before deleting anything further, based on the requirement:&lt;/p&gt;

&lt;p&gt;-- View data (rows) from the table (Book)&lt;br&gt;
SELECT b.BookNumber, b.Title, b.Stock FROM dbo.Book b&lt;/p&gt;

&lt;p&gt;The table output is:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lIiG-tTP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/buu1cyzqgn92bwd1l42a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lIiG-tTP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/buu1cyzqgn92bwd1l42a.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Delete the data based on two conditions&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This time, we need to delete all books (rows) whose titles contain the word SQL and that are out of stock (their Stock value is 0).&lt;/p&gt;

&lt;p&gt;In other words, we delete all the SQL-related out of stock books.&lt;/p&gt;

&lt;p&gt;In this case, we must specify more than one condition in the DELETE statement. We have to ensure we are deleting only the out-of-stock books that have the word SQL in their title.&lt;/p&gt;

&lt;p&gt;View the following script:&lt;br&gt;
-- Delete all the out of stock (Where Stock is 0) SQL related (Title contains SQL) books (rows) from the table Book &lt;br&gt;
DELETE FROM dbo.Book&lt;br&gt;
WHERE Title LIKE '%SQL%' AND Stock=0&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View the data after the deletion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We view the data for the final time:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6Idu3WAe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w23jp10eaah66t8vwvka.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6Idu3WAe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w23jp10eaah66t8vwvka.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
Important Tip: Before you delete data, run a SELECT statement with the same condition you are going to use for the deletion. This way, you ensure that the deletion applies to the correct data.&lt;/p&gt;

&lt;p&gt;For example, run the SELECT query first to ensure that you are only getting those rows that are meant for deletion:&lt;/p&gt;

&lt;p&gt;SELECT * FROM dbo.Book&lt;br&gt;
WHERE Title LIKE '%SQL%' AND Stock=0&lt;/p&gt;

&lt;p&gt;Once you are confident, you can turn your SELECT into the DELETE statement:&lt;/p&gt;

&lt;p&gt;DELETE FROM dbo.Book&lt;br&gt;
WHERE Title LIKE '%SQL%' AND Stock=0&lt;/p&gt;

&lt;p&gt;Congratulations!&lt;/p&gt;

&lt;p&gt;You have successfully removed (deleted) one or more rows from a table according to specific requirements.&lt;/p&gt;

&lt;p&gt;Stay in touch for advanced deletion scenarios and other professional tips regarding the DELETE statement usage.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Things to do&lt;/strong&gt;&lt;br&gt;
Now that you can remove rows from a table successfully, you can train and improve your skills further:&lt;/p&gt;

&lt;p&gt;- Try to delete rows where there is only one item in stock.&lt;br&gt;
- Delete all the books where the title contains the word Structure.&lt;br&gt;
- Delete all the books (rows) except for book number (BookNumber) 1.&lt;/p&gt;

&lt;p&gt;Originally posted at &lt;a href="https://codingsight.com/sql-server-delete-removing-one-or-more-rows-from-table/"&gt;https://codingsight.com/sql-server-delete-removing-one-or-more-rows-from-table/&lt;/a&gt;&lt;br&gt;
Author: Haroon Ashraf&lt;/p&gt;

</description>
      <category>sql</category>
      <category>database</category>
      <category>devops</category>
      <category>mysql</category>
    </item>
    <item>
      <title>Compare and Sync PostgreSQL &amp; Amazon Redshift Databases</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Wed, 09 Sep 2020 13:52:05 +0000</pubDate>
      <link>https://dev.to/andreasneuman/compare-and-sync-postgresql-amazon-redshift-databases-1okl</link>
      <guid>https://dev.to/andreasneuman/compare-and-sync-postgresql-amazon-redshift-databases-1okl</guid>
      <description>&lt;p&gt;dbForge  &lt;a href="https://www.devart.com/dbforge/postgresql/schemacompare/"&gt;Schema Compare for Redshift/PostgreSQL&lt;/a&gt;  is a tool for easy and effective comparison and deployment of Redshift database structure differences. The tool is best suited when you need to compare Amazon Redshift databases quickly and efficiently, as well as to generate synchronization SQL scripts to update target database schemas.&lt;/p&gt;

&lt;p&gt;The new solution also has such key features as:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Support for Amazon Redshift. The first version features Amazon Redshift support to compare and synchronize database schemas.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Limited Support for PostgreSQL. At the moment, the tool also offers limited support for the following PostgreSQL databases: PostgreSQL, Amazon RDS for PostgreSQL, and Azure PostgreSQL.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Support for Redshift Objects. The solution allows users to compare and sync the following Redshift objects: Schemas, Tables, Views, Functions, Procedures, Comments, Privileges.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Identification of Object Dependencies. The tool allows users to sync objects they have not selected for synchronization but which are used by objects they added to synchronize.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review of DDL Differences in Database Objects. Also, users can view DDL differences for each object pair to analyze the schema comparison result.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Query Execution History. Users can get basic information about PostgreSQL queries executed in the database over a certain period of time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Execution of Large Scripts. Large scripts can be executed without opening them in the editor.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
    </item>
    <item>
      <title>MySQL Database Documenter</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Wed, 15 Jul 2020 09:04:57 +0000</pubDate>
      <link>https://dev.to/andreasneuman/mysql-database-documenter-7g3</link>
      <guid>https://dev.to/andreasneuman/mysql-database-documenter-7g3</guid>
      <description>&lt;p&gt;The Documenter feature of dbForge Studio for MySQL is a MySQL database documentation tool that automatically generates documentation in the HTML, PDF, and Markdown file formats. It will take you just a couple of minutes and a few clicks to document multiple MySQL and MariaDB databases with all their objects and properties. &lt;a href="https://www.devart.com/dbforge/mysql/studio/documenter.html"&gt;Documenter&lt;/a&gt; also includes a wide range of options to customize the generated documentation file to meet your specific requirements.&lt;/p&gt;

&lt;p&gt;Documenter extracts extensive database information, including MySQL object types, their details and properties, as well as inter-object dependencies and DDL code.&lt;/p&gt;

&lt;p&gt;The tool allows selecting individual MySQL objects and their properties for documenting. Documenter also includes a wide collection of style templates and allows changing the documentation layout with various Bootstrap themes.&lt;/p&gt;

&lt;p&gt;You can generate documentation in three file formats: HTML, PDF, and Markdown. HTML and Markdown are well suited for documentation published on the web, while PDF is convenient for distribution across various systems and devices. All formats are searchable, which is especially convenient for large databases.&lt;/p&gt;

&lt;p&gt;You can search the generated documentation files for MySQL database objects. As you type the name of the required object in the search field, Documenter highlights the matching text in the object tree. You can also navigate throughout the documentation via built-in hyperlinks.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>5 Ways to Export Data from Salesforce to Excel
</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Tue, 07 Jul 2020 07:51:17 +0000</pubDate>
      <link>https://dev.to/andreasneuman/5-ways-to-export-data-from-salesforce-to-excel-2940</link>
      <guid>https://dev.to/andreasneuman/5-ways-to-export-data-from-salesforce-to-excel-2940</guid>
      <description>&lt;p&gt;Salesforce is a multifunctional and highly customizable CRM platform, which is used by millions of users to manage complex business processes and boost the productivity of sales, customer service, marketing teams, etc.&lt;/p&gt;

&lt;p&gt;Microsoft Excel is a spreadsheet program familiar and comfortable to many users worldwide, and there are situations when it is much easier to work with data in Excel worksheets than in the native UI of a cloud app.&lt;/p&gt;

&lt;p&gt;However, exporting large volumes of data from Salesforce to Excel worksheets is not a trivial challenge: you need built-in features or external tools that can export data to the Excel format.&lt;/p&gt;

&lt;p&gt;Below, we briefly describe five ways to connect &lt;a href="https://skyvia.com/blog/5-ways-to-export-data-from-salesforce-to-excel"&gt;Salesforce to Excel&lt;/a&gt; and easily export data from Salesforce to Excel worksheets, along with the pros and cons of each approach and its typical use cases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Salesforce Data Loaders&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Exporting data with data loaders is the first and most standard way. Using these tools, you can export data from Salesforce to Excel as CSV files, make changes to these files in Excel, and import the changed CSV files back to Salesforce. You can find many such data loaders on the Internet, and many of them have free versions. You can also try the native Salesforce built-in data loader (a first-party tool), which you can use after installing it on your PC. However, third-party data loaders, such as the Skyvia Salesforce data loader, can be more robust than Salesforce’s native offerings. For example, you don’t need to download Skyvia to your computer, as it is a cloud-based solution: you only need a web browser to access it and minimal configuration to safely export and import data as CSV files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Excel Original Interface&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Getting data from Salesforce to Excel through Excel’s native interface is a perfect solution for users of Microsoft Office Professional Edition. Unlike with data loaders, the data is displayed directly in the Excel UI, so you do not need intermediate CSV files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Excel Add-Ins&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The third way to pull Salesforce data into Excel is to use Excel Add-Ins for Salesforce. Add-ins help you connect Microsoft Excel to Salesforce, load data from Salesforce into Excel, instantly refresh the data in an Excel workbook from Salesforce, edit it, and update Salesforce from Excel. They enable you to work with Salesforce contacts, accounts, opportunities, leads, users, attachments, tasks, and other objects as with regular Excel worksheets, as well as to perform data cleansing and deduplication and apply all of Excel’s powerful data processing and analysis capabilities to this data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. ODBC Drivers&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Another way of performing Salesforce export to Excel is by using ODBC drivers. With ODBC drivers, as with add-ins, you export data directly into an Excel spreadsheet. However, note that unlike add-ins, ODBC drivers do not allow loading data modified in Excel back to the source.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. OData Endpoints&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;OData (Open Data Protocol) is an OASIS standard that defines the best practice for building and consuming queryable and interoperable RESTful APIs in a simple and standard way.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>ODBC Driver for Google BigQuery
</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Fri, 12 Jun 2020 14:52:46 +0000</pubDate>
      <link>https://dev.to/andreasneuman/odbc-driver-for-google-bigquery-187b</link>
      <guid>https://dev.to/andreasneuman/odbc-driver-for-google-bigquery-187b</guid>
      <description>&lt;p&gt;Devart ODBC Driver for Google BigQuery is a high-performance and feature-rich connector that enables ODBC-compliant applications to access, analyze, and report on your BigQuery data on Windows, both 32-bit and 64-bit. The driver processes standard ODBC function calls, submits SQL statements to BigQuery, and returns results to the application. You can work with the BigQuery data warehouse as you would with a database: execute SQL statements to read, write, update, and delete billions of rows in very large tables.&lt;br&gt;
The &lt;a href="https://www.devart.com/odbc/bigquery/"&gt;Devart driver&lt;/a&gt; provides capabilities to establish a connection to BigQuery data directly through HTTPS. If you have no direct HTTPS access to BigQuery, you can also connect through a proxy using the connection options Host, Port, Username, and Password.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Show and kill running processes in MySQL</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Tue, 26 May 2020 08:10:14 +0000</pubDate>
      <link>https://dev.to/andreasneuman/show-and-kill-running-processes-in-mysql-2bkf</link>
      <guid>https://dev.to/andreasneuman/show-and-kill-running-processes-in-mysql-2bkf</guid>
      <description>&lt;p&gt;The KILL command terminates a connection thread by ID, along with the related active query if there is one. To identify the queries to terminate, you first need to see the processes running on the server, and the SHOW PROCESSLIST command is a fine solution for that. KILL is not an elegant way to fix database issues but rather an effective last-resort tool. There are 4 major reasons to use it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A long-running query prevents other transactions from executing your more relevant query&lt;/li&gt;
&lt;li&gt;A large number of faulty queries block viable queries&lt;/li&gt;
&lt;li&gt;Orphan processes remain after a client was disconnected from the server&lt;/li&gt;
&lt;li&gt;You get a 'Too many connections' error message&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;None of these scenarios is great, so before executing KILL, make sure other solutions have been tried. Once you know the KILL method is necessary, you have a few different options; in this particular guide, we focus on the more 'intuitive' way of showing and killing faulty queries using simple commands. Keep in mind that KILL has two modifiers: CONNECTION and QUERY. KILL CONNECTION is essentially the same as KILL, while KILL QUERY terminates only the query for the specified connection ID and leaves the connection itself intact. &lt;a href="https://www.devart.com/dbforge/mysql/studio/show-running-queries-in-processlist.html"&gt;https://www.devart.com/dbforge/mysql/studio/show-running-queries-in-processlist.html&lt;/a&gt;&lt;/p&gt;
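
&lt;p&gt;A minimal session illustrating both modifiers (the thread ID 27 here is just a hypothetical value taken from the PROCESSLIST output):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- List the threads currently running on the server
SHOW FULL PROCESSLIST;

-- Terminate only the query running on thread 27,
-- keeping the connection itself alive
KILL QUERY 27;

-- Terminate the whole connection (same as plain KILL 27)
KILL CONNECTION 27;
&lt;/code&gt;&lt;/pre&gt;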

</description>
      <category>mysql</category>
    </item>
    <item>
      <title>Different types of JOINS in SQL Server</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Wed, 13 May 2020 16:04:48 +0000</pubDate>
      <link>https://dev.to/andreasneuman/different-types-of-joins-in-sql-server-566p</link>
      <guid>https://dev.to/andreasneuman/different-types-of-joins-in-sql-server-566p</guid>
      <description>&lt;p&gt;&lt;strong&gt;JOINS fundamentals&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In relational databases, such as SQL Server, Oracle, MySQL, and others, data is stored in multiple tables that are related to each other with a common key value. Accordingly, there is a constant need to extract records from two or more tables into a results table based on some condition. In SQL Server, this can be easily accomplished with the SQL JOIN clause.&lt;/p&gt;

&lt;p&gt;JOIN is an SQL clause used to query and access data from multiple tables based on logical relationships between those tables.&lt;br&gt;
In other words, JOINs indicate how SQL Server should use data from one table to select the rows from another table.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z851qII---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.devart.com/dbforge/sql/sqlcomplete/images/joins-types.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z851qII---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.devart.com/dbforge/sql/sqlcomplete/images/joins-types.png" alt="Basic SQL JOIN types"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enjoying JOINS with SQL Complete&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.devart.com/dbforge/sql/sqlcomplete/sql-join-statements.html"&gt;SQL Server JOINS&lt;/a&gt; are vitally important to master. As you progress from a database beginner to a more advanced user, you’ll continually need to fetch and combine data from more than one table. At this point, SQL Complete comes to the aid. Its code completion works well even for complex JOIN statements. You don’t need to memorize multiple column names or aliases, dbForge SQL Complete will suggest a full JOIN clause based on foreign keys, or conditions based on column names. These suggestions are available after the JOIN and ON keywords.&lt;/p&gt;

&lt;p&gt;More than that, SQL Complete can suggest a complete SQL JOIN statement when you combine tables based on foreign keys. You can also select a JOIN statement from the prompt list manually in case you need a specific JOIN operation.&lt;/p&gt;
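
&lt;p&gt;As a quick sketch of the basic JOIN types, here are three variants applied to two hypothetical tables, Customers and Orders (the table and column names are assumed for the example):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- INNER JOIN: only customers that have orders
SELECT c.Name, o.OrderDate
FROM Customers c
INNER JOIN Orders o ON o.CustomerID = c.CustomerID;

-- LEFT JOIN: all customers, with NULLs where no order exists
SELECT c.Name, o.OrderDate
FROM Customers c
LEFT JOIN Orders o ON o.CustomerID = c.CustomerID;

-- FULL JOIN: all rows from both tables, matched where possible
SELECT c.Name, o.OrderDate
FROM Customers c
FULL JOIN Orders o ON o.CustomerID = c.CustomerID;
&lt;/code&gt;&lt;/pre&gt;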

</description>
      <category>sql</category>
      <category>sqlserver</category>
      <category>joins</category>
    </item>
    <item>
      <title>Monitor and Analyse SQL Server Performance</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Wed, 13 May 2020 15:01:01 +0000</pubDate>
      <link>https://dev.to/andreasneuman/monitor-and-analyse-sql-server-performance-3ba6</link>
      <guid>https://dev.to/andreasneuman/monitor-and-analyse-sql-server-performance-3ba6</guid>
      <description>&lt;p&gt;&lt;a href="https://www.devart.com/dbforge/sql/monitor/"&gt;dbForge Monitor&lt;/a&gt; is a FREE add-in for Microsoft SQL Server Management Studio that allows you to track and analyse SQL Server performance. The tool includes a variety tools for SQL Server diagnostics, including a bunch of SQL performance metrics allowing to see the comprehensive and clear picture of all processes taking place on the server, detect most expensive queries, waiting tasks, and performance bottlenecks. With dbForge Monitor, you will save hours of time spent on SQL Server performance analysis!&lt;/p&gt;

</description>
      <category>sql</category>
      <category>database</category>
      <category>monitor</category>
    </item>
    <item>
      <title>Data Import and Export Tools for PostgreSQL</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Fri, 17 Apr 2020 06:59:26 +0000</pubDate>
      <link>https://dev.to/andreasneuman/data-import-and-export-tools-for-postgresql-1088</link>
      <guid>https://dev.to/andreasneuman/data-import-and-export-tools-for-postgresql-1088</guid>
      <description>&lt;p&gt;Enjoy simple and seamless data migration and benefit from an easy way to fill your PostgreSQL databases with external source data.&lt;/p&gt;

&lt;p&gt;A salient feature of the &lt;a href="https://www.devart.com/dbforge/postgresql/studio/data-export-import.html"&gt;PostgreSQL Import/Export&lt;/a&gt; tool built into dbForge Studio for PostgreSQL is that it supports transfer between the most frequently used data formats (Text, MS Excel, XML, CSV, JSON, etc.) and allows saving templates for repetitive export and import jobs. With the tool, you can easily:&lt;/p&gt;

&lt;p&gt;Migrate data from other servers&lt;br&gt;
Customize import and export jobs&lt;br&gt;
Save templates for recurring scenarios&lt;br&gt;
Populate new tables with data&lt;/p&gt;

&lt;p&gt;The PostgreSQL Import/Export functionality of dbForge Studio for PostgreSQL allows transferring data from PostgreSQL to any other DBMS as well as from any other DBMS to PostgreSQL using ODBC drivers.&lt;/p&gt;

&lt;p&gt;I strongly recommend utilizing a high-performance and feature-rich Devart PostgreSQL ODBC Driver as it provides quicker and more stable access to PostgreSQL databases compared with standard ODBC drivers.&lt;/p&gt;

&lt;p&gt;dbForge Studio for PostgreSQL has advanced built-in features for exporting and importing data for different cases and scenarios. The Data Import and Export functionality provided in the Studio significantly facilitates filling databases with data and &lt;a href="https://www.devart.com/dbforge/postgresql/studio/postgresql-migration.html"&gt;migrating Postgres data&lt;/a&gt; between databases.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Creating Pivot Tables in PostgreSQL
</title>
      <dc:creator>andreasneuman</dc:creator>
      <pubDate>Thu, 05 Mar 2020 11:44:13 +0000</pubDate>
      <link>https://dev.to/andreasneuman/creating-pivot-tables-in-postgresql-3ffd</link>
      <guid>https://dev.to/andreasneuman/creating-pivot-tables-in-postgresql-3ffd</guid>
      <description>&lt;p&gt;Pivot Table provides an interactive view of your data going significantly beyond the native crosstab function in PostgreSQL. With very little effort (and minimum queries) you can look at the same data from many different perspectives. This functionality comes in dbForge Studio for PostgreSQL along with other useful database management features. Try the tool and enjoy PostgreSQL pivoting without crosstab.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Extract the significance from a large, detailed data set&lt;/li&gt;
&lt;li&gt;Slice and dice your database without a hitch&lt;/li&gt;
&lt;li&gt;Make your data more comprehensible&lt;/li&gt;
&lt;li&gt;Simplify data representation&lt;/li&gt;
&lt;li&gt;Get all the metrics you need&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Creating a Pivot Table&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To &lt;a href="https://www.devart.com/dbforge/postgresql/studio/pivot-tables-in-postgresql.html"&gt;create a pivot table&lt;/a&gt;, execute a query to select data you want to analyze and navigate to the query output. Right-click the data grid and select the Pivot table option from the Send to sub-menu. To add fields to the pivot table, simply drag them from the Data Source view to the required pivot table area.&lt;/p&gt;

&lt;p&gt;Note:&lt;br&gt;
Not all data source tables are suitable for converting into pivot tables. The data source should contain a column with duplicated values that can be grouped, and a column with numeric data that can be used to calculate grand totals and custom totals.&lt;/p&gt;
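
&lt;p&gt;For comparison, the classic plain-SQL way to pivot without crosstab is conditional aggregation; the sales table and its columns below are hypothetical:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- Pivot monthly sales per region using FILTER-based
-- conditional aggregation (PostgreSQL 9.4+)
SELECT region,
       SUM(amount) FILTER (WHERE month = 'Jan') AS jan,
       SUM(amount) FILTER (WHERE month = 'Feb') AS feb,
       SUM(amount) AS total
FROM sales
GROUP BY region;
&lt;/code&gt;&lt;/pre&gt;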

</description>
      <category>postgres</category>
      <category>database</category>
      <category>pivot</category>
    </item>
  </channel>
</rss>
