How to Alter Column Datatypes in Snowflake
by Mauricio Rojas, on Nov 4, 2021 8:17:16 AM
Snowflake is a great and easy-to-use platform. I am constantly moving workloads from Teradata, SQL Server, and Oracle to it.
However, I encountered an interesting situation in an Oracle migration, specifically when a column is declared as NUMBER with no precision or scale, as in the following example:
CREATE TABLE TEST_TABLE(A NUMBER, T VARCHAR2(10));
INSERT INTO TEST_TABLE(A, T) VALUES(10.123, 'A');
INSERT INTO TEST_TABLE(A, T) VALUES(10.123567, 'B');
In Oracle, when no precision is specified, numbers are stored as given.
In Snowflake, if we run the same code, NUMBER will be interpreted as NUMBER(38,0).
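You can see the effect by running the example as-is in Snowflake: because A is effectively NUMBER(38,0), the fractional digits do not survive the insert.

```sql
-- In Snowflake, A was created as NUMBER, i.e. NUMBER(38,0),
-- so 10.123 and 10.123567 are rounded on insert:
SELECT A, T FROM TEST_TABLE;
-- A = 10 for both rows; the decimal places are gone
```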
When you use SnowConvert, it will turn those column types to NUMBER(38,19) because the tool does not have enough information to determine the right precision.
But you might know the right precision. Let's say it is NUMBER(20,4). I had hoped it would be as easy as an ALTER COLUMN statement, but sadly it is not: Snowflake's ALTER COLUMN can increase the precision of a NUMBER column, but it cannot change its scale. So what can be done? Let's take a look.
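For example, trying to change the scale directly is rejected (the exact error text may vary by Snowflake version):

```sql
-- Snowflake allows increasing the precision of a NUMBER column,
-- but not changing its scale, so this statement fails:
ALTER TABLE TEST_TABLE ALTER COLUMN A SET DATA TYPE NUMBER(20,4);
-- SQL compilation error: cannot change column A from type NUMBER(38,0) to NUMBER(20,4)
```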
I came up with this solution: a Snowflake JavaScript stored procedure that gets the table definition, builds a statement that changes the table, and runs it.
(You can get the code from here.)
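In case the link is unavailable, here is a minimal sketch of how such a procedure can be written. The procedure name ALTER_COLUMN_TYPE and the recreate-the-table approach are my assumptions; the linked version may differ in name and details.

```sql
CREATE OR REPLACE PROCEDURE ALTER_COLUMN_TYPE(
    P_TABLE VARCHAR, P_SCHEMA VARCHAR, P_COLUMN VARCHAR, P_EXPR VARCHAR)
RETURNS VARCHAR
LANGUAGE JAVASCRIPT
EXECUTE AS CALLER
AS
$$
  // Read the current column list from the information schema
  var rs = snowflake.createStatement({sqlText:
    "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS " +
    "WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ? ORDER BY ORDINAL_POSITION",
    binds: [P_SCHEMA, P_TABLE]}).execute();
  var cols = [];
  while (rs.next()) {
    var c = rs.getColumnValue(1);
    // Swap in the conversion expression for the target column
    cols.push(c === P_COLUMN ? P_EXPR + " AS " + c : c);
  }
  var fullName = P_SCHEMA + "." + P_TABLE;
  // Recreate the table in place with the converted column
  var ddl = "CREATE OR REPLACE TABLE " + fullName +
            " AS SELECT " + cols.join(", ") + " FROM " + fullName;
  snowflake.createStatement({sqlText: ddl}).execute();
  return ddl;
$$;
```

Note that CREATE OR REPLACE TABLE ... AS SELECT rebuilds the table, so grants, defaults, and constraints on the original are not preserved; check those separately if you rely on them.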
For the previous example, you can just run the following:
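(The original snippet is not reproduced here; the following is a hypothetical invocation, with the procedure name ALTER_COLUMN_TYPE and schema PUBLIC as placeholders:)

```sql
CALL ALTER_COLUMN_TYPE('TEST_TABLE', 'PUBLIC', 'A', 'CAST(A AS NUMBER(20,4))');
```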
Notice that the first parameter is the table name, the second is the table schema, the third is the column name, and the fourth is the expression used for the conversion. This is usually just a CAST, but any valid conversion expression works.
This proc has been a great timesaver for me, so I hope it is a good addition to your Snowflake toolbox.