How do I change the type of a column from varchar(30) to varchar(100)?

I have a table that is described like this:

mysql> describe easy_table;
+---------------------+--------------+------+-----+---------+----------------+
| Field               | Type         | Null | Key | Default | Extra          |
+---------------------+--------------+------+-----+---------+----------------+
| id                  | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| version             | bigint(20)   | NO   |     | NULL    |                |
| account_id          | bigint(20)   | NO   | MUL | NULL    |                |
| city                | varchar(30)  | NO   |     | NULL    |                |
...
| name                | varchar(255) | YES  |     | NULL    |                |
| name_two            | varchar(255) | YES  |     | NULL    |                |
+---------------------+--------------+------+-----+---------+----------------+
13 rows in set (0.03 sec)

I'm trying to enlarge the city column to varchar(100), but this statement doesn't work:

alter table easy_table alter column city varchar(100);

This also doesn't work:

alter table easy_table alter column city varchar(100) not null;

I get this error:

ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'varchar(100)' at line 1

Answers


alter table easy_table modify column city varchar(100) not null;

Use the MODIFY keyword, not ALTER.
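For context, MySQL's `ALTER TABLE ... ALTER COLUMN` form only sets or drops a column's default value, which is why the original statement raises a syntax error. A sketch of the two forms (the 'unknown' default is just an illustrative value):

```sql
-- In MySQL, ALTER COLUMN can only set or drop a DEFAULT, not change the type:
ALTER TABLE easy_table ALTER COLUMN city SET DEFAULT 'unknown';

-- To change the type, use MODIFY with the full new column definition.
-- Attributes you omit (such as NOT NULL) are reset, so repeat them:
ALTER TABLE easy_table MODIFY COLUMN city VARCHAR(100) NOT NULL;
```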


alter table easy_table modify city varchar(100);

is the command to change the size of the column.

This page may also be useful:

http://php.about.com/od/learnmysql/p/alter_table.htm


Try:

alter table easy_table change city city varchar(100);
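CHANGE repeats the column name because it can also rename the column at the same time; when you keep the old name, you write it twice. A sketch (the new column name below is hypothetical):

```sql
-- Same name, new type: old name and new name are both "city".
ALTER TABLE easy_table CHANGE city city VARCHAR(100) NOT NULL;

-- CHANGE can also rename while resizing (hypothetical new name):
-- ALTER TABLE easy_table CHANGE city city_name VARCHAR(100) NOT NULL;
```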
