Getting ready

You will need access to a MySQL database. You can install one locally, in the cloud, or within a container. I am using a locally installed MySQL server with the root password set to mypassword. You will also need to install the MySQL Connector/Python library, which you can do with pip install mysql-connector-python.
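Once the library is installed, a quick way to confirm that both the driver and the server are working is to open and close a connection from Python. This is a minimal sketch, assuming the locally installed server and the root/mypassword credentials used throughout this recipe (do not hard-code credentials like this outside of examples):

```python
# Connection settings matching this recipe's local setup (assumptions, not defaults).
CONFIG = {"host": "localhost", "user": "root", "password": "mypassword"}

def check_connection(config=CONFIG):
    """Return True if we can open a connection to the MySQL server."""
    # Imported inside the function so this file still parses if the
    # driver has not been installed yet.
    import mysql.connector

    conn = mysql.connector.connect(**config)
    ok = conn.is_connected()
    conn.close()
    return ok

if __name__ == "__main__":
    print(check_connection())  # prints True when the server accepts the login
```

If this raises an error, check that the MySQL server is running and that the credentials match your installation.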

  1. The first thing to do is to connect to the database using the mysql command at the terminal:
# mysql -uroot -pmypassword
mysql: [Warning] Using a password on the command line interface can be insecure.
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 4
Server version: 5.7.19 MySQL Community Server (GPL)

Copyright (c) 2000, 2017, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql>
  2. Now we can create a database that will be used to store our scraped information:
mysql> create database scraping;
Query OK, 1 row affected (0.00 sec)
  3. Now use the new database:
mysql> use scraping;
Database changed
  4. And create a planets table in the database to store our data:

mysql> CREATE TABLE `scraping`.`planets` (
`id` INT NOT NULL AUTO_INCREMENT,
`name` VARCHAR(45) NOT NULL,
`mass` FLOAT NOT NULL,
`radius` FLOAT NOT NULL,
`description` VARCHAR(5000) NULL,
PRIMARY KEY (`id`));
Query OK, 0 rows affected (0.02 sec)

Now we are ready to scrape data and put it into the MySQL database.
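As a preview of how the scraper will use this table, the sketch below inserts a single row from Python with mysql-connector-python. It assumes the local server, credentials, and schema created in the steps above; the Mercury values are purely illustrative, and the parameterized query (the %s placeholders) lets the driver escape values safely:

```python
# Parameterized INSERT for the planets table created above; the id column
# is AUTO_INCREMENT, so we only supply the remaining columns.
INSERT_PLANET = (
    "INSERT INTO planets (name, mass, radius, description) "
    "VALUES (%s, %s, %s, %s)"
)

def save_planet(conn, name, mass, radius, description=None):
    """Insert one planet row and return its auto-generated id."""
    cursor = conn.cursor()
    cursor.execute(INSERT_PLANET, (name, mass, radius, description))
    conn.commit()
    row_id = cursor.lastrowid  # read before closing the cursor
    cursor.close()
    return row_id

def main():
    # Imported here so the file parses even without the driver installed.
    import mysql.connector

    conn = mysql.connector.connect(
        host="localhost", user="root", password="mypassword",
        database="scraping")
    # Illustrative sample values, not scraped data.
    save_planet(conn, "Mercury", 0.33, 2439.7,
                "Smallest planet in the solar system")
    conn.close()

if __name__ == "__main__":
    main()
```

Passing values as a tuple to cursor.execute, rather than formatting them into the SQL string yourself, is the idiomatic way to avoid SQL injection when storing scraped text.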