User:Steven Abouchedid/sandbox

Source: Wikipedia, the free encyclopedia.

Python

The main convenience of Python is the ability to import modules into a script, giving access to an extremely wide range of functions for any specialization. These modules can be easily installed from the Windows command prompt using pip. Once a module is installed, all that is needed for the new resources to work is an import statement at the top of the Python file.
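As a small sketch of this workflow (the requests module is mentioned purely as an example of a third-party install; standard-library modules such as math need no installation):

```python
# Third-party modules are installed once from the command prompt, e.g.:
#   pip install requests
# After that, an import statement at the top of the script makes the
# module's functions available. Standard-library modules need no install:
import math

print(math.sqrt(16))  # the imported module's functions can now be called
```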

API Calls

Calling APIs is fairly simple with Python. The difficulty increases depending on the API and its security requirements, but relative to other languages, Python makes the process of calling APIs very convenient. Using an example API and the following code, an API call was made successfully.

import requests

r = requests.get('https://api.github.com/events')
print(r.content) #The raw response body, as bytes
print(r.text) #The response body, decoded as a string
print(r.status_code) #The HTTP status code of the request (if successful, this will print '200')

This call was made to a sample GitHub API, and the script prints three different properties of the response. After running "pip install requests" in the command prompt, this call ran successfully on my machine.

SQL Database Connection Using pyodbc

To connect to the local MS SQL Server, a module called pyodbc was used, which allows for easy connection to any SQL server as long as the driver, server name, database, and any other security parameters are provided. To connect to my local "Test" database, the following was placed at the top of the document.

import pyodbc 
cnxn = pyodbc.connect("Driver={SQL Server Native Client 11.0};"
                      "Server=DESKTOP-XXXXXXX;"
                      "Database=Test;"
                      "Trusted_Connection=yes;")

While this code will create a connection to the database, in order to retrieve data, it is necessary to use cursors.

Cursors

Cursors are structures that allow traversal through the records in a database result set. To use cursors in pyodbc, a cursor variable must first be created; the SQL code is then passed to the cursor.execute() function.

cursor = cnxn.cursor()
cursor.execute('SELECT * FROM Database.dbo.DatabaseTable')

In order to display the raw data, use this:

rows = cursor.fetchall()
for row in rows:
    print(row)

While it is easiest to display raw data, it is also possible to display anything the server has access to, such as the names of tables and databases. For example, the following code prints each table name (stopping at the system table "sysdiagrams") and checks whether a table called 'PersonData' exists.

#Display Table names
for row in cursor.tables():
    if row.table_name == "sysdiagrams":
        break
    print(row.table_name)

# Does table 'PersonData' exist?
if cursor.tables(table='PersonData').fetchone():
    print('yes it does')

The result will look like this:

C:\>python table.py
CompanyData
Complaint
HistoryLog
PersonData
yes it does


API to HTML

This project is a culmination of all of the work that I have done at Agiline so far. The GitHub page[1] has all of the code described below, if you want to run it yourself.

Adaptable HTML Table from Information Stored in JSON from an API

The goal of the assignment was to take data from an API, pass it through a SQL table, and display it as an HTML table styled with CSS. While HTML, Javascript, and CSS all have heavy documentation and optimization in regards to their functionality when used in conjunction, Python and SQL took much more time to integrate into this process. The methodology is explained below.

Converting an API into a SQL Table

There are three major problems associated with this step: the first is getting the data from the API, the second is formatting the data into a usable form, and the last is building the SQL table from that data. While it would be much easier to convert the JSON data directly into HTML, the assignment required this path to be taken.

API Get Request

In order to first acquire the data from the API, the requests module for Python was imported. The data was then converted into a Unicode string variable.

import requests
#imports the requests module for sending and retrieving API data

r = requests.get('https://samples.openweathermap.org/data/2.5/forecast/daily?id=524901&lang=zh_cn&appid=b6907d289e10d714a6e88b30761fae22')
#sends a GET request that retrieves data from the OpenWeatherMap API and stores the response in the 'r' variable

data = r.text
#converts the response body into a Unicode string and stores it in the 'data' variable

The API used was a sample OpenWeatherMap API[2], which returns a week of weather forecasts for Moscow in JSON. Below is the text received from the API. The formatting of the JSON, including the names of the data keys, was particularly important when parsing the data in the next step.

{
        "cod": "200",
        "message": 0,
        "city": {
            "geoname_id": 524901,
            "name": "Moscow",
            "lat": 55.7522,
            "lon": 37.6156,
            "country": "RU",
            "iso2": "RU",
            "type": "city",
            "population": 0
        },
        "cnt": 7,
        "list": [{
            "dt": 1485766800,
            "temp": {
                "day": 262.65,
                "min": 261.41,
                "max": 262.65,
                "night": 261.41,
                "eve": 262.65,
                "morn": 262.65
            },
            "pressure": 1024.53,
            "humidity": 76,
            "weather": [{
                "id": 800,
                "main": "Clear",
                "description": "sky is clear",
                "icon": "01d"
            }],
            "speed": 4.57,
            "deg": 225,
            "clouds": 0,
            "snow": 0.01
        }, // Data Continues

Parsing the JSON Data

Parsing the JSON data was rather difficult. Although SQL Server has integrated support for JSON data structures, the version I was using did not, since that support was introduced in SQL Server 2016, and I was using the 2014 version. To circumvent this issue, I decided to parse the data in Python, treating the JSON as a Python dictionary[3] and indexing into it for the pieces of data I needed to make the table.
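As a minimal sketch of this dictionary-style indexing (the string below is a made-up fragment shaped like the API response, not the real data):

```python
import json

# A made-up fragment shaped like the OpenWeatherMap response
sample = '{"city": {"name": "Moscow"}, "cnt": 7, "list": [{"dt": 1485766800}]}'

d = json.loads(sample)      # the JSON becomes a nested Python dictionary
print(d["city"]["name"])    # nested keys are reached by chained indexing
print(d["list"][0]["dt"])   # JSON arrays become Python lists, indexed by position
```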

JSON Python Module

The first step was using the json module[4], which is part of the Python standard library. Using this module, it was possible to convert the data string from the requests module into a Python dictionary.

import json
import requests
r = requests.get('https://samples.openweathermap.org/data/2.5/forecast/daily?id=524901&lang=zh_cn&appid=b6907d289e10d714a6e88b30761fae22')

data = r.text
# Remember that our data was stored in the 'data' variable as a unicode string.

json3 = json.loads(data)
# json.loads() is a function from the json module that takes a string formatted as JSON and converts it into a Python dictionary.
# In this case, we took the 'data' string and made it into a dictionary, represented by the variable 'json3'.

PYODBC and SQL Implementation

The function that we will create in Python doesn't create a SQL table, but rather inserts data into an existing table, so a table needs to be created before we move on. I chose a few important pieces of data from the OpenWeatherMap sample API and created a SQL table. No data was inserted yet; the important data points were made into columns and given appropriate types.

CREATE TABLE Weather(
   WeatherID                     INT IDENTITY(1, 1) NOT NULL PRIMARY KEY 
  ,DT                            BIGINT NULL     --ID for day from API
  ,DayTemp                       VARCHAR(50) NULL
  ,LowTemp                       VARCHAR(50) NULL
  ,HighTemp                      VARCHAR(50) NULL
  ,NightTemp                     VARCHAR(50) NULL
  ,EveningTemp                   VARCHAR(50) NULL
  ,MorningTemp                   VARCHAR(50) NULL --Average/POIs of Temperature over different time periods
  ,Pressure                      VARCHAR(50) NULL
  ,Humidity                      VARCHAR(50) NULL
  ,WeatherDescription            VARCHAR(50) NULL
  ,Speed                         VARCHAR(50) NULL --Wind Speed
  ,Deg                           VARCHAR(50) NULL --Wind Direction
  ,Clouds                        VARCHAR(50) NULL
  ,Snow                          VARCHAR(50) NULL
);

Note that the "WeatherID" column has no corresponding value in the original JSON; instead, it counts up for each new row, giving each a unique ID. With the table in place, we were able to properly format our SQL query from inside Python. However, it is important to explain how exactly I planned to run a SQL query in a Python script. I have explained this in previous documentation, but to put it simply, it is possible using the pyodbc module[5], which allows a SQL query to be run directly inside Python using cursors. The goal now is to construct a SQL query, using Python, that automatically reads the JSON and formats the query based on the data it finds.

The Parsing Function

The format for updating a SQL table is as follows:

INSERT INTO /* Table Path */ (/* column names separated by commas */)
VALUES (/* Data for first row */), (/* Data for second row */), (...);

It is important to note that the order in which the data appears in the VALUES list must match the order of the respective columns in the first line. To build this query, a string is assembled by a while loop in Python, which runs through the data and appends the proper piece of information for each identified key. The loop runs until there is no more data to add from the JSON.

i = 0
query = "INSERT INTO Test.dbo.Weather (DT, DayTemp, LowTemp, HighTemp, NightTemp, EveningTemp, MorningTemp, Pressure, Humidity, WeatherDescription, Speed, Deg, Clouds, Snow)\n VALUES "
while i <= len(json3["list"]) - 1:
    query += "('"
    query += str(json3["list"][i]["dt"]) + "', '"
    query += str(json3["list"][i]["temp"]["day"]) + "', '"
    query += str(json3["list"][i]["temp"]["min"]) + "', '"
    query += str(json3["list"][i]["temp"]["max"]) + "', '"
    query += str(json3["list"][i]["temp"]["night"]) + "', '"
    query += str(json3["list"][i]["temp"]["eve"]) + "', '"
    query += str(json3["list"][i]["temp"]["morn"]) + "', '"
    query += str(json3["list"][i]["pressure"]) + "', '"
    query += str(json3["list"][i]["humidity"]) + "', '"
    query += str(json3["list"][i]["weather"][0]["description"]) + "', '"
    query += str(json3["list"][i]["speed"]) + "', '"
    query += str(json3["list"][i]["deg"]) + "', '"
    query += str(json3["list"][i]["clouds"]) + "', '"
    if "snow" in json3["list"][i]:                         #if-else statement Manages issue where
        query += str(json3["list"][i]["snow"]) + "'"       #sometimes the 'snow' key would not
    else:                                                  #always exist in each "dt" object  
        query += "No Snowfall" + "'"                   
    if i < len(json3["list"]) - 1:                         #creates another list if there is more
        query += "), "                                     #data, or ends loop with a semicolon
    else:
        query += ");"
    i += 1


When the 'query' variable is printed, this is the output in the command prompt:

C:\>API_to_SQL.py
INSERT INTO Test.dbo.Weather (DT, DayTemp, LowTemp, HighTemp, NightTemp, EveningTemp, MorningTemp, Pressure, Humidity, WeatherDescription, Speed, Deg, Clouds, Snow)
 VALUES ('1485766800', '262.65', '261.41', '262.65', '261.41', '262.65', '262.65', '1024.53', '76', 'sky is clear', '4.57', '225', '0', '0.01'), ('1485853200', '262.31', '260.98', '265.44', '265.44', '264.18', '261.46', '1018.1', '91', 'light snow', '4.1', '249', '88', '1.44'), ('1485939600', '270.27', '266.9', '270.59', '268.06', '269.66', '266.9', '1010.85', '92', 'light snow', '4.53', '298', '64', '0.92'), ('1486026000', '263.46', '255.19', '264.02', '255.59', '259.68', '263.38', '1019.32', '84', 'sky is clear', '3.06', '344', '0', 'No Snowfall'), ('1486112400', '265.69', '256.55', '266', '256.55', '260.09', '266', '1012.2', '0', 'light snow', '7.35', '24', '45', '0.21'), ('1486198800', '259.95', '254.73', '259.95', '257.13', '254.73', '257.02', '1029.5', '0', 'sky is clear', '2.6', '331', '29', 'No Snowfall'), ('1486285200', '263.13', '259.11', '263.13', '262.01', '261.32', '259.11', '1023.21', '0', 'light snow', '5.33', '234', '46', '0.04');

To finally run the statement, commands from the pyodbc module are used:

cursor = cnxn.cursor()
# connects cursor variable to database

cursor.execute(query)
# executes the 'query' variable as a query in the connected database

cnxn.commit()
# commits the transaction to the database

When this script is run, it updates the SQL table with data from the API matching the indexed keys in the parsing function. This is the complete Python script for the first file:

import pyodbc 
import json
import requests

cnxn = pyodbc.connect("Driver={SQL Server Native Client 11.0};"
                      "Server=DESKTOP-XXXXXXX;"
                      "Database=Test;"
                      "Trusted_Connection=yes;")

r = requests.get('https://samples.openweathermap.org/data/2.5/forecast/daily?id=524901&lang=zh_cn&appid=b6907d289e10d714a6e88b30761fae22')
data = r.text
json3 = json.loads(data)

i = 0
query = "INSERT INTO Test.dbo.Weather (DT, DayTemp, LowTemp, HighTemp, NightTemp, EveningTemp, MorningTemp, Pressure, Humidity, WeatherDescription, Speed, Deg, Clouds, Snow)\n VALUES "
while i <= len(json3["list"]) - 1:
    query += "('"
    query += str(json3["list"][i]["dt"]) + "', '"
    query += str(json3["list"][i]["temp"]["day"]) + "', '"
    query += str(json3["list"][i]["temp"]["min"]) + "', '"
    query += str(json3["list"][i]["temp"]["max"]) + "', '"
    query += str(json3["list"][i]["temp"]["night"]) + "', '"
    query += str(json3["list"][i]["temp"]["eve"]) + "', '"
    query += str(json3["list"][i]["temp"]["morn"]) + "', '"
    query += str(json3["list"][i]["pressure"]) + "', '"
    query += str(json3["list"][i]["humidity"]) + "', '"
    query += str(json3["list"][i]["weather"][0]["description"]) + "', '"
    query += str(json3["list"][i]["speed"]) + "', '"
    query += str(json3["list"][i]["deg"]) + "', '"
    query += str(json3["list"][i]["clouds"]) + "', '"
    if "snow" in json3["list"][i]:
        query += str(json3["list"][i]["snow"]) + "'"
    else:
        query += "No Snowfall" + "'"
    if i < len(json3["list"]) - 1:
        query += "), "
    else:
        query += ");"
    i += 1

cursor = cnxn.cursor()
cursor.execute(query)
cnxn.commit()
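One caveat with building the query through string concatenation is that any value containing a quote character would break the statement. A safer variant of the loop, sketched below with pyodbc's parameter placeholders, avoids this (the 'day' record is made up with the same shape as the API data; the commented executemany call assumes the same cnxn connection as above):

```python
# A made-up record of the same shape as one entry of json3["list"]
day = {"dt": 1485766800,
       "temp": {"day": 262.65, "min": 261.41, "max": 262.65,
                "night": 261.41, "eve": 262.65, "morn": 262.65},
       "pressure": 1024.53, "humidity": 76,
       "weather": [{"description": "sky is clear"}],
       "speed": 4.57, "deg": 225, "clouds": 0}   # note: no "snow" key

rows = []
for d in [day]:                       # in the real script: for d in json3["list"]
    t = d["temp"]
    rows.append((d["dt"], t["day"], t["min"], t["max"], t["night"],
                 t["eve"], t["morn"], d["pressure"], d["humidity"],
                 d["weather"][0]["description"], d["speed"], d["deg"],
                 d["clouds"], d.get("snow", "No Snowfall")))  # dict.get handles the missing key

sql = ("INSERT INTO Test.dbo.Weather (DT, DayTemp, LowTemp, HighTemp, "
       "NightTemp, EveningTemp, MorningTemp, Pressure, Humidity, "
       "WeatherDescription, Speed, Deg, Clouds, Snow) "
       "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)")
# cursor.executemany(sql, rows); cnxn.commit()   # one parameterized insert per row

print(rows[0][-1])  # the missing "snow" key falls back to 'No Snowfall'
```

With placeholders, pyodbc handles the quoting of each value itself, so descriptions like "sky is clear" could safely contain apostrophes.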

Moving the SQL Data to an HTML Table

This second step also requires quite a few sub-steps, as it involves taking the SQL data, converting it back into JSON, and then creating an HTML table using the JSON data as the data layer. I will not explain everything in the HTML file's formatting, as that is not the most important aspect of this project. I will strictly cover two major steps: creating the JSON from SQL, and creating the HTML data layer and format layer.

Converting a SQL Table into JSON

Most of the work in this section is based on a function by Anthony DeBarros[6]. I used much of his function in my own, adding only a few updates to the formatting. His approach uses the built-in collections module[7], which provides additional data structures for storing collections of data. Along with this, I used the pyodbc and json modules as well.

Heading the Python Script

import pyodbc 
import json
import collections

cnxn = pyodbc.connect("Driver={SQL Server Native Client 11.0};"
                      "Server=DESKTOP-XXXXXXX;"
                      "Database=Test;"
                      "Trusted_Connection=yes;")


cursor = cnxn.cursor()
cursor.execute('SELECT * FROM Test.dbo.Weather') #Grabs all the new data from the table, including the API data

rows = cursor.fetchall() #cursor.fetchall() returns all of the rows from the query as a list, stored in the variable 'rows'

Here we import all of the required modules for this new Python script, connect to the database, and establish some important variables. Note that cursor.execute() runs a SQL query that grabs all of the data from the table where I previously inserted my API data.

Building the JSON Dictionary

Next we use the OrderedDict class from the collections module to help us create an ordered dictionary from the keys we choose.

objects_list = [] #Creates an empty list where the JSON records will be stored
for row in rows: #Starts loop
    d = collections.OrderedDict() #Creates a new dictionary with the following keys containing data from the SQL table
    d['WeatherID'] = row.WeatherID
    d['DayID'] = row.DT
    d['Day Temperature'] = row.DayTemp
    d['Temperature Low'] = row.LowTemp
    d['Temperature High'] = row.HighTemp
    d['Night Temperature'] = row.NightTemp
    d['Evening Temperature'] = row.EveningTemp
    d['Morning Temperature'] = row.MorningTemp
    d['Pressure'] = row.Pressure
    d['Humidity %'] = row.Humidity
    d['Weather Description'] = row.WeatherDescription
    d['Wind Speed'] = row.Speed
    d['Wind Direction'] = row.Deg
    d['Cloudiness %'] = row.Clouds
    d['Snowfall'] = row.Snow
    objects_list.append(d) #Appends the new dictionary 'd' to the 'objects_list' list
j = json.dumps(objects_list) #Converts 'objects_list' (which is now structured like a JSON array) into a string

By appending each Python dictionary to a list, we have essentially created a JSON array from the rows of our SQL table. The key assignments do two things:

  1. Make a connection between the keys in the dictionary and the columns of the SQL table.
  2. Rename the columns, with the ability to use more characters.

Creating a .txt File Containing the JSON

Finally, we need to move our new JSON string into its own file. Luckily, Python is able to natively create files:

objects_file = 'SqlWeather.txt' #Name of new Text File.
f = open(objects_file,'w')
print(j, end="", file=f) #writes the JSON string 'j' into SqlWeather.txt
f.close() #closes the file so its contents are flushed to disk

cnxn.close() #closes the SQL connection

When the file is run, a new SqlWeather.txt file is created in the folder where the Python script was run. It will look like this:

[{"WeatherID": 15, "DayID": 1485766800, "Day Temperature": "262.65", "Temperature Low": "261.41", "Temperature High": "262.65", "Night Temperature": "261.41", "Evening Temperature": "262.65", "Morning Temperature": "262.65", "Pressure": "1024.53", "Humidity %": "76", "Weather Description": "sky is clear", "Wind Speed": "4.57", "Wind Direction": "225", "Cloudiness %": "0", "Snowfall": "0.01"}, {"WeatherID": 16, "DayID": 1485853200, "Day Temperature": "262.31", "Temperature Low": "260.98", "Temperature High": "265.44", "Night Temperature": "265.44", "Evening Temperature": "264.18", "Morning Temperature": "261.46", "Pressure": "1018.1", "Humidity %": "91", "Weather Description": "light snow", "Wind Speed": "4.1", "Wind Direction": "249", "Cloudiness %": "88", "Snowfall": "1.44"}, {"WeatherID": 17, "DayID": 1485939600, "Day Temperature": "270.27", "Temperature Low": "266.9", "Temperature High": "270.59", "Night Temperature": "268.06", "Evening Temperature": "269.66", "Morning Temperature": "266.9", "Pressure": "1010.85", "Humidity %": "92", "Weather Description": "light snow", "Wind Speed": "4.53", "Wind Direction": "298", "Cloudiness %": "64", "Snowfall": "0.92"}, {"WeatherID": 18, "DayID": 1486026000, "Day Temperature": "263.46", "Temperature Low": "255.19", "Temperature High": "264.02", "Night Temperature": "255.59", "Evening Temperature": "259.68", "Morning Temperature": "263.38", "Pressure": "1019.32", "Humidity %": "84", "Weather Description": "sky is clear", "Wind Speed": "3.06", "Wind Direction": "344", "Cloudiness %": "0", "Snowfall": "No Snowfall"}, {"WeatherID": 19, "DayID": 1486112400, "Day Temperature": "265.69", "Temperature Low": "256.55", "Temperature High": "266", "Night Temperature": "256.55", "Evening Temperature": "260.09", "Morning Temperature": "266", "Pressure": "1012.2", "Humidity %": "0", "Weather Description": "light snow", "Wind Speed": "7.35", "Wind Direction": "24", "Cloudiness %": "45", "Snowfall": "0.21"}, {"WeatherID": 20, "DayID": 
1486198800, "Day Temperature": "259.95", "Temperature Low": "254.73", "Temperature High": "259.95", "Night Temperature": "257.13", "Evening Temperature": "254.73", "Morning Temperature": "257.02", "Pressure": "1029.5", "Humidity %": "0", "Weather Description": "sky is clear", "Wind Speed": "2.6", "Wind Direction": "331", "Cloudiness %": "29", "Snowfall": "No Snowfall"}, {"WeatherID": 21, "DayID": 1486285200, "Day Temperature": "263.13", "Temperature Low": "259.11", "Temperature High": "263.13", "Night Temperature": "262.01", "Evening Temperature": "261.32", "Morning Temperature": "259.11", "Pressure": "1023.21", "Humidity %": "0", "Weather Description": "light snow", "Wind Speed": "5.33", "Wind Direction": "234", "Cloudiness %": "46", "Snowfall": "0.04"}]

Creating the HTML Page

In my HTML page, there is a lot of work put into the formatting and the Javascript functions that allow for a highly stylized table. In this section I will explain only the bare minimum of what I implemented, so that it is understood how to use the .txt file that was just created and create a table from it.

Getting the JSON.txt file using AJAX

AJAX is a Javascript technique that allows data from different pages and servers to be accessed from a Javascript file. We will use AJAX to access the data from the JSON file we just created, which sits in the same folder as our new HTML page. To start off, we need to create a function that will hold our AJAX request:

<script>
    function loadDoc() { //loadDoc() function created. We will reference loadDoc() later when we call the function
        var xhttp = new XMLHttpRequest(); //this starts the new request in a var called 'xhttp'
        xhttp.onreadystatechange = function() { //onreadystatechange runs the attached function whenever the ready state of the request changes
            if (this.readyState == 4 && this.status == 200) { //starts the function when the readyState is equal to 4 and the request was successful
                //here goes our function
            }
        };
        xhttp.open("GET", "SqlWeather.txt", true); 
        xhttp.send(); //these two lines get our data from the file we specify
    }

</script>

This is the skeleton of every AJAX request. Inside the if statement, we insert what we want to happen when our data is received. Now we have to tell the page what to do with the new information. Even though our function is only two lines long, both lines are important.

var defaultData = JSON.parse(this.responseText);
//Multiple things are happening in this line, so let's go through every part.
//We first create a Javascript variable called defaultData, where the text from the AJAX request will go.
//JSON.parse() turns the string it is fed into a Javascript object, meaning that our string is now treated as a JSON.
//Finally, this.responseText references the text we got back from the AJAX request.
//Overall, this line is important because it puts our JSON into a variable as a Javascript object.

document.getElementById('tableGoesHere').innerHTML = json2table(defaultData);
//json2table() is the function that creates an HTML table from any JSON it is fed
//The only necessary parameter is the JSON object itself

While the json2table() function itself is imperative for the table to be generated, it came from another project before this one, so I won't go into detail. A reference for how to build it is here[8], and the CSS and Javascript styling can be found here[9].

How can this be Improved?

What is important to note about this project is that the source must be edited only if one wishes to change where the source data comes from. While it is convenient that updating the table doesn't require editing the code, there are a few areas that, if addressed, could make the entire application work with any JSON or API.

  1. Find a way to create multiple tables based on the actual structure of the JSON. A JSON like the one created from the SQL data takes only one table to interpret, but the beauty of JSON is that its data can be deeply nested, requiring multiple tables to interpret.
  2. Interpret columns based on the key names. If the key names could be read and implemented, it wouldn't be too hard to create a loop that could both create a SQL table and update it with data from Python. It would likely rely on the same function as the one used to interpret nested JSON.
  3. Interpret data types for SQL. Because SQL classifies data in multiple different ways, the Python function would need to be updated so that when it reads a value, it knows which SQL data type that value maps to, and builds the query accordingly.
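The third improvement might be sketched roughly as follows (the type mapping below is illustrative only; a real version would need decisions about precision, string lengths, and nested values):

```python
# Infer a plausible SQL column type from the Python type that
# json.loads() produced for a value. The mapping here is a guess,
# not a complete one: bool must be checked before int, because in
# Python bool is a subclass of int.
def sql_type(value):
    if isinstance(value, bool):
        return "BIT"
    if isinstance(value, int):
        return "BIGINT"
    if isinstance(value, float):
        return "FLOAT"
    return "VARCHAR(50)"        # fall back to text for everything else

# A made-up record shaped like one day of the weather data
record = {"dt": 1485766800, "pressure": 1024.53,
          "description": "sky is clear", "humidity": 76}
for key, val in record.items():
    print(key, "->", sql_type(val))   # one inferred column type per key
```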

References

Sources

  1. ^ StevenAbouchedid (2019-07-24). API-to-HTML. GitHub. Retrieved 2019-07-25.
  2. ^ "Weather API - OpenWeatherMap". openweathermap.org. Retrieved 2019-07-24.
  3. ^ "Dictionary Objects — Python 3.4.10 documentation". docs.python.org. Retrieved 2019-07-24.
  4. ^ "json — JSON encoder and decoder — Python 3.7.4 documentation". docs.python.org. Retrieved 2019-07-24.
  5. ^ Kleehammer, Michael (2019-07-24). pyodbc: Python ODBC bridge. GitHub. Retrieved 2019-07-24.
  6. ^ Debarros, Anthony. "Generate JSON from SQL". Anthony Debarros.
  7. ^ "8.3. collections — High-performance container datatypes — Python 2.7.16 documentation". docs.python.org. Retrieved 2019-07-24.
  8. ^ Horn, Travis (2017-09-21). "Building json2table: Turn JSON into an HTML table". Medium. Retrieved 2019-07-25.
  9. ^ "DataTables 1.10.19". cdn.datatables.net. Retrieved 2019-07-25.