123 changes: 114 additions & 9 deletions README.md
@@ -13,36 +13,141 @@ This is an application which uses Node.js to connect to IBM Db2 Warehouse on Clo

## Steps

1. [Clone The Repo](#1-clone-the-repo)
2. [Create an IBM Db2 Instance](#2-create-an-ibm-db2-instance)
3. [Create Schema and Tables](#3-create-schema-and-tables)
4. [Add Db2 Credentials to .env File](#4-add-db2-credentials-to-env-file)
5. [Run The Application](#5-run-the-application)

### 1. Clone the repo

```bash
git clone https://github.com/IBM/crud-using-nodejs-and-db2.git
```

### 2. Create an IBM Db2 Instance

Once the repository is cloned, the next step is to create the database that will hold our house sales data. There are two ways to do this. One is to create an IBM Db2 Warehouse on Cloud instance, hosted on the cloud. Alternatively, if you prefer to run the database on premises or locally, you can use the Db2 Docker image.

Choose which type of database you would like and follow the corresponding instructions:

1. [Create IBM Db2 Warehouse on Cloud](#2a-create-ibm-db2-warehouse-on-cloud)
2. [Create IBM Db2 Database Locally Using Docker Image](#2b-create-an-ibm-db2-on-premise-database)

#### 2a. Create IBM Db2 Warehouse on Cloud

Create the Db2 Warehouse on Cloud service from the following link, and make sure to note the credentials:

* [**IBM Db2 Warehouse on Cloud**](https://cloud.ibm.com/catalog/services/db2-warehouse)

#### 2b. Create an IBM Db2 On Premise Database

Instead of creating the Db2 Warehouse on Cloud service, we can instantiate the database locally using the free IBM Db2 Docker image.

Prerequisites:

* A [Docker](https://www.docker.com) account
* [Docker Desktop](https://www.docker.com/products/docker-desktop) installed on your machine
* Being logged in to your Docker account in Docker Desktop

Steps to get Db2 running locally:

* Create a folder named `db2`
* Open a terminal window and make sure your current directory is the one containing the `db2` folder
* Run the following commands

```bash
docker pull ibmcom/db2

docker run -itd --name mydb2 --privileged=true -p 50000:50000 -e LICENSE=accept -e DB2INST1_PASSWORD=hackathon -e DBNAME=homesalesdb -v db2:/database ibmcom/db2

docker exec -ti mydb2 bash -c "su - db2inst1"
```

Once this is done, you will have a Db2 Docker container with the following configuration:

* IP Address/Domain: `localhost`
* Port: `50000`
* Database name: `homesalesdb`
* Username: `db2inst1`
* Password: `hackathon`
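The settings above can be assembled into the Db2 CLI-style connection string (`DATABASE;HOSTNAME;UID;PWD;PORT;PROTOCOL`) that the `ibm_db` Node.js driver's `open()` expects. A minimal sketch, not tied to this repo's code; the values simply mirror the `docker run` flags:

```javascript
// Hedged sketch: build a connection string for the local Docker Db2 container.
const config = {
  database: 'homesalesdb', // -e DBNAME=homesalesdb
  hostname: 'localhost',
  port: 50000,             // -p 50000:50000
  uid: 'db2inst1',         // default Db2 instance user
  pwd: 'hackathon'         // -e DB2INST1_PASSWORD=hackathon
};

const connStr =
  `DATABASE=${config.database};HOSTNAME=${config.hostname};` +
  `UID=${config.uid};PWD=${config.pwd};PORT=${config.port};PROTOCOL=TCPIP`;

console.log(connStr);
// A real connection would then be ibm_db.open(connStr, callback); that call is
// omitted so this sketch stays runnable without the driver installed.
```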


### 3. Create Schema and Tables
Now that we have created our database, we need to import the data from the CSV file. We will create a schema called `DB2WML` containing two tables, `HOME_SALES` and `HOME_ADDRESS`. `HOME_SALES` stores the data we retrieve from the CSV file; `HOME_ADDRESS` stores the address associated with each home.

Depending on which type of database you chose (cloud or on-premises), the steps differ slightly. Follow the corresponding instructions:

1. [Create Schema and Tables for IBM Db2 Warehouse on Cloud](#3a-create-schema-and-tables-for-ibm-db2-warehouse-on-cloud)
2. [Create Schema and Tables for IBM Db2 Docker Image](#3b-create-schema-and-tables-for-ibm-db2-docker-image)


#### 3a. Create Schema and Tables for IBM Db2 Warehouse on Cloud

In the Db2 Warehouse resource page, click `Manage` and open the Db2 console by clicking the `Open Console` button. In the console, do the following to load your data:

* Click `Load` from the hamburger menu.
* Click `Browse files` (or drag the file in), select [data/home-sales-training-data.csv](data/home-sales-training-data.csv), and click `Next`
* Choose an existing schema or create a new one named `DB2WML` by clicking `+ New Schema`
* Create a new table named `HOME_SALES` under that schema by clicking `+ New Table`, then click `Next`
* Make sure the column names and data types displayed are correct, then click `Next`
* Click `Begin Load` to load the data

Once this is done, the table `HOME_SALES` exists under schema `DB2WML` and will be used by the Node.js application.
We also need to create the `HOME_ADDRESS` table, which will store each home's address. We can't reuse the load instructions from `HOME_SALES`, since we have no address data to load, so we create the table with SQL instead.

* Click `Run SQL` from the hamburger menu.
* Click `Blank`, which opens a blank SQL editor
* Run the command

```sql
CREATE TABLE DB2WML.HOME_ADDRESS (ADDRESS1 VARCHAR(50), ADDRESS2 VARCHAR(50), CITY VARCHAR(50), STATE VARCHAR(5), ZIPCODE INTEGER, COUNTRY VARCHAR(50), HOME_ID INTEGER)
```

Once this is done, the tables `HOME_SALES` and `HOME_ADDRESS` exist under schema `DB2WML` and will be used by the Node.js application.


#### 3b. Create Schema and Tables for IBM Db2 Docker Image

Exit the container shell with CTRL-C. Copy the sample data into the on-premises Db2 container:

```bash
docker cp data/home-sales-training-data.csv mydb2:home-sales-training-data.csv
```

Enter the container shell again:

```bash
docker exec -ti mydb2 bash -c "su - db2inst1"
```

Steps to create the schema and tables:

* Connect to the database `homesalesdb`. NOTE: this command may not work right away, since the container takes some time to create the database. If it fails, wait a couple of minutes and try again.

```bash
db2 connect to homesalesdb
```
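The wait-and-retry advice above can also be automated. A minimal Node.js sketch of a fixed-count retry loop; the `tryConnect` stub, error text, and attempt counts are illustrative assumptions, not part of this repo:

```javascript
// Hedged sketch: retry a flaky startup-time operation a fixed number of times.
// In practice tryConnect would wrap ibm_db's open() against the container.
function connectWithRetry(tryConnect, maxAttempts = 5) {
  let lastErr;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return { result: tryConnect(), attempts: attempt };
    } catch (err) {
      lastErr = err; // database may still be initializing; try again
    }
  }
  throw lastErr; // give up after maxAttempts
}

// Demo: a stub that fails twice (database still initializing), then succeeds.
let calls = 0;
const stub = () => {
  calls++;
  if (calls < 3) throw new Error('database not ready');
  return 'connected';
};
console.log(connectWithRetry(stub)); // { result: 'connected', attempts: 3 }
```

A real version would sleep between attempts instead of retrying immediately.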

* Create Schema `DB2WML`

```bash
db2 'CREATE SCHEMA DB2WML'
```

* Create Table `HOME_SALES` and `HOME_ADDRESS` within Schema `DB2WML`

```bash
db2 'CREATE TABLE DB2WML.HOME_SALES (ID SMALLINT, LOTAREA INTEGER, BLDGTYPE VARCHAR(6),HOUSESTYLE VARCHAR(6), OVERALLCOND INTEGER, YEARBUILT INTEGER, ROOFSTYLE VARCHAR(7), EXTERCOND VARCHAR(2), FOUNDATION VARCHAR(6), BSMTCOND VARCHAR(2), HEATING VARCHAR(4), HEATINGQC VARCHAR(2),CENTRALAIR VARCHAR(1), ELECTRICAL VARCHAR(5), FULLBATH INTEGER, HALFBATH INTEGER, BEDROOMABVGR INTEGER, KITCHENABVGR VARCHAR(2), KITCHENQUAL VARCHAR(2), TOTRMSABVGRD INTEGER, FIREPLACES INTEGER, FIREPLACEQU VARCHAR(2), GARAGETYPE VARCHAR(7), GARAGEFINISH VARCHAR(3), GARAGECARS INTEGER, GARAGECOND VARCHAR(2), POOLAREA INTEGER, POOLQC VARCHAR(2), FENCE VARCHAR(6), MOSOLD INTEGER, YRSOLD INTEGER, SALEPRICE INTEGER )'

db2 'CREATE TABLE DB2WML.HOME_ADDRESS (ADDRESS1 VARCHAR(50), ADDRESS2 VARCHAR(50), CITY VARCHAR(50), STATE VARCHAR(5), ZIPCODE INTEGER, COUNTRY VARCHAR(50), HOME_ID INTEGER)'
```

* Load data from CSV file to table `HOME_SALES`

```bash
db2 'IMPORT FROM ../../../home-sales-training-data.csv OF DEL SKIPCOUNT 1 INSERT INTO DB2WML.HOME_SALES'
```

### 4. Add Db2 credentials to .env file

74 changes: 57 additions & 17 deletions server.js
@@ -83,7 +83,6 @@ let connStr = "DATABASE="+process.env.DB_DATABASE+";HOSTNAME="+process.env.DB_HO

app.post('/getData', function(request, response){
console.log('GET DATA API CALL:');
console.log(request);
ibmdb.open(connStr, function (err,conn) {
if (err){
return response.json({success:-1, message:err});
@@ -101,7 +100,6 @@ app.post('/getData', function(request, response){

app.post('/getUniqueData', function(request, response){
console.log('GET UNIQUE DATA API CALL:');
console.log(request);
ibmdb.open(connStr, function (err,conn) {
if (err){
return response.json({success:-1, message:err});
@@ -116,7 +114,13 @@ app.post('/getUniqueData', function(request, response){
return response.json({success:-3, message:err});
}
conn.close(function () {
console.log(data2);
console.log(data);
console.log(data2.length);
if (data2.length == 0){
data2[0] = {'ADDRESS1': '', 'ADDRESS2': '','CITY': '','STATE': '','COUNTRY': '','ZIPCODE': '','HOME_ID': data[0]['ID']};
console.log(data2);
}

return response.json({success:1, message:'Data Received!', data:data,data2:data2 });
});
});
@@ -127,28 +131,58 @@ app.post('/getUniqueData', function(request, response){

app.post('/updateDataEntry', function(request, response){
console.log('UPDATE DATA API CALL:');
console.log(request);
ibmdb.open(connStr, function (err,conn) {
if (err){
return response.json({success:-1, message:err});
}


var str2 = "UPDATE DB2WML.HOME_ADDRESS SET ADDRESS1='"+request.body.addressInfo.address1+"',ADDRESS2='"+request.body.addressInfo.address2+"',CITY='"+request.body.addressInfo.city+"',STATE='"+request.body.addressInfo.state+"',COUNTRY='"+request.body.addressInfo.country+"',ZIPCODE="+request.body.addressInfo.zipcode+" WHERE HOME_ID="+request.body.id+";";

var str4 = "INSERT INTO DB2WML.HOME_ADDRESS (ADDRESS1, ADDRESS2, CITY, STATE,ZIPCODE, COUNTRY,HOME_ID) VALUES ('"+request.body.addressInfo.address1+"', '"+request.body.addressInfo.address2+"', '"+request.body.addressInfo.city+"', '"+request.body.addressInfo.state+"', "+request.body.addressInfo.zipcode+", '"+request.body.addressInfo.country+"', "+request.body.id+");";



var str = "UPDATE DB2WML.HOME_SALES SET LOTAREA="+request.body.data.lotArea+", YEARBUILT="+request.body.data.yearBuilt+", BLDGTYPE='"+request.body.data.bldgType+"',HOUSESTYLE='"+request.body.data.houseStyle+"',OVERALLCOND="+request.body.data.overallCond+",ROOFSTYLE='"+request.body.data.roofStyle+"',EXTERCOND='"+request.body.data.exterCond+"',FOUNDATION='"+request.body.data.foundation+"',BSMTCOND='"+request.body.data.bsmtCond+"',HEATING='"+request.body.data.heating+"',HEATINGQC='"+request.body.data.heatingQC+"',CENTRALAIR='"+request.body.data.centralAir+"',ELECTRICAL='"+request.body.data.electrical+"',FULLBATH="+request.body.data.fullBath+",HALFBATH="+request.body.data.halfBath+",BEDROOMABVGR="+request.body.data.bedroomAbvGr+",KITCHENABVGR="+request.body.data.kitchenAbvGr+",KITCHENQUAL='"+request.body.data.kitchenQual+"',TOTRMSABVGRD="+request.body.data.tempotRmsAbvGrd+",FIREPLACES="+request.body.data.fireplaces+",FIREPLACEQU='"+request.body.data.fireplaceQu+"',GARAGETYPE='"+request.body.data.garageType+"',GARAGEFINISH='"+request.body.data.garageFinish+"',GARAGECARS="+request.body.data.garageCars+",GARAGECOND='"+request.body.data.garageCond+"',POOLAREA="+request.body.data.poolArea+",POOLQC='"+request.body.data.poolQC+"',FENCE='"+request.body.data.fence+"',MOSOLD="+request.body.data.moSold+",YRSOLD="+request.body.data.yrSold+",SALEPRICE="+request.body.data.salePrice+" WHERE ID="+request.body.id+";";

var str3 = "SELECT * FROM DB2WML.HOME_ADDRESS WHERE HOME_ID="+request.body.id + ";";

conn.query(str, function (err, data) {
if (err){
return response.json({success:-2, message:err});
}
conn.query(str3, function (err, data2) {
console.log(data);
if (err){
return response.json({success:-3, message:err});
}
else{
if (data2.length == 0 ){
conn.query(str4, function (err, data) {
if (err){
return response.json({success:-2, message:err});
}
else{
conn.close(function () {
return response.json({success:1, message:'Data Edited!'});
});
}
});
}
else{
conn.query(str2, function (err, data) {
if (err){
return response.json({success:-2, message:err});
}
else{
conn.close(function () {
return response.json({success:1, message:'Data Edited!'});
});
}
});
}
}

});
});
});
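The handlers above splice request values directly into SQL strings, which breaks on embedded quotes and is open to SQL injection. `ibm_db` also accepts `?` bind markers with a parameter array (`conn.query(sql, params, callback)`). A hedged sketch of the address UPDATE rebuilt that way; the helper name is illustrative, not part of this PR:

```javascript
// Hedged sketch: separate the SQL text from the bound values so user input is
// never concatenated into the statement.
function buildAddressUpdate(addressInfo, homeId) {
  const sql =
    'UPDATE DB2WML.HOME_ADDRESS ' +
    'SET ADDRESS1=?, ADDRESS2=?, CITY=?, STATE=?, COUNTRY=?, ZIPCODE=? ' +
    'WHERE HOME_ID=?';
  const params = [
    addressInfo.address1, addressInfo.address2, addressInfo.city,
    addressInfo.state, addressInfo.country, addressInfo.zipcode, homeId
  ];
  return { sql, params };
}

// Usage inside the open() callback would be: conn.query(sql, params, cb).
// Values with quotes need no escaping because they travel as bind parameters.
const { sql, params } = buildAddressUpdate(
  { address1: "425 O'Connor St", address2: '', city: 'Menlo Park',
    state: 'CA', country: 'USA', zipcode: 94025 }, 7);
console.log(sql, params.length); // 7 bind markers, 7 parameters
```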
@@ -227,14 +261,20 @@ app.get('/predict', function(request, response){

app.post('/geocode', function(request, response){
// Using callback
if (request.body.address1 == ''){
return response.json({success:1, message:"no address"});
}
else {
geocoder.geocode(request.body.address1 + ", " + request.body.city + ", " + request.body.state + ", " + request.body.zipcode, function(err, res) {
if (err){
return response.json({success:-2, message:err});
}
else{
return response.json({success:1, message:"WE DID IT", data:res} );
}
});
}

})


6 changes: 5 additions & 1 deletion src/app/edit-data/edit-data.component.ts
@@ -109,7 +109,6 @@ export class EditDataComponent implements OnInit {
console.log('rowID: ' + this.rowID);
})
this.getDataEntry();
console.log(this.model);
}


@@ -264,6 +263,7 @@ export class EditDataComponent implements OnInit {
console.log(data['message']);
}
else{
console.log(data['message']);
localStorage.setItem("dataUpdated","true");
this._router.navigate(['/viewData']);
}
@@ -279,10 +279,14 @@
console.log(data['message']);
}
else{


this.data = data['data'][0];
this.data2 = data['data2'][0];
this.showMessage = false;
this.showData = true;
console.log(this.data2);
console.log(this.data);

}
})