OpenGL on MacOS (C / C++)

Disclaimer

OpenGL is deprecated on MacOS in favor of Metal, but it is still possible to compile OpenGL applications. As another disclaimer, the examples I post use deprecated OpenGL constructs. OpenGL changed over time: new features were added while older features were deprecated. As I am still learning myself, I am using the deprecated features until I get comfortable with the newer material. So be aware, and inform yourself about the most current OpenGL features.

Compiling OpenGL applications on MacOS

According to this post, on MacOS the OpenGL headers have to be imported from different folders than the default paths used on Windows and Linux.

#ifdef __APPLE__
#include <OpenGL/gl.h>
#include <GLUT/glut.h>
#else
#include <GL/gl.h>
#include <GL/glut.h>
#endif

The command line for compiling is:

g++ blender.c -framework OpenGL -framework GLUT

You will get a lot of warnings because all the OpenGL calls are deprecated, but the compiler should nevertheless produce an a.out binary, which you can execute using

./a.out

Resolution on Retina-Displays

Running a GLUT application on a Retina display renders at a very low resolution. I did not figure out a solution for this, but there are some pages that explain possible fixes:

https://www.opengl.org/discussion_boards/showthread.php/178916-Using-the-retina-display-on-a-macbook-pro

http://iihm.imag.fr/blanch/software/glut-macosx/

Installing Python 3 on MacOS using homebrew

MacOS systems come with Python 2 preinstalled. Python 3 can be installed using Homebrew.

brew install python3
brew postinstall python3

Check for success using

python3 --version
pip3 -V

If brew complains that python3 is installed but not linked, use

brew link python

For me, linking failed with:

Warning: python 3.7.2 is already installed, it's just not linked
You can use `brew link python` to link this version.
MacBook-Pro:~ user$ brew link python
Linking /usr/local/Cellar/python/3.7.2... Error: Permission denied @ dir_s_mkdir - /usr/local/Frameworks

If the linking fails due to permission problems, you should read this thread.

They suggest a solution that entails creating and setting permissions on a folder:

sudo mkdir /usr/local/Frameworks

sudo chown $(whoami):admin /usr/local/Frameworks

Now try linking again. For me, linking finally succeeded and Python 3 was available.

brew link python

python3 --version

Installing virtualenv

pip3 install virtualenv


Using virtualenv

Creation of a virtual environment:

$ virtualenv -p python3 <desired-path>

e.g. virtualenv -p python3 /home/user/dev/python/myproject/myenv

Activate the virtualenv:

$ source <desired-path>/bin/activate

e.g. source /home/user/dev/python/myproject/myenv/bin/activate

Deactivate the virtualenv:

$ deactivate

Testing with Jest

The Redux documentation declares Jest as the unit testing framework of choice. This is a beginner's introduction to Jest.

Taken from the Jest documentation:

Zero configuration – Jest is already configured when you use create-react-app or react-native init to create your React and React Native projects. Place your tests in a __tests__ folder, or name your test files with a .spec.js or .test.js extension. Whatever you prefer, Jest will find and run your tests.

As it turns out, Jest is already integrated into your codebase if you used create-react-app.
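To get a feel for what Jest's expect().toEqual() matcher does, here is a simplified sketch of a recursive deep-equality check in plain JavaScript. This is illustrative only; Jest's real implementation handles many more cases (cyclic structures, Maps, Sets, and so on).

```javascript
// Simplified sketch of a recursive deep-equality check,
// similar in spirit to what expect(...).toEqual(...) performs.
function deepEqual(a, b) {
  if (a === b) return true;
  if (typeof a !== 'object' || typeof b !== 'object' || a === null || b === null) {
    return false;
  }
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  // every key in a must exist in b and the values must be deeply equal
  return keysA.every(key => deepEqual(a[key], b[key]));
}
```

For example, deepEqual({ type: 'X', entries: [1] }, { type: 'X', entries: [1] }) is true, whereas a plain === comparison of the two object literals would be false. This is why action tests use toEqual rather than toBe.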

Error fsevents unavailable

When npm run fails with the following output:

npm test

> cryptofrontend@0.1.0 test /Users/bischowg/dev/react/cryptofrontend
> react-scripts test

Error: `fsevents` unavailable (this watcher can only be used on Darwin)
    at new FSEventsWatcher (/Users/bischowg/dev/react/cryptofrontend/node_modules/sane/src/fsevents_watcher.js:41:11)
    at createWatcher (/Users/bischowg/dev/react/cryptofrontend/node_modules/jest-haste-map/build/index.js:780:23)
    at Array.map (<anonymous>)
    at HasteMap._watch (/Users/bischowg/dev/react/cryptofrontend/node_modules/jest-haste-map/build/index.js:936:44)
    at _buildPromise._buildFileMap.then.then.hasteMap (/Users/bischowg/dev/react/cryptofrontend/node_modules/jest-haste-map/build/index.js:355:23)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:160:7)
npm ERR! Test failed.  See above for more details.

The fix is described in https://github.com/expo/expo/issues/854:

npm r -g watchman
brew install watchman

I had to run the brew command three times before it finally succeeded. After that, npm test ran without issues.

Testing Action Creators

Given a file actions.js that contains the action creator

function receiveEntriesActionCreator(json) {
  return {
    type: RECEIVE_ENTRIES,
    entries: json
  }
}

you want to write a test that verifies that the action creator returns an action with a type property whose value is RECEIVE_ENTRIES and an entries property that contains a specific JavaScript object.

In order to write the test, add a file called actions.test.js next to actions.js. In actions.test.js insert:

import * as actions from './actions.js';

import {
  RECEIVE_ENTRIES,
  ADD_ENTRY,
  UPDATE_ENTRY,
  DELETE_ENTRY
} from './actions.js'

test('receiveEntriesActionCreator returns a correct action', () => {

  const entries = [{ id: '12345', password: 'abcdef' }]

  const expectedAction = {
    type: RECEIVE_ENTRIES,
    entries
  }

  expect(actions.receiveEntriesActionCreator(entries)).toEqual(expectedAction)

});

The test() function is called with a description of what the test verifies; the second parameter is the function containing the test code. The test first assembles the expected result and then calls expect().toEqual() to compare it against the action creator's return value.

In the console, type npm test to start a watcher that re-runs the tests whenever you save changes to a file that has a unit test.

Node and MongoDB with Mongoose

This post describes how to add MongoDB with Mongoose as a database to your Node application.

Interesting links

https://mongoosejs.com/
Part 3 of the Mozilla Express Tutorial
https://mongoosejs.com/docs/index.html
https://medium.freecodecamp.org/introduction-to-mongoose-for-mongodb-d2a7aa593c57
https://www.jenniferbland.com/saving-data-to-mongodb-database-from-node-js-application-tutorial/

Installing and starting MongoDB

First install MongoDB on your system. For MacOS, the instructions are available on the MongoDB homepage.

brew update
brew install mongodb

The MongoDB daemon that runs the database is started with:

/usr/local/bin/mongod --config /usr/local/etc/mongod.conf

The mongo shell

The mongo shell is a command line tool that allows you to connect to MongoDB and perform queries. It connects to mongodb://127.0.0.1:27017 if no parameters are specified.

/usr/local/bin/mongo

To display the database you are using, type db.
To switch databases, issue the use <db> command.
To show all databases that contain at least one document, type show dbs.
To show all collections in a database, type show collections.

MongoDB is case sensitive! The collection passwordentry is a different collection than PasswordEntry!

Inserting:

db.PasswordEntry.insertOne( { password: "test" } )

Select all entries:

db.PasswordEntry.find( {} )

Using MongoDB from within Node

To install the mongoose npm dependency and a JSON body-parsing middleware, type

npm install mongoose
npm install body-parser --save

MongoDB is a NoSQL database that stores documents in so-called collections. A collection is the analog of a table; a document is the analog of a record in a table.

The fields of a document are defined using a schema. Given a schema, you can create a model. A model is a factory that creates documents which then are saved into collections.

Because models can only be defined once in the application lifecycle (otherwise an exception is thrown; you can update a schema to add more fields, but you can compile the model only once), you need some centralized code that defines the schema and then creates the models.

const mongoose = require('mongoose');

mongoose.Promise = global.Promise;
mongoose.connect("mongodb://localhost:27017/crypto");

// define a schema
var passwordEntrySchema = new mongoose.Schema({
  password: { type: String, required: true }
});

// this adjusts the conversion of mongoose objects to JSON when serializing them.
// It outputs the _id as id and removes the _id field. It also removes the __v field.
passwordEntrySchema.set('toJSON', {
  transform: function (doc, ret, options) {
    ret.id = ret._id;
    delete ret._id;
    delete ret.__v;
  }
});

// compile the schema into a model
let collectionName = 'peCollection';

// create and export the model
// models can only be created once, that is why this code is put into its own module
module.exports = mongoose.model('PasswordEntry', passwordEntrySchema, collectionName);
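The effect of the toJSON transform defined above can be illustrated in plain JavaScript. The following standalone sketch applies the same renaming and deletion to an object shaped like a serialized Mongoose document:

```javascript
// Standalone sketch of the toJSON transform from the schema above:
// rename _id to id and drop the internal __v version field.
function transform(ret) {
  ret.id = ret._id;
  delete ret._id;
  delete ret.__v;
  return ret;
}
```

With this transform, an object like { _id: 'abc', __v: 0, password: 'x' } serializes as { id: 'abc', password: 'x' }, which is usually the shape a frontend expects.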

Once you have that centralized module, you can import the model into other places. Here is how to create a document and save it:

var express = require('express');
var router = express.Router();

const mongoose = require('mongoose');
var PasswordEntryModel = require('../mongo/mongo.js');

router.post('/add', function (req, res, next) {

  console.log('/add ', req.body);

  var myData = new PasswordEntryModel(req.body);
  myData.save()
    .then(item => {
      res.send("item saved to database");
    })
    .catch(err => {
      res.status(400).send("unable to save to database");
    });

  console.log('/add end');
});

module.exports = router;

It is assumed that a JSON object describing a PasswordEntry is POSTed to this route.

C++ on MacOS with Eclipse CDT

The Eclipse C/C++ Development Tooling (CDT) adds C++ support to Eclipse.

The installation URL is https://download.eclipse.org/tools/cdt/releases/9.6/. Using this URL, you can install CDT from within Eclipse using the “Install new Software” feature.

Eclipse CDT requires a C++ compiler to be installed on the system. Under MacOS the easiest way to install a compiler is to install Xcode from the App Store. This also installs the lldb debugger, which Eclipse CDT supports. The latest version of Xcode does not come with the gdb debugger, which is the default debugger in a vanilla installation of CDT. That means you have to switch Eclipse CDT to the lldb debugger instead of gdb.

In order to debug using lldb, you have to edit the workspace settings and change the default debugger from gdb to lldb. The settings for the debugger are contained in Eclipse Menu > Preferences > Run/Debug > Launching > Default Launchers. In the submenu, change the Debug option to lldb and save.

React Redux Cryptofrontend

This post describes how the Cryptofrontend was created.

Description of the Crypto-Application

The crypto app stores password entries on a server (which for this post is assumed to be a working prerequisite), and it has a frontend that displays the entries to the user. The frontend will look very similar to an email client. On the left, it lists all entries. The entries in the list show only part of the information, so the user can quickly decide which entry to use. Once the user clicks one of the entries, the full data of that entry is displayed in a details view on the right side of the application.

Basic CRUD operations on the password entries will be possible. If the user enters the master password in the details view, the password will be decrypted and copied into the users clipboard if that is possible. Otherwise the app just displays the plaintext password for the user to copy. The decryption and encryption happens in the user’s browser. Only encrypted data is transmitted to and from the backend.

Creating the Project Structure

Update Node using the Node version manager (nvm) to the latest version; otherwise the later steps will probably fail. First find the latest version using nvm ls-remote, then install it using nvm install <version>. nvm automatically makes the installed version current, so it is used when executing npm and node commands.

The first step is to generate the empty project structure using the create-react-app command.

npx create-react-app cryptofrontend

Make sure that the generation finished successfully! The generator creates a cryptofrontend folder, which can be opened in Visual Studio Code. If you type yarn start inside the project folder, a development server starts and the site opens automatically in a browser.

Organizing the Code

All actions and thunk functions go into the actions folder. The actions folder will contain all objects that can be dispatched. A better name for the actions folder would be dispatchables.

All reducers go into the reducers folder.

The store creation goes into the store folder.

Installing the required npm Packages

npm i cross-fetch
npm i redux
npm i react-redux
npm i redux-logger
npm i redux-thunk

Defining the Redux State

First, let’s define what state the Redux container will hold. This example JavaScript object outlines what the state comprises. There is the id of the currently selected password entry; if no entry is selected, the value is either absent or less than or equal to zero. After the selected id, there is an array of objects, each denoting one password entry.

{
  selected_entry_id: 1,
  entries: [
    {
      id: 1,
      password: 'ad2jd92jd9dj9j9'
    },
    {
      id: 2,
      password: 'sdfsdfsfsfsdfsdf'
    }
  ]
}
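Given this state shape, a small selector function can look up the currently selected entry. This helper is not part of the original application code; it is a hypothetical convenience function that follows the rule stated above (an id that is absent or less than or equal to zero means nothing is selected):

```javascript
// Hypothetical selector: returns the selected entry object,
// or null if no entry is selected (id missing or <= 0).
function getSelectedEntry(state) {
  const id = state.selected_entry_id;
  if (id === undefined || id <= 0) return null;
  return state.entries.find(entry => entry.id === id) || null;
}
```

Selectors like this keep components from depending on the exact layout of the state object.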

Retrieving Entries from the Server

Let’s develop the application in an iterative fashion by adding little bits of functionality until the app is complete. That way the entire complexity will not hit us as hard. In a sense, this lets us apply the principle of divide and conquer.

The first iteration will retrieve entries from the server and output the entries to the developer console. The entries have to be fetched from the backend and they have to be stored in the Redux store. Normally to alter the store, an action has to be dispatched and a reducer will create the new state given the current state and the values contained in the action. In order to talk to a backend, instead of dispatching an action and defining a reducer, the thunk middleware is used which makes it possible to dispatch a function.

This function is called fetchEntries(). Once fetchEntries() has received the entries from the server, it dispatches an action that carries the retrieved entries. This action is handled by a reducer that inserts the entries into the state.
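The core idea of the thunk middleware is small: if the dispatched value is a function, call it with dispatch instead of forwarding it to the reducers. The following is a minimal sketch of that idea (not the real redux-thunk implementation; createMiniStore is a made-up name for illustration):

```javascript
// Minimal sketch of the thunk idea: a dispatch wrapper that
// invokes functions instead of forwarding them to the reducer.
function createMiniStore(reducer) {
  let state = reducer(undefined, { type: '@@INIT' });
  function dispatch(action) {
    if (typeof action === 'function') {
      // a thunk: let it dispatch real actions itself
      return action(dispatch);
    }
    state = reducer(state, action);
    return action;
  }
  return { dispatch, getState: () => state };
}
```

A thunk such as dispatch => dispatch({ type: 'INC' }) can now be dispatched just like a plain action object, which is exactly what fetchEntries() relies on.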

Reducers

Let's first talk about the reducers. As the first iteration only cares about the entries, there is only one reducer, which inserts entries into the store. The root reducer will consist of only this one reducer.

import { combineReducers } from 'redux'
import {
  RECEIVE_ENTRIES
} from '../actions/actions.js'

// deals with entries
function entriesReducer(state = {}, action) {

  // switch over all action types
  switch (action.type) {

    // entries were retrieved from the backend
    case RECEIVE_ENTRIES:

      // this reducer takes the entries from the action and copies
      // them into the store
      return Object.assign({}, state, {
        entries: action.entries
      })

    default:
      return state
  }
}

The entriesReducer only reacts to the RECEIVE_ENTRIES action and it will just copy the entries that the action contained into the store. The file also builds the root reducer:

// build the rootReducer from several partial reducers
const rootReducer = combineReducers({
  entriesReducer
});

// default export of this file
export default rootReducer;

The root reducer currently only consists of the entriesReducer because there is no other reducer right now. (A currentlySelectedEntryReducer will follow in upcoming iterations).
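Note that combineReducers namespaces the state: with the code above, the entries end up under state.entriesReducer, not at the top level. A simplified sketch of what combineReducers does internally (illustrative, not the real Redux source) makes this visible:

```javascript
// Simplified sketch of Redux's combineReducers: each child reducer
// manages its own slice of the state, keyed by the reducer's name.
function combineReducersSketch(reducers) {
  return function rootReducer(state = {}, action) {
    const nextState = {};
    for (const key of Object.keys(reducers)) {
      // each child reducer only ever sees its own slice
      nextState[key] = reducers[key](state[key], action);
    }
    return nextState;
  };
}
```

Keeping this in mind avoids a common surprise when later reading the state back out of the store.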

The next thing needed is the RECEIVE_ENTRIES action itself.

Actions, Thunks and other Dispatcheable Objects

import fetch from 'cross-fetch'

export const RECEIVE_ENTRIES = 'RECEIVE_ENTRIES'

// this function is called an action creator as it constructs
// and returns an action
function receivePostsActionCreator(json) {
  console.log('receivePostsActionCreator ', json);
  return {
    type: RECEIVE_ENTRIES,
    entries: json
  }
}

// this function is a thunk. It is dispatchable and performs
// the server communication. Once it is done, it calls
// the action creator to dispatch the RECEIVE_ENTRIES action
export function fetchEntriesThunk() {
  console.log('fetchEntriesThunk');
  return dispatch => {
    return fetch(`http://localhost:8081/todos`)
      .then(response => response.json())
      .then(json => dispatch(receivePostsActionCreator(json)))
      .catch((error) => {
        console.log("reset client error-------", error);
      });
  }
}

This file contains one action creator and one thunk. The thunk will use the action creator to dispatch the RECEIVE_ENTRIES action once it has retrieved all entries from the backend. The entries are put into the RECEIVE_ENTRIES action by the action creator. The action is dispatched and handled by the reducer defined in the last section. The reducer will create a new state from the entries contained in the action. This is how the entries travel from the backend all the way into the store.
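The whole path from action creator to new state can be exercised without a server. The sketch below wires copies of the reducer and the action creator together by hand, with the fetch step replaced by a plain array standing in for the backend response:

```javascript
// Hand-wired walkthrough of the data flow described above:
// action creator -> action -> reducer -> new state.
const RECEIVE_ENTRIES = 'RECEIVE_ENTRIES';

function receivePostsActionCreator(json) {
  return { type: RECEIVE_ENTRIES, entries: json };
}

function entriesReducer(state = {}, action) {
  switch (action.type) {
    case RECEIVE_ENTRIES:
      return Object.assign({}, state, { entries: action.entries });
    default:
      return state;
  }
}

// pretend this JSON came back from the backend
const json = [{ id: 1, password: 'secret' }];
const newState = entriesReducer({}, receivePostsActionCreator(json));
```

After this runs, newState.entries holds the simulated backend data, which is exactly what the store contains once the real thunk has completed.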

Store Creation

This file creates a store and adds the thunk and logger middleware:

import rootReducer from '../reducers/reducers.js';
import { createStore, applyMiddleware } from 'redux'
import thunkMiddleware from 'redux-thunk'
import { createLogger } from 'redux-logger'

const loggerMiddleware = createLogger()

function configureStore(preloadedState) {
  return createStore(
    rootReducer,
    preloadedState,
    applyMiddleware(thunkMiddleware, loggerMiddleware)
  )
}

export default configureStore;

The configureStore() function is exported. It creates the store and adds the rootReducer. It also accepts an optional initial state, which we do not use here. Finally, it applies middleware: the thunk middleware, which allows dispatching functions instead of only action objects, and a logger middleware, which logs every state change to the developer console.
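Both middlewares follow Redux's standard signature store => next => action. A minimal, hypothetical logger in that shape (not the real redux-logger implementation) records the actions it sees and then passes them along:

```javascript
// Minimal middleware sketch in Redux's store => next => action shape.
// Instead of console output, it collects the action types it has seen,
// which makes the behavior easy to inspect.
const seenActions = [];

const miniLogger = store => next => action => {
  seenActions.push(action.type); // a real logger would also log the state
  return next(action);
};

// tiny harness standing in for applyMiddleware: wraps a dispatch function
function applyMiniMiddleware(dispatch, middleware) {
  const fakeStore = {}; // the middleware does not use the store here
  return middleware(fakeStore)(dispatch);
}

const baseDispatch = action => action;
const dispatch = applyMiniMiddleware(baseDispatch, miniLogger);
```

Every action dispatched through the wrapped dispatch is first seen by the middleware and then handed to the next stage, which is how redux-logger and redux-thunk compose in the store above.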

In the app.js file, testing code is added. The testing code will also fulfill all our goals that we set for the first iteration. It creates the store, dispatches the thunk and by doing so, retrieves data from the server and inserts that data into the store.

import configureStore from './store/store.js'
import { fetchEntriesThunk } from './actions/actions.js'

// create a store
const store = configureStore();

// Log the initial state
//console.log(store.getState());
store.dispatch(fetchEntriesThunk());
//console.log(store.getState());

When you load the cryptofrontend in your browser, this code is executed and the backend is accessed. In the developer tools console of your browser, you should see all the state changes that the store is going through.

Summary for the First Iteration

Without using any components, the goal of the first iteration was to integrate the Redux Store to the backend server. There is now a thunk that GETs all entries from the backend.

Adding an Entry into the Redux Store

This thunk will POST an entry to the backend:

export function addEntryThunkActionCreator() {

  // DEBUG
  console.log('addEntryThunkActionCreator');

  // return the thunk
  return dispatch => {
    return fetch(`http://localhost:8081/todos/add`, {
      method: 'POST',
      headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ id: 0, password: 'Textual content' })
    })
      .then(response => response.json())
      .catch((error) => {
        console.log("POST addentry error ", error);
      });
  }
}

Note that the new entry is not added to the Redux store; it is only POSTed to the backend. Given this code, the app has to dispatch the thunk that retrieves all entries in order to sync its state with the backend. An alternative would be to dispatch an action in the success case that appends the new entry to the Redux store state; this alternative saves a lot of network traffic.
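The append-locally alternative could look like the following sketch. The ADD_ENTRY_LOCAL action name and the helper functions are made up for illustration; they are not part of the application code above:

```javascript
// Hypothetical action + reducer case that appends one new entry
// to the existing entries array instead of re-fetching all entries.
const ADD_ENTRY_LOCAL = 'ADD_ENTRY_LOCAL';

function addEntryLocalActionCreator(entry) {
  return { type: ADD_ENTRY_LOCAL, entry };
}

function appendEntry(state, action) {
  // concat returns a new array, so the previous state stays untouched
  return Object.assign({}, state, {
    entries: (state.entries || []).concat(action.entry)
  });
}
```

The thunk would dispatch addEntryLocalActionCreator(newEntry) in its .then() success handler, and a reducer case for ADD_ENTRY_LOCAL would call appendEntry.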

Selecting an Entry in the Redux Store

The next iteration will add an action for selecting one of the entries. The idea is that after selecting an entry, the details view renders all of that entry's details. The currently selected item is part of the state in the store; that is why the store will be extended by an action for selecting an entry. Thanks for reading, and stay tuned for the next iteration.

Building the components


React Redux Development

For a beginner, it can be very confusing to create a React Redux app, as a lot of new concepts have to be understood and combined before anything sensible can be built. This post sketches out a process for getting a React Redux app up and running in a controlled manner.

The steps are ordered from the backend towards the frontend of the application: starting with the backend that exposes an API, then the Redux model/container, and finally the React components in the frontend.

Creating the Backend

  1. Create the database schema in a database
  2. Create a node server using express

Creating the Redux Part

  1. Define the JavaScript Object that describes the applications state
  2. Define the actions that can be performed on the state
  3. Define the action creators that create the actions
  4. Define the reducers for the actions. The default value of the reducer's state parameter defines the initial state of the Redux container.
  5. Define the root reducer as a combinedReducer from all reducers.
  6. Define thunks that retrieve data from the backend
  7. Create the container with the initial state
  8. Write a test for the entire Redux part of the application

Creating the React Part

  1. Define the components. Define which properties the components have, i.e., which properties the components take their data from when rendering the GUI. The properties of a component act as its interface. It is the task of the React-Redux mapStateToProps function to adapt the Redux container state to the props of the component.
  2. Connect the components to Redux using React-Redux. Use mapStateToProps to map the Redux container state to the components' properties.
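As a sketch of the last step: mapStateToProps is just a pure function from the store state to the props object of a component. The state shape below matches the crypto example from earlier; the prop names are hypothetical.

```javascript
// mapStateToProps is a pure function: store state in, component props out.
function mapStateToProps(state) {
  return {
    entries: state.entries || [],
    selectedEntryId: state.selected_entry_id
  };
}
```

Because it is a plain function, it can be unit tested without rendering any component at all.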

Node and Express

Node allows you to start servers written in JavaScript. For executing the JavaScript it uses Chrome’s V8 JavaScript engine.

Express is a framework based on Node that allows you to easily implement an API. You can define an API using the routing features of Express. Routing is Express's term for the definition of how a server answers web requests.

Generating a project skeleton

Express has a tool that initializes a basic project structure. First install the tool, then use it:

npm install express-generator -g
express --view=pug <Application Name>
cd <Application Name>
npm install

Then you can start the server.

DEBUG=<Application Name>:* npm start

You can visit http://localhost:3000/ to see the server's response to a GET request to the root URL.

Adding a router

A router defines the server's reaction to calls to a URL. A router is registered, or bound, to a URL in the app.js file:

var todosRouter = require('./routes/todos');
app.use('/todos', todosRouter);

The code above imports a router (the router itself is explained below) and then binds that router to the /todos URL. All calls to http://localhost:3000/todos will now be handled by the todosRouter.

In order to add the router itself, create a new file in the routes folder called todos.js for the todosRouter.

var express = require('express');
var router = express.Router();

router.get('/', function (req, res, next) {
  res.setHeader('Content-Type', 'application/json');
  res.send(JSON.stringify({ a: 1 }));
});

module.exports = router;

Note that this router never refers to the /todos URL! Instead, it defines a routing for GET requests to / (root). There is one important thing to understand here: all URLs inside a router definition are relative.

A router does not care which URL it is bound to, instead it only defines relative URLs inside its own local context. The absolute URL of a router is constructed from appending the local URLs to the URL that the router is currently bound to.

In the context of the current example, this means that the root of the todosRouter router actually resolves to the absolute URL http://localhost:3000/todos/, because the router was bound to http://localhost:3000/todos.
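The rule can be stated as a tiny function: the absolute URL of a route is the mount path joined with the route's relative path. This is an illustrative helper, not Express code:

```javascript
// Illustrative helper: how Express conceptually resolves a route's
// absolute path from the mount point and the router-relative path.
function absolutePath(mountPath, relativePath) {
  const base = mountPath.replace(/\/$/, '');   // strip trailing slash
  const rel = relativePath.replace(/^\//, ''); // strip leading slash
  return rel === '' ? base + '/' : base + '/' + rel;
}
```

So a router bound to '/todos' with a route for '/' answers at '/todos/', and a route for '/add' in the same router would answer at '/todos/add'.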

Connect to An SQL Database

Install the Node SQL driver:

npm install mssql --save

(Note that mssql is the driver for Microsoft SQL Server; for MySQL, the mysql or mysql2 package is used instead, as in the Sequelize section below.)

Connect to the Database in a Router (MySQL as an example)

Create a new schema:

CREATE SCHEMA `cash` ;

Create a table in the schema:

CREATE TABLE `cash`.`account_entry` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `amount` INT NULL COMMENT 'Amount in least significant currency units of the currency of the entries account. E.g. Cent for EUR if the account has EUR set as a currency.',
  PRIMARY KEY (`id`));

Define a connection settings object in one of the router files of the express project and use that connection settings object to connect to the schema using a user on the database server.

var express = require('express');
const sql = require('mssql');
var router = express.Router();

const sqlConfig = {
    user: 'root',
    password: 'test',
    server: '127.0.0.1:3306',
    database: 'cash'
}

/* GET home page. */
router.get('/', function (req, res, next) {

    const connection = sql.connect(sqlConfig, (err) => {
        if (err) {
            console.log('error');
            res.send('error');
        } else {
            res.send('DB connected');
        }
    });

    //res.render('index', { title: 'Express' });

});

module.exports = router;
Using an ORM Mapper (Sequelize)

An ORM mapper abstracts away all SQL interface code for translating between your programming language of choice and the SQL server. Instead of dealing with SQL queries, the ORM mapper lets you store and load data using objects used in your application.

When I started writing applications that used SQL databases as datastores, I wanted to write all the SQL code myself and did not familiarize myself with ORM mappers at all. It is fun to write your own DAO and DTO class hierarchy to map to and from the SQL database, but only for the first handful of classes. As projects grow and more requirements, features, and therefore objects to manage are added, you quickly find yourself replicating more or less dumb boilerplate code over and over for every small, insignificant new object. In reality, the only thing you should care about is writing code for business logic. You should not spend time on SQL storage code if at all avoidable.

Another scenario is that of a high-performance application. If you are required to squeeze the last millisecond out of your application, then maybe write the SQL manually. In all other cases, I recommend at least looking into the idea of using an ORM mapper. It will pay off quickly.

I really do not know which ORM mapper suits your needs best, but as an example, let's use Sequelize. Sequelize connects to the SQL server internally, so it is possible to remove the SQL driver dependency from the example above.

Add sequelize to the project

npm install sequelize

As MySQL is being used in this example, install the MySQL driver (pg and pg-hstore would only be needed for PostgreSQL).

npm install --save mysql2

Add code to a router that inserts a new row into a table and returns the inserted element.

var express = require('express');
var Sequelize = require('sequelize');
var router = express.Router();

const sqlConfig = {
    user: 'root',
    password: 'test',
    server: '127.0.0.1:3306',
    database: 'cash'
}

var sequelize = new Sequelize(sqlConfig.database, sqlConfig.user, sqlConfig.password, {
    host: 'localhost',
    port: 3306,
    dialect: 'mysql',

    pool: {
        max: 5,
        min: 0,
        idle: 10000
    }
});

var account_entry = sequelize.define('account_entry', {
    'id': {
        type: Sequelize.INTEGER(11),
        allowNull: false,
        primaryKey: true,
        autoIncrement: true
    },
    'amount': { type: Sequelize.INTEGER }
});

/* GET home page. */
router.get('/', function (req, res, next) {

    sequelize.sync().then(function () {
        return account_entry.create({
            amount: 123
        });
    }).then(function (new_account_entry) {

        const msg = new_account_entry.get({
            plain: true
        });
        console.log(msg);
        res.send(msg);
    });
});

module.exports = router;

Add the router to the application configuration (app.js)

var createError = require('http-errors');
var express = require('express');
var path = require('path');
var cookieParser = require('cookie-parser');
var logger = require('morgan');

var indexRouter = require('./routes/index');
var usersRouter = require('./routes/users');
var accountEntriesRouter = require('./routes/account_entries');

var app = express();

// view engine setup
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'pug');

app.use(logger('dev'));
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, 'public')));

app.use('/', indexRouter);
app.use('/users', usersRouter);
app.use('/account_entries', accountEntriesRouter);

// catch 404 and forward to error handler
app.use(function (req, res, next) {
    next(createError(404));
});

// error handler
app.use(function (err, req, res, next) {
    // set locals, only providing error in development
    res.locals.message = err.message;
    res.locals.error = req.app.get('env') === 'development' ? err : {};

    // render the error page
    res.status(err.status || 500);
    res.render('error');
});

module.exports = app;

Delete the account_entry table that was created manually earlier. The ORM mapper will automatically generate a table creation SQL statement from the type definition and execute that statement against the MySQL server.

When you call http://localhost:3000/account_entries, the table is created in the MySQL schema (as it did not exist already) and an account_entry row is inserted into the database. All that is possible without writing a single line of SQL, thanks to the ORM mapper, which intercepts your request to store an object and does all the required SQL boilerplate for you!


React Redux Fetch Data from an API with Redux Thunk

The GUI of a web app often has to retrieve its data from a backend system. In this post, we assume that the backend provides the data via an API that supports GET requests. This post rehashes the current best practices for calling a backend API and retrieving the data, as described in the Redux Advanced Tutorial.