
ExpressJS and NodeJS module

 While refreshing my skill set, I became acquainted with the ExpressJS framework. Different frameworks share similar fundamental concepts. Here, I document the development process starting from the basic initial structure.

1. Install a basic ExpressJS project

npm install -g express-generator
express --view=jade 2403_express_sampleexpress_sample
cd 2403_express_sampleexpress_sample
npm install express

At this point, you should have the basic structure of ExpressJS set up. Next, install Mongoose, which is used with MongoDB.

npm install mongoose

Run the first test:

npm run start
(Ctrl+C to stop)


2. Checking the ExpressJS structure

2-1. The startup file for ExpressJS

In ./package.json, find the "scripts" section; its "start" entry records the startup file.

Add another startup entry: "dev": "node ./test/loc". Then create the loc.js file at that path. This entry is for testing and is run with:

npm run dev
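For reference, the "scripts" section of ./package.json then looks roughly like this; the "start" value is whatever the generator produced, and only the "dev" line is new:

"scripts": {
  "start": "node ./bin/www",
  "dev": "node ./test/loc"
}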

2-2. Find the port of the ExpressJS service

In the ExpressJS startup file `./bin/www.js`, searching for "port" reveals the comment "Get port from environment and store in Express": the port is taken from the PORT environment variable, with 3000 as the default.
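The relevant lines in that file follow the generator template and look roughly like this:

// Get port from environment and store in Express.
var port = normalizePort(process.env.PORT || '3000');
app.set('port', port);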

When ExpressJS is running, using a browser to navigate to "http://localhost:3000/" will display the default page.

2-3. ExpressJS routing

All of ExpressJS's routing modules live in the ./routes folder; the default pages are index.js and users.js.
Page URIs are set in ./app.js.
Searching for "routes/index" in app.js shows that the modules from the `./routes` directory are added individually.
After "routes/users" is loaded it is named `usersRouter`, and when it is mounted into the main structure its URI is set to '/users':

app.use('/users', usersRouter);
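For context, the surrounding lines generated in app.js look roughly like this: each route file is required into a router variable and then mounted on a URI with app.use():

var indexRouter = require('./routes/index');
var usersRouter = require('./routes/users');

app.use('/', indexRouter);
app.use('/users', usersRouter);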

Opening `./routes/users.js`, you'll notice that the router does not handle the '/users' prefix itself; its paths are relative to the mount point:

router.get('/', function(req, res, next) ...

The router handles the parameter values that come after the mount point:

router.get('/:id', async function(req, res) {
  console.log("book id: " + req.params.id);
  // ... handle the request for this id ...
});

Since the primary resources are set up in `./app.js`, I also placed the task of connecting Mongoose to MongoDB in this file.
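A minimal sketch of that connection in app.js, assuming a local MongoDB instance and a hypothetical database name bookdb:

var mongoose = require('mongoose');

// connect() returns a promise; report success or failure once it settles
mongoose.connect('mongodb://127.0.0.1:27017/bookdb')
  .then(function () { console.log('MongoDB connected'); })
  .catch(function (err) { console.error('MongoDB connection error:', err); });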

3. Build a NodeJS module

3-1. Initialize module 

mkdir -p modules/bookmgr
cd modules/bookmgr
npm init

At this stage, npm prompts a series of questions to help developers set up the basic parameters needed for the module. The default name of the module is the directory name. 

One of the questions is about the entry point of the module. npm only creates package.json; when the module is loaded, Node starts from the entry file recorded in that json.
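The resulting modules/bookmgr/package.json, trimmed to the relevant fields, looks roughly like this; the "main" field is the entry file Node loads when the module is required (index.js is the default answer):

{
  "name": "bookmgr",
  "version": "1.0.0",
  "main": "index.js"
}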

3-2. Build an object within the module

Modules are obtained with "require" rather than "import"; for an object to be considered part of a module, it has to live inside the module and be acquired via "require":

var somename = require([relative path]);

"require" doesn't create a new instance, so to create an instance of a class, you need to initialize it yourself, and the framework may not execute it immediately. This results in module internal console.log() being immediate, but this.log being slightly delayed. 

Keeping a single, consistent log variable inside the module makes it easier to manage, although it is not very useful for real-time debugging; until the module stabilizes, there will likely be many scattered console.log() statements.

Notes for the module:

module.exports: only through it can the main project locate and load this module.

exports.init: only through it can other files call the module's public functions. The CommonJS exports object is used; the ES "export" keyword is not.
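Putting these notes together, a minimal, illustrative index.js for the module could look like the sketch below (the class and function names are placeholders, not the actual bookmgr code):

// modules/bookmgr/index.js -- illustrative sketch only

// runs immediately, at require() time
console.log('bookmgr module loaded');

class BookMgr {
  constructor() {
    // instance-level log helper; nothing is printed until the
    // main project actually creates the instance and calls it
    this.log = function (msg) { console.log('[bookmgr] ' + msg); };
  }
}

// exports.init attaches init to module.exports, which is what
// require('bookmgr') hands back to the main project
exports.init = function () {
  var mgr = new BookMgr();   // require() did not construct this; init() does
  mgr.log('bookmgr initialized');
  return mgr;
};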

3-3. Install module 

Go back to the root of the ExpressJS project (here, 2403_express_sampleexpress_sample/).

npm install ./modules/bookmgr

After installation, the module appears in ./node_modules/bookmgr.

Any subsequent changes in the original directory (./modules/bookmgr) have to be propagated to ./node_modules by running the install again.
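Once installed, the main project can load the module by name; a minimal sketch, assuming the init() function described in the notes above:

// in ./app.js of the ExpressJS project
var bookmgr = require('bookmgr');   // resolved from ./node_modules/bookmgr
var mgr = bookmgr.init();           // create and initialize the module's object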


sample code: https://github.com/atfuture7/testcode/tree/master/nodejs/10_mod_nodjs/
