These are my notes on what I currently understand about this topic.
Reason for learning: to apply it in my own system.
It is the concept of a data pipeline (ETL): the process where data is pulled from one place (extract), reshaped/cleaned (transform), and written into another place (load).
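The three steps above can be sketched in plain PHP. This is a minimal, self-contained illustration with made-up data (the CSV content, column layout, and function names are my own, not from any real pipeline):

```php
<?php
// Extract: parse CSV text into rows.
function extractRows(string $csv): array
{
    $lines = array_filter(explode("\n", trim($csv)));
    return array_map('str_getcsv', $lines);
}

// Transform: trim whitespace and normalise emails to lowercase.
function transformRows(array $rows): array
{
    return array_map(function (array $row) {
        return [trim($row[0]), strtolower(trim($row[1]))];
    }, $rows);
}

// Load: append rows into the destination (an array standing in for a DB table).
function loadRows(array $rows, array &$table): void
{
    foreach ($rows as $row) {
        $table[] = $row;
    }
}

$csv = "Alice , ALICE@EXAMPLE.COM\nBob , Bob@Example.com";
$table = [];
loadRows(transformRows(extractRows($csv)), $table);
print_r($table); // each name trimmed, each email lowercased
```

In a real system the "load" step would insert into a database instead of an array, but the shape of the pipeline is the same.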
Imagine you have an external SFTP server that needs to integrate with your system: SFTP Server → My System (I'm using Laravel).
The tasks of my system are:
- Load (receiving the results)
- Extract (sending out the results)
You will also need scripts in Laravel and credentials to access the SFTP server.
File formats used: CSV/ZIP
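Laravel can talk to an SFTP server through its filesystem layer. A sketch of the disk configuration, assuming the `league/flysystem-sftp-v3` package is installed; the env variable names and the `/exports` root are hypothetical placeholders:

```php
// config/filesystems.php (inside the 'disks' array)
'sftp' => [
    'driver'   => 'sftp',
    'host'     => env('SFTP_HOST'),
    'username' => env('SFTP_USERNAME'),
    'password' => env('SFTP_PASSWORD'),
    'root'     => env('SFTP_ROOT', '/exports'), // hypothetical remote folder
],
```

With this in place, `Storage::disk('sftp')->files()` lists the remote folder and `Storage::disk('sftp')->get($path)` downloads a file.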
Here are other important pieces needed in the process.
1. Scheduler: runs automatically at a fixed time. For example, every night at 2am my system automatically checks the SFTP folder for new files from the server. Whether files exist or not, the scheduler still runs.
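In Laravel, the 2am check above can be registered in the console kernel. `dailyAt('02:00')` is real Laravel scheduling API; the `sftp:import` command name is a hypothetical example:

```php
// app/Console/Kernel.php
use Illuminate\Console\Scheduling\Schedule;

protected function schedule(Schedule $schedule): void
{
    // Runs every night at 2am whether or not new files exist;
    // the command itself decides what to do when the folder is empty.
    $schedule->command('sftp:import')->dailyAt('02:00');
}
```

The server's cron only needs one entry (`php artisan schedule:run` every minute); Laravel decides which scheduled tasks are due.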
2. Event Trigger: runs only when something specific happens, not on a schedule. Like a motion-sensor light, it only turns on when it detects movement.
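Laravel ships its own Events/Listeners system for this; to show just the idea, here is a framework-free sketch where nothing runs on a timer and the handler fires only when the event is raised (the event name and payload are made up):

```php
<?php
// Register a handler for a named event.
function on(string $event, callable $handler, array &$listeners): void
{
    $listeners[$event][] = $handler;
}

// Raise an event: every handler registered for it runs with the payload.
function trigger(string $event, mixed $payload, array $listeners): array
{
    $results = [];
    foreach ($listeners[$event] ?? [] as $handler) {
        $results[] = $handler($payload);
    }
    return $results; // what each handler returned; empty if nobody listens
}

$listeners = [];
on('file.received', fn (string $path) => "processing $path", $listeners);

// Fires only when a new file is actually detected:
trigger('file.received', 'report.csv', $listeners);
```

Unlike the scheduler, `trigger()` never fires on its own; if no file arrives, nothing runs.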
3. Batch: a grouped collection of records processed together in one go, instead of one by one. It's about how the data is processed: batch means "collect everything, process it all at once".
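A small self-contained sketch of batching in PHP, using `array_chunk` to group records; the batch size of 3 and the function name are arbitrary choices for illustration:

```php
<?php
// Process records in batches instead of one by one.
// Returns the size of each batch that was handled, so the grouping is visible.
function processInBatches(array $records, int $batchSize): array
{
    $handled = [];
    foreach (array_chunk($records, $batchSize) as $batch) {
        // In a real pipeline this would be one bulk DB insert per batch
        // rather than one insert per record.
        $handled[] = count($batch);
    }
    return $handled;
}
```

For example, 7 records with a batch size of 3 are handled as three groups of sizes 3, 3, and 1. Fewer round-trips to the database is the usual reason to batch.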
[The process Batch Out:]
