DEV Community

Noel Worden


Tweaking Logger Outputs on the Fly

The best thing I learned this week isn't exactly a mind bender, but it saves some clicks and is worth writing about: you can pass custom logger options directly into an execution script.

I've been working on an Elixir/Phoenix project, and since the production environment went live we have had to take steps to ensure data insertions run correctly across multiple databases. The basic components are a CSV containing the data, a file containing the insertion functions, and a script to execute the process, which looks like this:

docker-compose exec web elixir -S mix run -e 'MyApp.Manipulations.TestDataManipulation.run()'
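The module named in that command isn't shown in the post; a minimal sketch of what such a manipulation module could look like (the module name comes from the command above, but the CSV path, row layout, and logging format are assumptions):

```elixir
defmodule MyApp.Manipulations.TestDataManipulation do
  require Logger

  # Hypothetical: stream each CSV row and insert it, logging the result.
  # The actual insertion functions would build and persist Ecto records here.
  def run do
    "priv/data/test_data.csv"          # assumed CSV location
    |> File.stream!()
    |> Stream.drop(1)                  # skip the header row
    |> Enum.each(fn line ->
      [legacy_id, name] = line |> String.trim() |> String.split(",")
      # ... build and insert agent / form_a / merchant_account records ...
      Logger.info("Inserted data for #{inspect(%{name: name, legacy_id: legacy_id})}")
    end)
  end
end
```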

The logger output wasn't much of a concern until we started running these scripts on our staging database and needed to be absolutely sure the data was inserting as expected. The process of inserting data into staging was:

  • Dry run the script locally
  • Save the logger output to a file
  • Dry run the script on staging
  • Save that logger output to a file
  • Diff the two files
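The steps above boil down to capturing each run's logger output and diffing the two files. A minimal local simulation of the capture-and-diff step (the log contents here are illustrative; in practice each file would be the redirected output of a `docker-compose exec ... mix run` dry run):

```shell
# Simulate the two captured logger outputs (contents are illustrative)
echo '[info] Inserted data for %{name: "PERSON ONE", legacy_id: "115"}' > local_run.log
echo '[info] Inserted data for %{name: "PERSON ONE", legacy_id: "115"}' > staging_run.log

# diff prints nothing and exits 0 when the two runs inserted identically
diff local_run.log staging_run.log && echo "outputs match"
```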

While test-running this the first time, I realized that our local dev environment was printing a much more detailed log than staging, which created an issue when attempting to diff the two files.

The local logging:

[debug] QUERY OK db=4.8ms queue=40.1ms idle=0.0ms
begin []
[debug] QUERY OK source="agents" db=2.9ms
SELECT a0."id", a0."name" FROM "agents" AS a0 WHERE (a0."legacy_id" = $1) [115]
[debug] QUERY OK source="form_as" db=8.1ms
SELECT s0."id", s0."vendors", s0."revenue_share", s0."legacy_id", s0."inserted_at"...
INSERT INTO "agents" ("legacy_id","name","inserted_at","updated_at") VALUES ($1,$2,$3,$4) ...
[debug] QUERY OK db=6.5ms
INSERT INTO "form_as" ("legacy_id","revenue_share","vendors","inserted_at","updated_at") ...
[debug] QUERY OK db=7.0ms
INSERT INTO "merchant_accounts" ("agent_id","mid","form_a_id","vendors","inserted_at"...
[info] Inserted data for %{name: "PERSON ONE", a_la_row_id: "115"}: %{agent: 1, merchant_account: 1, form_a: 1}
[debug] QUERY OK source="agents" db=0.4ms decode=0.1ms
SELECT a0."id", a0."name", a0."phone_number", a0."legacy_id", a0."inserted_at"...
[debug] QUERY OK source="form_as" db=0.3ms
SELECT s0."id", s0."vendors", s0."revenue_share", s0."legacy_id", s0."inserted_at"...
[debug] QUERY OK db=0.4ms
INSERT INTO "agents" ("legacy_id","name","inserted_at","updated_at") VALUES ($1,$2,$3,$4) ...
[debug] QUERY OK db=0.5ms
INSERT INTO "form_as" ("legacy_id","revenue_share","vendors","inserted_at","updated_at") ...
[debug] QUERY OK db=3.3ms
INSERT INTO "merchant_accounts" ("agent_id","mid","form_a_id","vendors","inserted_at"...
[info] Inserted data for %{name: "PERSON TWO", a_la_row_id: "123"}: %{agent: 1, merchant_account: 1, form_a: 1}

And the logging from staging:

[info] Inserted data for %{name: "PERSON ONE", legacy_id: "115"}
[info] Inserted data for %{name: "PERSON TWO", legacy_id: "123"}
[info] Inserted data for %{name: "PERSON THREE", legacy_id: "73"}
[info] Inserted data for %{name: "PERSON FOUR", legacy_id: "17"}
[info] Inserted data for %{name: "PERSON FIVE", legacy_id: "117"}
[info] Inserted data for %{name: "PERSON SIX", legacy_id: "84"}
[info] Inserted data for %{name: "PERSON SEVEN", legacy_id: "34"}
[info] Inserted data for %{name: "PERSON EIGHT", legacy_id: "69"}
[info] Inserted data for %{name: "PERSON NINE", legacy_id: "16"}
[info] Inserted data for %{name: "PERSON TEN", legacy_id: "114"}
[info] Inserted data for %{name: "PERSON ELEVEN", legacy_id: "78"}
[info] Inserted data for %{name: "PERSON TWELVE", legacy_id: "134"}
[info] Inserted data for %{name: "PERSON THIRTEEN", legacy_id: "113"}
[info] Inserted data for %{name: "PERSON FOURTEEN", legacy_id: "152"}
[info] Inserted data for %{name: "PERSON FIFTEEN", legacy_id: "138"}

The long way to sync the local logging output manually is to open the config/dev.exs file and modify this line:

config :logger, :console, format: "[$level] $message\n"

To match the logging level of staging:

config :logger, level: :info

Then, once the scripts have finished running, those changes need to be reverted.

It's not a terrible workflow, but passing the logger configuration directly into the execution script is far more efficient:

docker-compose exec web elixir -S mix run -e 'Logger.configure(level: :info); MyApp.Manipulations.TestDataManipulation.run()'

With this simple addition to the script, the logging detail levels match, and there's no need to temporarily alter the config files.
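Logger.configure/1 is the runtime counterpart of the `config :logger` line above, so the level set in config can be adjusted on the fly without touching any files; a small sketch:

```elixir
require Logger

# Raise the minimum level at runtime; debug messages are now filtered
Logger.configure(level: :info)

Logger.debug("QUERY OK ...")   # suppressed at :info
Logger.info("Inserted data")   # still printed
```

Because the change is made in the running VM rather than in config, it lasts only for that invocation and nothing needs to be reverted afterward.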


This post is part of an ongoing This Week I Learned series. I welcome any critique, feedback, or suggestions in the comments.
