Nice post on pure AWS dev, but I wonder about non-pure scenarios I have met in real life:
Executing an Apache NiFi flow that can only log to the cluster's log folders.
Executing EMR Spark jobs with thousands of lines of code (bad design, but sometimes hot potatoes land in our hands).
Executing stored procedures inside Snowflake or Redshift, which don't support external application logging.
Step Functions: how the different step types can access step-context key/values.
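For reference, Amazon States Language lets any state read the execution's context object through `$$.` paths in its `Parameters` block. A minimal sketch (the state name, function name, and payload field names are illustrative, not from the post):

```json
{
  "LogAwareTask": {
    "Type": "Task",
    "Resource": "arn:aws:states:::lambda:invoke",
    "Parameters": {
      "FunctionName": "my-logger",
      "Payload": {
        "execution_id.$": "$$.Execution.Id",
        "state_name.$": "$$.State.Name",
        "entered_time.$": "$$.State.EnteredTime",
        "input.$": "$"
      }
    },
    "End": true
  }
}
```

The `$$` prefix selects the context object instead of the state input, so every step can stamp its logs with the same execution id without threading it through the payload by hand.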
Lastly, but probably most important: how to organize and define a static list of log keys (app id, stack id, step id, error level, error message, etc.) no matter which stack the current step is executing on. And again the problem: several open-source tools deployed on EC2 clusters, or SaaS solutions like Snowflake, don't really export logs as detailed as what a stored procedure can access internally using specialized internal system SQL views.
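One way to enforce such a static key list is a tiny record builder shared by every step, whatever stack it runs on. A minimal sketch in Python (the key names `app_id`, `stack_id`, etc. are illustrative placeholders, not a standard):

```python
import json

# Agreed-upon static key set; every log record carries exactly these keys.
LOG_KEYS = ("app_id", "stack_id", "step_id", "error_level", "error_msg")

def make_log_record(**kwargs):
    """Build a log record with exactly the static keys.

    Missing keys default to None so every record has the same shape;
    unknown keys are rejected so the schema can't drift.
    """
    unknown = set(kwargs) - set(LOG_KEYS)
    if unknown:
        raise ValueError(f"unknown log keys: {sorted(unknown)}")
    return {key: kwargs.get(key) for key in LOG_KEYS}

record = make_log_record(app_id="etl-42", step_id="load", error_level="INFO")
print(json.dumps(record))
```

Because the schema is fixed at one choke point, a NiFi processor, a Spark job, and a wrapper around a Snowflake procedure call can all emit records that downstream log aggregation can parse uniformly.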