I'm currently transitioning into the tech industry after spending some time in the music industry. I graduated from the University of New Haven back in 2016 with a degree in music/sound engineering, and I've been playing guitar for about 16 years. A few weeks ago I started classes at Flatiron School with hopes of one day becoming a software developer.
Audio engineering is all about being able to follow the flow of the audio signal as it travels from point to point in your signal chain. The signal may start at an instrument, then a microphone transduces it so we can manipulate it by passing it through processors like compressors and EQs. From there, the signal may travel to a mixing console, where it can be shaped even further, then on to a loudspeaker, and ultimately into your ears.
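If I were to sketch that analogy in code (purely my own illustration, not anything from my coursework), a signal chain looks a lot like function composition: each processor takes a signal in and passes a transformed signal out. The `compressor` and `eq` below are toy stand-ins, not real DSP.

```javascript
// A signal is just an array of sample values; a processor is a
// function that takes a signal and returns a new, transformed signal.

// Toy "compressor": clamp any sample louder than the threshold.
const compressor = (signal) =>
  signal.map((s) => (Math.abs(s) > 0.8 ? Math.sign(s) * 0.8 : s));

// Toy "EQ": apply a fixed gain boost to every sample.
const eq = (signal) => signal.map((s) => s * 1.2);

// The patch bay: wiring processors together is just composition,
// instrument -> mic -> compressor -> EQ -> console.
const chain = (...processors) => (signal) =>
  processors.reduce((s, p) => p(s), signal);

const mix = chain(compressor, eq);
console.log(mix([0.5, 1.0, -0.9]));
```

The nice part of thinking about it this way is that rearranging the chain is just reordering the arguments to `chain`, the same way you'd repatch cables on a console.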
(this patch bay's not even that bad)
Nowadays people can accomplish the same tasks while sitting in their bedrooms with nothing but a laptop. But analog and digital techniques are equally legitimate approaches to audio engineering; it really all depends on the project.
I just think it's cool how similarly these industries have evolved, and making these kinds of comparisons makes learning new things much easier for me. That said, I'm only just scratching the surface of React, so I might have a completely different perspective on all this later on.
If anybody else is in, or has been through, a similar transition into software development, I'd love to hear about any comparisons you're making.