I'm curious whether students in computer science programs and/or code schools get to work with existing codebases in any way, "legacy" or otherwise.
I definitely never did this in any type of education I was part of, but I'm curious if anyone is doing it these days.
I had a friend who went to a bootcamp (sorry I don't remember which one) where the final project was two weeks. Week one students paired up to create a project, and then in week two you switched projects with another team and had to add features to their code without help from them.
I believe the focus was two-way: in the first week, the focus was on making sure your code was clean and documented enough that the next team wouldn't have too hard a time, and in the second week, the focus was on working with a "legacy" app written by someone else and contributing to an unfamiliar codebase.
I've often wondered how you'd introduce this concept into a classroom setting. This is a really neat idea on how to do it. I think schools would benefit greatly from it.
I'd love to see schools or bootcamps introduce debugging of an intentionally hard-to-find issue as well. Not sure how to do that one well either.
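For what it's worth, one kind of bug that works well for that sort of exercise is one that only shows up on repeated use rather than in a single run. A minimal sketch in Python (purely illustrative, the function names are made up):

```python
# A deliberately subtle bug for a debugging exercise (illustrative only,
# not taken from any real curriculum): the mutable default argument is
# created once and shared across every call.
def add_tag(tag, tags=[]):
    tags.append(tag)
    return tags

first = add_tag("a")   # returns ["a"], looks fine
second = add_tag("b")  # returns ["a", "b"], state leaked between calls

# The fix: use None as a sentinel and build a fresh list per call.
def add_tag_fixed(tag, tags=None):
    if tags is None:
        tags = []
    tags.append(tag)
    return tags
```

The nice part as a teaching exercise is that each call looks correct in isolation; students have to explain why `first` and `second` end up being the very same list.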
That's an awesome idea for a final project!
It will definitely help to prepare you for the things you will see once you start working.
Yup! I was really impressed when I heard that.
If I ever go into the bootcamp business that's one thing I'd adopt :)
WoW! What school is that 'cause it just earned some major brownie points from me.
Wish I remembered... :(
I'm a CS senior at a large public university, and I think this is the biggest gap between a CS degree and getting a job as a software engineer. At no point in my education did we look at any existing codebases. The closest we got was using Django in web programming and Android in mobile (in that we had to interact with an existing codebase). When I got to my first internship I realized very quickly that my education had not prepared me for grounding myself in a codebase, and that was something I would have to learn on my own.
Yes. In the first module we swapped our project with someone else's. Digging around newbie vanilla JS that you didn't write was... fun!
In our last module we were assigned an open source repo and had a week to fix a bug. The project we contributed to was groomed to ensure we had something somewhat easy to tackle as beginners, and some groups got a codebase in languages we weren't learning. Some of the people maintaining these projects did not like this because (I assume) some students weren't able to fix the bug in a week and never returned to it after our allotted time. We were quite busy with other projects and stressed about graduating, but I can see the frustration the maintainers might have felt with people not keeping their word.
Overall I enjoyed the experience, although I wish I had spent more time on it.
I didn't get this experience in school. We had one assignment where we were given some code to QA. It wasn't a real-world app though, just pretty trivial functions with a test-suite we needed to modify. A few months into my first job out of college, I remember wishing I had more experience working with larger, preexisting codebases.
We've been thinking about adding an educational component to DevFlight in the future. Essentially, students/new devs who want to simulate a professional software development experience working with existing codebases can collaborate closely with participating OSS maintainers. Devs become more effective software engineers, and maintainers gain access to devs willing to help with their projects.
I wish I were able to get some actual experience in college too.
Also, your educational component for DevFlight sounds like a really neat idea (although I'm not a student anymore, so I probably wouldn't qualify).
Thanks! You never know, it could mature into an efficient, pleasant way to learn a new language/framework for seasoned devs :)
At the Minor Web Development at the AUAS we work with a couple of stakeholders (mostly agencies). We do occasionally fork the repository or get a clone of a codebase for students to work with. One of the things we do with almost every project is working with real data from the company/business, such as REST APIs; not necessarily the exact tech stack, but the same raw data they use.

Nope, never had to deal with legacy code at university. The closest I got was a networks class in which we had to implement communication through a serial port (yeah, a serial port) and tweak some of the insides of the driver the professor gave us, but still not really, which is sad considering that once you start working it's very unusual to get a clean slate.
You don't have to. At university, you're always free to do your own new shiny thing.
But you can: professors often have research projects that run for many semesters, and if you're interested in the topic, you get to work with the codebase that previous semesters' students left behind.
Here in Germany there are several ways for people who want to work in IT to come into contact with existing code:
There are two types of universities:
AFAIR both of these require you to spend at least half a year at a company working on "real" stuff and to write your thesis afterwards. I mostly worked with the latter, so I can't say for sure how things go at the former.
Last year in my Master's (Netherlands) I had the chance to follow a course called Software Architecture, in which the objective was to take an existing Open Source project as a group, analyze its architecture (and different aspects to it), and write a report on it. What was awesome though was that we were very encouraged to use the knowledge we obtained from analyzing the project to make contributions towards the project. There was also a dedicated slack channel for sharing statuses on all the PRs of all the groups, and the lecturers would highlight the merged ones during lectures and on Twitter. Was a really great way of getting to learn about the Open Source project and the community.
For those interested, all the reports of all the projects were gathered at the end and published as an ebook. The link of my year is here. Not gonna mention my particular project to avoid self-promoting, but it's definitely web dev related 😉.
Edit: During the course my group and I made a number of code-based PRs to our chosen project, most of which were merged.
When I've taken part of industry discussions with local schools about their co-op programmes the feedback on what they were missing was universally:
In our co-ops I've seen improvement on the first two: we don't have to train people on the basics of git anymore, and the basics of unit testing are covered these days.
It's too bad that this is still a common gap in most people's education.
I'm still technically a student, so yes, some do work with legacy codebases (through internships). I've heard from some colleagues who worked on an existing product on day 1 (shudders), yet some finished without writing any production code.
Back in my student days, we had the opportunity to access code for ongoing university projects (some of them, anyway) or to actively enroll in those projects in any capacity we could provide. A few years later, when I became a Teaching Fellow, the school set up a program where undergrad students had to "certificate" themselves in at least two "roles" (like Tester, Developer, Architect, and such) before graduating.
That being said, even in my country (where the educational system focuses a lot on what we call "professional practice") it is rare for CS or IT students to get their hands on legacy projects that they can use to learn from their peers.
I did a yearlong capstone project, but aside from that I never worked on the same codebase for more than 6-8 weeks as part of my coursework. And even the capstone project was small enough that 1-2 strong developers could’ve handled it themselves.
I remember being concerned about this as a student, and definitely wished I’d had some experience with it when I got my first job.
For our 2nd semester software development lab we had to implement missing features inside a larger Java application. This application had been written some years ago and was reused every year for the lab.
So in a sense you could call it "legacy" because we had to implement solutions in an existing system, but there was always someone that knew how it had to be done because the system was designed to be incomplete as a learning tool.
It definitely helped me later when I took my first student job, because I had at least some experience with finding my way around an existing architecture with a couple of thousand lines of code.
I'm in a school of engineering that is not specialized in computer science, so I only had a handful of software development projects. If I'm counting right, that's 2 group projects and 1 or 2 solo.
All of them consisted of building something from scratch. Never had to deal with legacy code, only a bit during an internship.
I actually find this question extremely relevant — most schools don't teach how to deal with existing codebases. They teach you how to build systems, but not how to deal with systems other people have built. It seems even companies don't let interns deal too much with legacy code because of the burden of "getting into it".
I think if it was the case, it would not only teach the student/intern new skills but also let them develop empathy as to why it is important to write good code in the first place.
When I went to school for my degree a couple years ago, I can't recall ever working with a legacy codebase as part of an assignment. We had a number of assignments for creating fairly large codebases from scratch as a team but never working with one that had already been created.
For reference, I went to school at the University of Arkansas, Fort Smith and majored in Information Technology - Programming, class of 2015.
I'm in a year-long SWE program ("boot camp"), and in the last module we have to create our own open source project and contribute to an existing open source project. That's the closest I'll get to legacy code in my program. Otherwise, I'm doing some freelance work on legacy WordPress websites.
I run Mayden Academy, a code school in Bath in the UK. Our students complete 7 large projects throughout the course, 2 of which are done with legacy applications. New features and bug fixes.
A few years ago we started two applications for internal use, to help run the school. These applications are now maintained and developed by our students, giving them the chance to work on real world applications with real users, and legacy code.
3 full years in college and the only time I ever worked on an existing codebase was during my internships.
We did one project in university where we analyzed the code/design patterns in JHotDraw (jhotdraw.org/) but it never involved making changes that we then submitted.
No. I am still a student and I can't believe that I am learning VB6 right now, and it is already "Legacy" :(
Never at university, only during internships and not with a 'legacy' codebase
For one of my courses at the University of Toronto we had to contribute a bug fix to Firefox. We were able to fix any existing bug (even if it was tiny), and if it got accepted we got a bonus of 1 or 2%.
No, never before in my college; in fact, I only just learned that there's such a thing as a "codebase". 😄
In my company, trainee employees use a legacy codebase for learning. I don't know how much it helps them.
One of my non-breadth electives had a project which required us to have a PR accepted. Other than that, I barely did any modern programming in class.
In my Operating Systems class, we worked with a precursor of the Linux kernel. Pretty sure that counts as legacy.