Just some thoughts on trends and claims I hear from people when I am doing interviews. Some I strongly disagree with, others ... I am curious what other people's thoughts and experiences with them are.
- You can't keep up with the young 'uns
- You have to change jobs every year, or you will never advance
- If you learn on your own, without 'real' experience, it is worthless
- To really become an expert, you must focus on one specific field and ignore the others
- Details are not important. The compiler will catch it
1. You can't keep up with the young 'uns
This was first related to me by an applicant in her late twenties, with only a little work experience but lots of degrees. In the interview, she said that her goal was to move into project management because, "as you get older, your thought processes slow and you cannot code as fast as those young kids."
The sad part was, all the senior applicants I interviewed that day shared exactly the same viewpoint. All wanted to move out of coding and into project management. They wanted to get into management not because they enjoyed it or because it had to be done, but because they felt it was their only escape from obsolescence.
This is silly. It ignores all the experience and knowledge that older, more experienced coders have gained, often painfully. It assumes that a person's mental faculties begin deteriorating at a young age (not true!) and that quantity of code equates to productivity. I certainly produced a greater volume of code in my early programming years, but experience has taught me to produce more effective code: it takes fewer lines, has far fewer defects, and is far easier to maintain later. If anything, the years have taught me that quality is far more critical than quantity. And woe to anyone who even thinks I'm close to being an old fart ...
2. You have to change jobs at least once a year, or you will never advance
This particular one really annoys me. It feels like the get-rich-quick days of the dotcom boom again. Demand for good technical people is high in China, especially so for those with FOSS skills. Because of this, opportunities abound and everyone is jumping from one job to another to get more money.
I suppose it does give a person more experience and probably a higher salary in the short run. But do you truly gain any skills when you jump around this quickly? By moving so much, you miss the big picture. Most major projects have lifecycles longer than a year.
An eye-opening experience for most people, I find, is still being around during the maintenance phase of a project. It is then that you realize that those fancy coding hacks you did, or that /* This will never happen */ section of code, will come back and bite you. You get to see the long-term price of taking shortcuts and skimping on quality. It gives you an appreciation for doing things _right_, especially if you have been through two project lifecycles: one where it went right and one where it went wrong.
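The kind of shortcut described above might look like this minimal Java sketch. ConfigReader and parsePort are invented names for illustration, not code from any real project:

```java
// Hypothetical sketch of a "/* This will never happen */" shortcut
// that bites during maintenance. All names here are illustrative.
public class ConfigReader {
    // The original author assumed ports are always well-formed, so the
    // "impossible" branch silently returns a bogus value instead of
    // failing loudly.
    public static int parsePort(String value) {
        try {
            return Integer.parseInt(value);
        } catch (NumberFormatException e) {
            /* This will never happen */
            return -1; // a later caller treats this as a real port
        }
    }

    public static void main(String[] args) {
        // Much later, a new deployment script emits "8080 " with
        // trailing whitespace -- the "impossible" branch now runs daily.
        System.out.println(parsePort("8080"));  // prints 8080
        System.out.println(parsePort("8080 ")); // prints -1, silently wrong
    }
}
```

Failing fast with an exception here would have surfaced the bad config immediately; the silent default is exactly the sort of hack that only hurts once the project reaches maintenance.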
3. If you learn on your own, without 'real' experience, it is worthless
One thing we always make a point to ask for is whether a person tinkers with technologies on their own, outside of what is required for work or class. More than one applicant has justified not doing so by saying, "if it was not done for work, it is not real world experience. It is worthless."
Is it? The best technical people I know are always voraciously learning. Some deliberately learn technologies far removed from their daily work, to stretch their brain to new ways of thinking. The diverse knowledge base they draw on seems to serve them well.
Yes, "real world" experience is valuable. You never truly appreciate the value of caches until you have been slashdotted, or when you max out all the CPUs on your server farm serving dynamic content. It is hard to really appreciate the value of distributed databases until you have so many transactions that the fastest hard disks in existence cannot write the transactions to disk as fast as they pour in.
However, it is not everything. There are many people with "real world" experience who still have not really learned the best practices and techniques. Unless you are really lucky, you cannot learn these without constantly looking for them. The chances of you working with the best in your field are extremely limited, unless you're lucky enough to be working for someplace like Google.
4. To really become an expert, you must focus on one specific field and ignore the others
Partially related to #3. I meet a lot of people with only Java experience. They know every single Java library, framework and technique there is. But nothing else. They have never touched another language, framework or system. They sneer at all others as inferior, but have never tried them.
Depth is valuable, but so is breadth. I have watched Java experts sink their systems as they tried to employ lower-performance Java web caches, when squid outperforms the fastest available Java option by an order of magnitude. I have seen people try to squeeze J2ME into embedded systems when an ncurses interface is the only thing that would have given a halfway decent response time.
As they say, when all you have is a hammer, everything looks like a nail. If you don't know anything else, you will always use the tools you know, even when they are not the most appropriate for the system. You miss out on the best-of-breed tools.
5. Details are not important. The compiler will catch it
Final peeve. Some applicants make careless and copious mistakes while answering their coding question. They explain this away as, "oh, the compiler will catch it when I run this on a computer."
The compiler is not your nanny. It is not your QA department. Even if you had a QA staff to back you up, the most qualified person to find errors in your code is you. There are insidious logic errors that no compiler will catch and even good code reviews and QA people may miss. The best and easiest time to catch errors is right after you have written the code, when all the variables, requirements and interfaces are fresh in your head. Spend the time to review your code and catch everything you can.
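A minimal sketch of the kind of insidious logic error no compiler flags. SilentBug and sumFirst are invented names for illustration:

```java
// Hypothetical example: this compiles cleanly, yet is silently wrong.
public class SilentBug {
    // Intended to sum the first n elements of a. The loop bound uses
    // <= instead of <, so it includes one element too many. javac
    // accepts this without complaint; only a careful human read,
    // done while the requirements are still fresh, catches it.
    public static int sumFirst(int[] a, int n) {
        int total = 0;
        for (int i = 0; i <= n; i++) { // bug: should be i < n
            total += a[i];
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {10, 20, 30, 40};
        // Caller asks for the first two elements: expects 30, gets 60.
        System.out.println(sumFirst(data, 2)); // prints 60
    }
}
```

No syntax error, no type error, often no runtime exception either: just a wrong answer, which is exactly why the compiler cannot stand in for your own review.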
Attention to detail is really important. If you lack it, it comes back to haunt you. If you have it, your users may not be able to explain why, but they will like your work a lot more. It is like Apple's products, which I rave about: their attention to detail and usability have given them an incredible competitive advantage. Even with commodity, sometimes inferior, hardware, they can still command per-unit profit margins that are the envy of other PC manufacturers.