Search Results

Search found 6697 results on 268 pages for 'learning'.

  • I don't know C. And why should I learn it?

    - by Stephen
    My first programming language was PHP (gasp). After that I started working with JavaScript. I've recently done work in C#. I've never once looked at low- or mid-level languages like C. The general consensus in the programming-community-at-large is that "a programmer who hasn't learned something like C, frankly, just can't handle programming concepts like pointers, data types, passing values by reference, etc." I do not agree. I argue that: because high-level languages are easily accessible, more "non-programmers" dive in and make a mess; and that in order to really get anything done in a high-level language, one needs to understand many of the same concepts that proponents of "learn low-level first" evangelize about. Some people need to know C. Those people have jobs that require them to write low- to mid-level code. I'm sure C is awesome. I'm sure there are a few bad programmers who know C. My question is, why the bias? As a good, honest, hungry programmer, if I had to learn C (for some unforeseen reason), I would learn C. Considering the multitude of languages out there, shouldn't good programmers focus on learning what advances us? Shouldn't we learn what interests us? Should we not utilize our finite time moving forward? Why do some programmers disagree with this? I believe that striving for excellence in what you do is the fundamental trait separating good programmers from bad ones. Does anyone have any real-world examples of how something written in a high-level language (say Java, Pascal, PHP, or JavaScript) truly benefited from prior knowledge of C? Examples would be most appreciated.
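
    A minimal sketch of the "passing values by reference" idea mentioned above, with hypothetical function names, since this is the kind of distinction the "learn low-level first" camp usually has in mind:

        // Passing by value copies the argument; passing a pointer lets the
        // callee modify the caller's variable.
        #include <cstdio>

        void increment_copy(int n)  { n += 1; }   // changes a local copy only
        void increment_ptr(int* n)  { *n += 1; }  // changes the caller's variable

        int main() {
            int x = 41;
            increment_copy(x);
            std::printf("after increment_copy: %d\n", x);  // still 41
            increment_ptr(&x);
            std::printf("after increment_ptr:  %d\n", x);  // now 42
            return 0;
        }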

    Read the article

  • Entry-Level / Jr. PHP Programmer - What do I learn next?

    - by dtj
    I got very interested in programming toward the end of college. I took a few classes, but learned most everything on my own via books and such. It's mostly been PHP and MySQL. Right out of school, I got a job at a company (web media) for 2 years and ended up learning a lot and programming some things for them. I am no longer at that company, but I am looking for my next steps as a programmer. I really enjoy web development, and PHP and MySQL seem to be my thing. Basically, I know how to do CRUD operations, I am mediocre at OOP and still have more to learn, I know HTML and CSS quite well, I know my way around a Unix terminal and can access MySQL through it and set up cron jobs and such, and I know some basic JavaScript. What's a good next step? I don't know anything about third-party services, PDO, APIs (Twitter, Facebook, etc.), Drupal / Joomla, unit testing, e-commerce, PECL, PEAR... in other words, A LOT. I get easily overwhelmed by the amount of stuff there is to learn, so I'm sort of trying to find a path. Right now, I'm digging into OOP more, as that seems like a good conceptual first step. Any suggestions?

    Read the article

  • Experience vs. versatility

    - by Florin Bombeanu
    Let's say a .NET programmer works at a company which provides software on demand, not as a product. The programmer works in WPF for a period of time and invests a lot of effort in it. He/she gets very good at WPF, Windows Forms and desktop development in general. But the company now has to provide a web application, so the developer has to learn MVC or Web Forms. He/she is not experienced in web development, so he/she starts investing time in this new technology and in time gets good at it. But then the company has to provide a SharePoint solution, and so on. What is more important: being very, very good at a certain technology, or being as versatile as possible, knowing less about each technology but covering a greater area of expertise? Should the programmer keep studying and working in WPF until he/she reaches a guru level, or is it a good thing that they had to learn other technologies as well? I agree with those of you who will say that when learning different technologies you also learn things which are useful no matter the technology you're programming in. But eventually, when the programmer wants to change jobs, will it matter more that he/she knows some WPF, MVC or SharePoint than that he/she is insanely good at one of them? I would think the second one is more important, since most companies are looking for a developer for a certain technology. I don't think there are many companies looking for technical know-it-alls. What do you think?

    Read the article

  • Can One Get a Solid Programming Foundation Without Going To College/University?

    - by Daniel
    First, I have already searched the site and read all the previous "self-taught vs. college" topics. The majority of the answers defended going to college as the best choice, for two main reasons: (1) going to college gives you the paper, which is essential to landing jobs, especially in tough economic times; and (2) going to college gives you a solid programming base, teaching you the principles that will be essential regardless of the language/path you take afterwards. Here comes my question: I am not worried about reason 1 at all, because I already have my own company (I build websites / do affiliate marketing) and a stable financial situation, so I am pretty sure I won't need to look around for a job. I am worried about reason 2, though. That is, I want to make sure I'll have as solid a programming foundation as anyone else out there, and I am wondering whether that is possible with self-learning. Suppose I take my time to study the very basics, like discrete maths, algorithm design, programming logic, computer architecture, Assembly, C programming, databases and data structures - mostly using books, online resources and lots of coding. Say I spend 1-2 years covering those basics. Do you think my foundation would be solid, or would it still be lacking in comparison to someone who went to college?

    Read the article

  • How to practice object-oriented programming?

    - by user1620696
    I've always programmed in procedural languages and currently I'm moving towards object orientation. The main problem I've faced is that I can't see a way to practice object orientation effectively. I'll explain my point. When I learned PHP and C it was pretty easy to practice: it was just a matter of choosing something and thinking about an algorithm for that thing. In PHP, for example, it was a matter of sitting down and thinking: "well, just to practice, let me build one application with an administration area where people can add products". This was pretty easy: it was a matter of thinking of an algorithm to register a user, log the user in, and add the products. Combining these with PHP features, it was a good way to practice. Now, in object orientation we have lots of additional things. It's not just a matter of thinking about an algorithm, but of analysing requirements more deeply, writing use cases, figuring out class diagrams, properties and methods, setting up dependency injection and lots of other things. The main point is that in the way I've been learning object orientation it seems that a good design is crucial, while in procedural languages one vague idea was enough. I'm not saying that in procedural languages we can write good software without design, just that for the sake of practicing it is feasible, while in object orientation it seems not feasible to go without a good design, even for practicing. This seems to be a problem, because if each time I'm going to practice I need to figure out tons of requirements, use cases and so on, it doesn't seem like a good way to get better at object orientation, because it requires me to have a whole idea for an app every time I'm going to practice. Because of that, what's a good way to practice object orientation?
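
    A minimal, practice-sized sketch of the "admin area adds products" idea above, with hypothetical class names, showing that a small design (one entity, one abstract storage role, one class that uses it) can be enough to practice object orientation without a full requirements analysis:

        #include <iostream>
        #include <string>
        #include <vector>

        // One small entity, one abstraction, one collaborator: enough to practice.
        struct Product {
            std::string name;
            double price;
        };

        class ProductRepository {                 // abstract "storage" role
        public:
            virtual ~ProductRepository() = default;
            virtual void add(const Product& p) = 0;
            virtual std::vector<Product> all() const = 0;
        };

        class InMemoryRepository : public ProductRepository {
            std::vector<Product> items_;
        public:
            void add(const Product& p) override { items_.push_back(p); }
            std::vector<Product> all() const override { return items_; }
        };

        class AdminArea {                         // depends on the abstraction
            ProductRepository& repo_;
        public:
            explicit AdminArea(ProductRepository& repo) : repo_(repo) {}
            void addProduct(const std::string& name, double price) {
                repo_.add(Product{name, price});
            }
        };

        int main() {
            InMemoryRepository repo;
            AdminArea admin(repo);
            admin.addProduct("Keyboard", 25.0);
            for (const auto& p : repo.all())
                std::cout << p.name << " - " << p.price << "\n";
        }

    Swapping InMemoryRepository for a database-backed implementation later is the kind of small, incremental exercise that keeps practice manageable.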

    Read the article

  • Tried teaching myself to program before college, accidentally overwhelmed myself, tips?

    - by Gunnar Keith
    I'm sixteen, I'm really interested in programming, and I'm currently taking IT classes during my mornings in high school. Last year, I tried teaching myself to code. It was quite exciting, but all I did was watch TheNewBoston's YouTube videos on Python. After his tutorials, I just did research, made some CMD programs, and that's it. After that, I got cocky and got my feet wet in many other languages: Java, C++, C#, Perl, Ruby... and it overwhelmed me, which made it less fun to code. I want to go to college for a 2-year programming course, and I want to make writing code my profession. But how do you recommend I attack learning it all again? Start with Python? Don't even try? Also, I'm not 100% confident in math, but I'm good friends with a lot of programmers who say they suck at math but manage to code just fine. I'm not looking for negative feedback; I just want a proper head start on things before college.

    Read the article

  • Expanding existing DVCS Wiki

    - by A Lion
    A portion of my job is to maintain technical documentation for a rapidly expanding manufacturing company. Because it is only a portion of my job and the company's product line is expanding so quickly, I can't stay on top of the documentation. As a result, I've been yearning for an information management system with a handful of specific features. I've found many products that have a subset, but none that have all the features I'm looking for. I'm at the point of picking an existing product and expanding it to cover my desired feature set; however, this will be a pet project and I will be learning the underlying language as I go. So, the main question is: which existing product would be the easiest to expand to cover the full feature set and has a relatively easy-to-learn language? Alternatively, have I missed another existing program that covers the feature set, or that should be in my list of "close, but not quite there"?

    Feature set:
    - web interface based on a distributed version control system (e.g., git)
    - easy to edit by logged-in novices (e.g., wiki, MultiMarkdown)
    - outputs in more traditional formats (e.g., doc, odt, pdf)
    - edits held in a queue until an editor/engineer/manager approves them (e.g., MS Word-style review) [this is the really big elephant in the list - suggestions on where to start appreciated]
    - edits held in a queue specifically for engineer approval [extra limb of the elephant in the list]
    - well supported in the open source community

    Closest, but not quite there:
    - ikiwiki - http://ikiwiki.info (perl): lots of awesome functionality and extensions, including easy editing and a DVCS backend; lacks a review/forward-for-review queue; appears to be well supported within the OSS community
    - gitit - http://gitit.net/ (haskell): easy to edit and based on a DVCS; lots of output in traditional formats; a great web-based GUI diff interface; lacks a review/forward-for-review queue; appears to be primarily maintained by one individual

    Read the article

  • What's the best way to learn/increase problem-solving skills?

    - by tucaz
    Hi all! I'm not sure this is the right place to ask this question, nor whether this is the right way to ask it, but I hope you'll help me if it is not. I've worked as a programmer since I was 15 (will be 24 next week), so learning programming logic was somewhat natural over the course of my career, and I think it helped me develop pretty good problem-solving skills. One thing none of us (programmers) can deny is that programming logic helps us in a lot of fields outside computer programming. So I'd say it is a very valuable thing to learn. My girlfriend is not a programmer and graduated from college in an unrelated field (Foreign Relations) because she didn't know what to study back then. As the years passed she discovered that she liked Logistics and started to work in it almost two years ago. However, since she does not have a technical background (not even basic Math), she is really having a hard time with it. She is already trying to catch up with Math, but even simple questions/brain-teasers are hard for her. For example, trying to find the missing numbers in this sequence: 0, 1, 1, 2, 3, 5, 8, _, _, 34 and so on. We know that this is Fibonacci, but if we didn't, we would probably still be able to get to the correct answer just by "guessing" (using our acquired problem-solving skills). I'm not sure if problem-solving skills or logic is the correct name for it, but this is what I mean: quickly solving problems and brain-teasers, finding patterns, having a "sharp" mind. So, the question is: what is the best way for someone to learn this kind of skill without being a programmer (or studying algorithms and such)? If you say it is a book, could you please recommend one? Thanks a lot!
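
    For the sequence example above, the missing numbers fall out of the "each term is the sum of the previous two" pattern; a short sketch that fills in the blanks:

        #include <iostream>
        #include <vector>

        // Rebuild 0, 1, 1, 2, 3, 5, 8, _, _, 34 by repeatedly adding the
        // last two terms; the blanks turn out to be 13 and 21.
        int main() {
            std::vector<long long> seq = {0, 1};
            while (seq.back() < 34)
                seq.push_back(seq[seq.size() - 1] + seq[seq.size() - 2]);
            for (long long n : seq)
                std::cout << n << ' ';   // 0 1 1 2 3 5 8 13 21 34
            std::cout << '\n';
        }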

    Read the article

  • I know how to program, and how to learn how to program, but how/where do you learn how to make systems properly?

    - by Ryan
    There are many things that need to be considered when making a system. Let's take for example a web-based system where users log in and interact with each other, creating and editing content. Now I have to think about security, validation (I don't even think I am 100% sure what that entails), "making sure users don't step on each other's feet" (is there a term for this?), preventing errors in many cases, making sure database data doesn't become problematic through unexpected... situations? All these things I don't know how or where to learn - is there a book on this kind of stuff? Like I said, there seems to be a huge difference between writing code and actually writing the right code, know what I mean? I feel like my current programming work lacks much of what I have described, and I can see the problems it causes later, when they are much harder to solve because data exists and people are using it. So can anyone point me to books or resources, or the proper subset of programming(?), for this type of learning? PS: feel free to correct my tags, I don't know what I am talking about. Edit: I assume some of the examples I wrote apply to other types of systems too; I just don't know any other good examples because I've been mostly involved in web work.

    Read the article

  • Where to go after having a good grasp of a language?

    - by Alex M.
    I have been programming as a hobby for the past few years now (most of high school and 1 year of CS in college), and although I've come to the conclusion that a career in CS isn't for me (I switched over to math, which pairs what I love about programming with my interest in the physical sciences), I miss writing code. Recently I've had an interest in low-level programming: understanding how compilers work, learning some basics of assembly language, and trying to get out of my comfort zone. The problem is that since I've been out of the CS program, I'm not faced with many opportunities to write code. I do intend to take a few CS classes in college (a lot of CS coursework is open to math majors), but that won't come until next year. So I ask: what are the steps to take in order to keep improving as a programmer once you're past the basics? How do you find projects to keep you going? Besides my newly discovered interest in assembly language, I've been writing code in C and have been interested in FOSS. Thanks!

    Read the article

  • Is dependency injection by hand a better alternative to composition and polymorphism?

    - by Drake Clarris
    First, I'm an entry-level programmer; in fact, I'm finishing an A.S. degree with a final capstone project over the summer. In my new job, when there isn't some project for me to do (they're waiting to fill the team with more new hires), I've been given books to read and learn from while I wait - some textbooks, others not so much (like Code Complete). After going through these books, I've turned to the internet to learn as much as possible, and started learning about SOLID and DI (we talked a bit about Liskov's substitution principle, but not much about the other SOLID ideas). So, to learn it better by doing, I sat down and began writing some code to use DI by hand (there are no DI frameworks on the development computers). Thing is, as I do it, I notice it feels familiar... it seems very much like work I've done in the past using composition of abstract classes with polymorphism. Am I missing a bigger picture here? Is there something about DI (at least by hand) that goes beyond that? I understand that some DI frameworks keep their configuration outside of code, which can bring great benefits as far as changing things without having to recompile, but when doing it by hand, I'm not sure it's any different than stated above... Some insight into this would be very helpful!
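
    A minimal "DI by hand" sketch with hypothetical class names, which also illustrates why it feels like composition plus polymorphism: the collaborator sits behind an abstract class and is handed in from outside rather than being constructed internally:

        #include <iostream>
        #include <memory>
        #include <string>

        class Logger {                            // the abstraction (polymorphism)
        public:
            virtual ~Logger() = default;
            virtual void log(const std::string& msg) = 0;
        };

        class ConsoleLogger : public Logger {
        public:
            void log(const std::string& msg) override { std::cout << msg << '\n'; }
        };

        class OrderService {                      // the dependent class (composition)
            std::shared_ptr<Logger> logger_;      // injected, not created internally
        public:
            explicit OrderService(std::shared_ptr<Logger> logger)
                : logger_(std::move(logger)) {}
            void placeOrder() { logger_->log("order placed"); }
        };

        int main() {
            // The "injection" is just passing the collaborator in from outside;
            // a unit test could hand OrderService a fake Logger instead.
            OrderService service(std::make_shared<ConsoleLogger>());
            service.placeOrder();
        }

    The mechanics really are composition and polymorphism; arguably what the DI habit (and the frameworks) adds is the discipline of always wiring dependencies from the outside, plus the external configuration mentioned in the question.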

    Read the article

  • C++ Iterator lifetime and detecting invalidation

    - by DK.
    Based on what's considered idiomatic in C++11: should an iterator into a custom container survive the container itself being destroyed? should it be possible to detect when an iterator becomes invalidated? are the above conditional on "debug builds" in practice? Details: I've recently been brushing up on my C++ and learning my way around C++11. As part of that, I've been writing an idiomatic wrapper around the uriparser library. Part of this is wrapping the linked list representation of parsed path components. I'm looking for advice on what's idiomatic for containers. One thing that worries me, coming most recently from garbage-collected languages, is ensuring that random objects don't just go disappearing on users if they make a mistake regarding lifetimes. To account for this, both the PathList container and its iterators keep a shared_ptr to the actual internal state object. This ensures that as long as anything pointing into that data exists, so does the data. However, looking at the STL (and lots of searching), it doesn't look like C++ containers guarantee this. I have this horrible suspicion that the expectation is to just let containers be destroyed, invalidating any iterators along with it. std::vector certainly seems to let iterators get invalidated and still (incorrectly) function. What I want to know is: what is expected from "good"/idiomatic C++11 code? Given the shiny new smart pointers, it seems kind of strange that STL allows you to easily blow your legs off by accidentally leaking an iterator. Is using shared_ptr to the backing data an unnecessary inefficiency, a good idea for debugging or something expected that STL just doesn't do? (I'm hoping that grounding this to "idiomatic C++11" avoids charges of subjectivity...)
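
    A minimal sketch (hypothetical names, loosely modelled on the PathList described above) of the shared_ptr-backed design in question: both the container and its iterators share ownership of the internal state, so an iterator keeps the data alive even after the container itself is destroyed:

        #include <cstddef>
        #include <iostream>
        #include <memory>
        #include <string>
        #include <vector>

        class PathList {
            struct State { std::vector<std::string> segments; };
            std::shared_ptr<State> state_;
        public:
            explicit PathList(std::vector<std::string> segs)
                : state_(std::make_shared<State>(State{std::move(segs)})) {}

            class iterator {
                std::shared_ptr<State> state_;    // shared ownership, not a raw pointer
                std::size_t index_;
            public:
                iterator(std::shared_ptr<State> s, std::size_t i)
                    : state_(std::move(s)), index_(i) {}
                const std::string& operator*() const { return state_->segments[index_]; }
                iterator& operator++() { ++index_; return *this; }
                bool operator!=(const iterator& o) const { return index_ != o.index_; }
            };

            iterator begin() const { return iterator(state_, 0); }
            iterator end() const { return iterator(state_, state_->segments.size()); }
        };

        PathList::iterator get_first() {
            PathList p({"usr", "local", "bin"});
            return p.begin();                     // the PathList dies here...
        }

        int main() {
            PathList::iterator it = get_first();
            std::cout << *it << '\n';             // ...but the shared state lives on
        }

    Standard containers do not pay this cost; whether the extra shared_ptr copy per iterator is worth it is exactly the trade-off the question is asking about.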

    Read the article

  • What should I "forget" when going to Javascript?

    - by ElGringoGrande
    I went from C=64 BASIC and assembler to FORTRAN and C, then to C++ and Java. Professionally I started in Visual Basic for Applications, then moved to Visual Basic 4, 5 and 6, and after that VB.NET and C#, with some Java here and there. I have played with Ruby and Python and found both fun. During each step I never felt like I had to forget what I had learned before. I always felt like I was just learning better and/or slightly different ways of doing things, but the difference was not major. The difference was like the difference between American, Australian and British English. (Maybe assembler was Latin and FORTRAN was Spanish.) But now I am using JavaScript to do real, actual work. (Before, I used it as a "scripting" language, pure and simple.) And I just feel like I have to forget some things to become proficient in it. It feels like some old Egyptian language. What should I forget? Is it just that code organization is different (no real classes, so no one-class-per-file convention)? Or is it something more basic?

    Read the article

  • Function calls to the Windows API

    - by Apeee
    I am a beginner learning C, and I find it hard to grasp the whole programming concept, so hopefully this will help clear up some things along the way. When programming on Windows, which is my aim for the time being, it is really hard for me to understand how Windows communicates with the programs that run on it. A question I have been pondering is: when you call a function that lives at some other location on disk or in memory (not a function you wrote yourself and included in the compilation), especially a Windows API function, how does the compiler know where that function is located, so that the program can call it when it runs? For example, take a very simple program that displays a window reading "hello world". You would have to call Windows API functions to achieve such features as creating the window, setting its size, colors and so on... So basically what I am struggling to grasp is how the programs I write communicate with the platform/framework they run on (generally Windows, for the Windows API). Apart from clarification on the question above, I would love a resource that explains this concept further. Thanks for your time!
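
    A minimal Win32 sketch of such a call. Roughly speaking, the compiler only needs the declaration pulled in from <windows.h>; the linker then matches the call against the user32 import library, and the Windows loader connects it to the code that actually lives in user32.dll when the program starts:

        // Build as a C++ program and link against user32.lib.
        #include <windows.h>

        int main() {
            // MessageBoxA is declared in <windows.h> but implemented in user32.dll;
            // the compiler never needs to know its final address.
            MessageBoxA(nullptr, "Hello, world!", "Greeting", MB_OK);
            return 0;
        }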

    Read the article

  • Should I avoid or embrace asking questions of other developers on the job?

    - by T.K.
    As a CS undergraduate, the people around me are either learning or are paid to teach me, but as a software developer, the people around me have tasks of their own. They aren't paid to teach me, and conversely, I am paid to contribute. When I first started working as a software developer co-op, I was introduced to a huge code base written in a language I had never used before. I had plenty of questions, but didn't want to bother my co-workers with all of them - it wasted their time and hurt my pride. Instead, I spent a lot of time bouncing between IDE and browser, trying to make sense of what had already been written and differentiate between expected behavior and symptoms of bugs. I'd ask my co-workers when I felt that the root of my lack of understanding was an in-house concept that I wouldn't find on the internet, but aside from that, I tried to confine my questions to lunch hours. Naturally, there were occasions where I wasted time trying to understand code that had, at its heart, an in-house concept, but overall, I felt I was productive enough during my first semester, contributing about as much as one could expect and gaining a pretty decent understanding of large parts of the product. I was wondering what senior developers think about that mindset. Should new developers ask more questions to get up to speed faster, or should they do their own research? I see benefits to both mindsets, and anticipate a large variety of responses, but I figure new developers might appreciate your answers without thinking to ask this question.

    Read the article

  • What to learn after standard C++?

    - by Luca Cerone
    I switched to C++ a few months ago, learning its syntax, the main features of the STL, and what you can usually find in a "learn C++" manual. Now I would like to go further. What would be your recommendations? I would like to know what to learn next (not only about the language, but also debugging, frameworks, etc.). I know the answer probably depends on the specific needs of each user, so here is a list of mine:
    - cross-platform development
    - developing GUIs for my programs
    - developing extendible software, allowing the use of plugins
    - use of scientific libraries
    - interacting with databases (mainly MySQL)
    - server/client functionality (I'd like users of my programs to interact over the internet; as you might have guessed, I am not a programmer by training, so I might have used the wrong terms, and if so I apologize)
    Of course I know it takes time, but I would like to have a good list of references and resources to start with (both books and websites are OK). Thanks a lot for your help!

    Read the article

  • Can I achieve my dreams without a degree? [closed]

    - by Dhananjay
    It's really giving me a lot of stress, as my parents keep telling me to join college, but I don't want to. I know I can learn programming by self-study, but they keep saying to join college, otherwise no one will give me a job. I always think positively, but sometimes I also start thinking like them (what if my life ends up spoiled because I do not go to college?). There is so much material on the internet. I can learn C++, Objective-C, Java, AI, HTML and PHP all through the internet (at least I think I can learn it all by self-study, and I can easily put in 10 hours/day of studying), and I will keep practicing and become a good programmer in 2 years, and then try to get some job for experience. So there is no need to waste 4 years just studying things which I can learn in 2 years, and no need to waste money on college, because they teach physics, chemistry and so on in the first 2 years, and I only want to study computer science. But now again I am thinking negatively: what will happen if I do not get a degree, and what will I do after learning programming if I don't get a job? Please suggest what I should do. Should I join college, or self-study? Can I achieve my dreams without a degree if I study hard and learn many things? I have full confidence that I can teach myself better than they will teach me in college. I will open my own app company, and more. But maybe I am overconfident, because I don't know what happens in the real world: how people treat someone without a degree, etc. Have any of you gone through this situation? What did you do?

    Read the article

  • Diving into a computer science career [closed]

    - by Willis
    Well, first I would like to say thank you for taking the time to read my question. I'll give you some background. I graduated two years ago from a local UC in my state with a degree in cognitive psychology and worked in a neuroscience lab. During this time I was exposed to some light Matlab programming and other programming tidbits, and before this I had some basic understanding of programming: my father worked in IT for a company when I was younger, so I picked up his books and learned things along the way growing up. Naturally I'm an inquisitive person, constantly learning, I love challenges, and I have had exposure to some languages. At this point I want to fully pursue it as a career, and I have always had this in the back of my head. Where do I start? I'm 25 and feel like I still have time to make a switch. I've immersed myself in the terminal/command prompt to start, but which language do I focus on? I've read the A+ book and am planning to take the exam, then the networking exam, but I want to deal with more programming, development, and troubleshooting. I understand I should get involved in open source, but where? I took the next step and got a small IT assistant job, but it doesn't really deal with programming or development, just troubleshooting and small network issues. Thank you!

    Read the article

  • English major new to programming. What language should I learn first? [closed]

    - by PJKaka
    After working extensively at an internet startup in a marketing position, I've decided to wade into the entrepreneurship pool with a startup of my own. The only problem: I don't have any particular technical skills to speak of. Although I can find a technical co-founder, I'd rather not be the stereotypical 'business guy' drumming his fingers on the desk and asking 'how much longer?' as my technical co-founder codes away. I would like to understand code and what's happening in the back end, even if I don't end up being anything more than a 'passable' programmer. With this in mind, which language should I try to learn first? For the record, I'm quite proficient with HTML and CSS, and know a bit of JavaScript. I have some familiarity with PHP because I've toyed around with WordPress a lot, but my knowledge is limited at best. My math skills are quite strong; I took some advanced calculus courses in college since I've always enjoyed the subject. While my goal is to learn web development, I wouldn't mind learning some hardcore object-oriented programming skills in C or Java as well.

    Read the article

  • What should a programmer's yearly routine be to maximize their technical skills?

    - by sguptaet
    Two years ago I made a big career change into programming. I learned various technologies on my own, without any prior experience. I really love it and feel lucky to have so many resources around to help us learn: books, courses, open source, and so on. There are so many avenues. I'm wondering what a good routine would be to follow to maximize my software development skills. I don't believe just building software is the way, because that leaves no time for learning new concepts or technologies. I'm looking for an answer like this: take a new-concept sabbatical/workshop 2 weeks per year; read 1 theoretical and 1 practical programming book per year; learn 1 additional language every 2 years; take a 1-week vacation every 6 months; etc. I realize that the above might sound naive and unrealistic, as there are so many factors, but I'd like to know the "recipe" you think is best, one that will serve as a guide for people.

    Read the article

  • Will I be able to get programming interviews at good software companies with a non-CS degree?

    - by friend
    I'll be graduating in a year, but with a degree in Economics. I'm pretty much done with my Economics coursework, and by the time next year comes around I will have devoted 1.5 years to learning CS. I will have almost finished the requirements to graduate with a degree in CS, but unfortunately my school requires a science series that would add another 6-9 months of study if I were to try to get the degree (not to mention a max unit cap). I have taken or will have taken:
    - Object-Oriented Programming
    - Discrete Math
    - Data Structures
    - Calculus through multivariable (doubt this matters at all)
    - Linear Algebra (same)
    - Computer Organization
    - Operating Systems
    - Computational Statistics (many data mining projects in R)
    - Parallel Programming
    - Programming Languages
    - Databases
    - Algorithms
    - Compilers
    - Artificial Intelligence
    I've done well in the ones I've taken, and I hope to do well in the rest, but will that matter if I can't tell HR people that I have a CS degree? I'd be happy to get an internship at first too, so should I just apply as an intern rather than looking for full-time work, and then try to parlay that into something? Side note, if you have time: is a computer networks or theory of computation class important? Would it be worth taking either of those in lieu of a class on my list? Edit: I know this isn't AskReddit or College Confidential; I know there will be some outrage at posting a question like this. I'm merely looking for insight into a situation that I've been struggling with, and I think this is the absolute best place to find an answer. Thanks.

    Read the article

  • Is Reading the Spec Enough?

    - by jozefg
    This question is centered around Scheme but could really be applied to any Lisp, or to programming languages in general. Background: I recently picked up Scheme again, having toyed with it once or twice before. In order to solidify my understanding of the language, I found the Revised^5 Report on the Algorithmic Language Scheme and have been reading through it, along with the extensions/implementation notes listed for my compiler/interpreter (Chicken Scheme). Additionally, in order to see this applied, I have been actively seeking out Scheme code in open source projects and trying to read and understand it. This has been sufficient so far for understanding the syntax of Scheme, and I've completed almost all of the Ninety-nine Scheme problems (see here) as well as a decent number of Project Euler problems. Question: while this hasn't been an issue so far and my solutions closely match those provided, am I missing out on a great part of Scheme? Or, to phrase my question more generally, is reading the specification of a language, along with well-written code in that language, sufficient to learn from? Or are other resources (books, lectures, videos, blogs, etc.) necessary for the learning process as well?

    Read the article

  • How do you choose your first job as a programmer? [on hold]

    - by sliter
    In brief: I am a recently graduated CS student. I am looking for a job these days, but I have no idea what kind of software development job I'd like (embedded systems, web development, or something else...), and I am looking for your advice. Here is a little more: while I was a student, I had a one-year internship as a systems engineer at a semiconductor company, where I wrote Linux drivers, tuned system performance, etc. I was happy about this experience, as it allowed me to deepen my understanding of the operating system and various low-level things, and I thought, "Hm, I will continue in the embedded area after I graduate." At the end of my studies, I am doing another internship, this time in web development, both front-end and back-end, and I also really enjoy the process of learning new things and making them work (Backbone, Node, socketio, etc.). Now, when I am looking for a software development position, I do not know what to apply for! All I know is that I want a job which allows me to keep up with the trends instead of repeating myself. But beyond this, I have no idea what specific type of job I want to do. Go back to embedded systems? Continue with web development? Change to another promising area (data mining)? All these development positions look much the same to me, but I think this is not good and I need some criteria for choosing. So I am looking for advice, and I would really appreciate it if you could share your experience.

    Read the article
