Search Results

Search found 15103 results on 605 pages for 'programmers notepad'.

  • Can SAML use Salesforce login information to log into another system inside a view?

    - by steve
    I want my salespeople, who use Salesforce every day, to be able to view orders in an ecommerce system through a dashboard view in Salesforce. The ecommerce system is built and sitting on my web server, but the sales reps don't like to log into too many things in one day, so they are not using what I built for them. I read recently that Salesforce can use SAML, but it was unclear what you can actually do with it. What I'd like is to make a new dashboard view that opens the ecommerce system inside of Salesforce. The ecommerce system has its own login, but if it is embedded inside of Salesforce, would SAML automatically log the user into it?

  • Are there any "best practices" on cross-device development?

    - by vstrien
    Developing for smartphones the way the industry is currently doing it is relatively new. Of course, there has been enterprise-level mobile development for several decades. The platforms have changed, however. Think of:
    - the move from stylus input to touch input (different screen resolutions, different control layouts, etc.)
    - new ways of handling multi-tasking on mobile platforms (e.g. WP7's "tombstoning")
    The way these platforms work isn't totally new (the iPhone has been around for quite a while now, for example), but at the moment, developing a functionally equal application for both desktop and smartphone comes down to developing two applications from the ground up. Especially with the birth of Windows Phone, with the .NET platform on board and Silverlight as its UI language, it's becoming appealing to promote the reuse of (parts of) the UI. Still, it's fairly obvious that the needs of an application on a smartphone (or tablet) are very different from the needs of a desktop application. An (almost) one-to-one conversion will therefore be impossible. My question: are there "best practices", pitfalls, etc. documented about developing "cross-device" applications (for example, developing an app for both the desktop and the smartphone/tablet)? I've been looking at weblogs, scientific papers and more for a week or so, but what I've found so far is only about "migratory interfaces".

  • How do you stay productive when dealing with extremely badly written code?

    - by gaearon
    I don't have much experience working in the software industry, being self-taught and having participated in open source before deciding to take a job. Now that I work for money, I also have to deal with some unpleasant stuff, which is normal of course. Recently I was assigned to add logging to a large SharePoint project written by a programmer who was obviously learning to code on the job. After 2 years of collaboration, the client switched to our company, but the damage was done, and now somehow I need to maintain this code. Not that the code is too hard to read. Despite its problems (each project has one class with several copy-pasted methods, enormously nested ifs, Systems Hungarian notation, undisposed connections), it's still readable. However, I found myself absolutely unproductive despite working on something as simple as adding logging. Basically, I just need to go through the code step by step and add some trace calls. However, the idiocy of the code is so annoying that I get tired within 10 minutes of starting. In the beginning, I would add using constructs, reduce nesting by reversing ifs, and rename variables to readable names, but the project is large, and eventually I gave up. I know this is not the task I should be doing, but at least reducing the mess gave me some kind of psychological reward, so I could keep going. Now the trick has stopped working, and I still have 60% of my work to do. I started having headaches after work, and I no longer get the feeling of satisfaction I used to get, which would usually allow me to code for 10 hours straight and still feel fresh. This is not just one big rant; I really do have an actual question: Is there a way to stay productive and not fight the windmills? Is there some kind of psychological trick to stay focused on the task, instead of thinking "How stupid is that?" each time I see another clever trick by the previous programmer? The problem with adding logging is that I actually have to understand what the code does, and doing so hurts my brain in an unpleasant fashion.

  • Simulating simultaneous entities

    - by Steven Jeuris
    Consider the need to simulate a set of entities in an accurate way. All entities exist on an artificial timeline. Within 'steps' of this timeline, all entities can perform certain operations. It is imperative that timed events are handled accurately, and not merely in processing order. So simple threading isn't a proper simulation, nor is procedurally walking across all entities. Processing may be slow; accuracy is key here. I have some ideas on how to implement this myself, but most likely something like this has been done before. Are there any frameworks available for these purposes? Is there any particular paradigm more suitable?
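
    What is being described sounds close to discrete-event simulation: pending events sit in a priority queue keyed by simulated time and are executed strictly in timestamp order. A minimal sketch of that paradigm (all names illustrative, not from any particular framework):

        import java.util.PriorityQueue;

        // Events are executed strictly in simulated-time order, so accuracy
        // does not depend on wall-clock speed or on the order in which
        // entities happen to be iterated.
        public class Simulator {
            public static abstract class Event implements Comparable<Event> {
                final double time; // simulated time, not wall-clock time
                Event(double time) { this.time = time; }
                public int compareTo(Event other) { return Double.compare(time, other.time); }
                abstract void execute(Simulator sim); // entity logic; may schedule further events
            }

            private final PriorityQueue<Event> queue = new PriorityQueue<>();
            private double now = 0.0;

            public void schedule(Event e) { queue.add(e); }
            public double now() { return now; }

            public void run(double endTime) {
                while (!queue.isEmpty() && queue.peek().time <= endTime) {
                    Event e = queue.poll();
                    now = e.time;    // jump the clock to the event's timestamp
                    e.execute(this);
                }
                now = endTime;
            }
        }

    Because the queue orders by timestamp rather than by insertion or iteration order, events scheduled by different entities for the same step are still handled at the correct simulated time, however slow the processing is.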

  • Standards for how developers work on their own workstations

    - by Jon Hopkins
    We've just come across one of those situations that occasionally comes up when a developer goes off sick for a few days mid-project. There were a few questions about whether he'd committed the latest version of his code or whether there was something more recent on his local machine we should be looking at, and we had a delivery to a customer pending, so we couldn't wait for him to return. One of the other developers logged on as him to look and found a mess of workspaces, many seemingly of the same projects, with timestamps that made it unclear which one was "current" (he was prototyping some bits on versions of the project other than his "core" one). Obviously this is a pain in the neck; however, the alternative (which would seem to be strict standards for how each developer works on their own machine, to ensure that any other developer can pick things up with a minimum of effort) is likely to break many developers' personal workflows and lead to inefficiency on an individual level. I'm not talking about standards for checked-in code, or even general development standards; I'm talking about how a developer works locally, a domain generally considered (in my experience) to be almost entirely under the developer's own control. So how do you handle situations like this? Is this one of those things that just happens and you have to deal with, the price you pay for developers being allowed to work in the way that best suits them? Or do you ask developers to adhere to standards in this area: use of specific directories, naming standards, notes on a wiki, or whatever? And if so, what do your standards cover, how strict are they, how do you police them, and so on? Or is there another solution I'm missing? [Assume for the sake of argument that the developer cannot be contacted to talk through what he was doing. Even if he could, knowing and describing which workspace is which from memory isn't going to be simple and flawless, and sometimes people genuinely can't be contacted; I'd like a solution which covers all eventualities.]

  • Building a Redundant / Distributed Application

    - by MattW
    This is more of a "point me in the right direction" question. I (and my team of 3) have built a hosted web app that queues and routes customer chat requests to available customer service agents (it does other things as well, but this is enough background to illustrate the issue). The basic dev architecture today is:
    - a single-page AJAX web UI (ASP.NET MVC) with floating chat windows (think Gmail)
    - a backend Windows service to queue and route the chat requests; this service also logs the chats, calculates service levels, etc.
    - a Comet server product that routes data between the web frontend and the backend Windows service; this also helps us detect which agents are still connected (online)
    And our hardware architecture today is:
    - 2 servers to host the web UI portion of the application
    - a load balancer to route requests to the 2 different web app servers
    - a third server to host the SQL Server DB and the backend Windows service responsible for queuing / delivering chats
    So as it stands today, one of the web app servers could go down and we would be OK. However, if something happened to the SQL Server / Windows service server, we would be boned. My question: how can I make this backend Windows service logic able to be spread across multiple machines (distributed)? The Windows service is written to accept requests from the Comet server, check for available agents, and route the chat to those agents. How can I make this more distributed, so that the work of the backend Windows service is spread across multiple machines for redundancy and uptime purposes? Will I need to rewrite it with distributed computing in mind? I should also note that I am hosting all of this on Rackspace Cloud instances, so maybe it is something I should be less concerned about? Thanks in advance for any help!
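
    One pattern that often comes up for this kind of problem is "competing consumers": make the routing service stateless, keep pending chat requests in a durable shared queue (a message broker, or even a SQL table with row locking), and run several identical service instances that each pull from it. A rough sketch of the shape, with all types and names illustrative rather than any real library's API:

        // Each worker instance, on any machine, runs this same loop; the queue
        // guarantees a request is delivered to exactly one consumer, and
        // unacknowledged requests are redelivered if a worker dies mid-route.
        class ChatRequest {}
        class Agent { void deliver(ChatRequest r) { /* push via the Comet server */ } }
        interface AgentDirectory { Agent findAvailable(); } // shared state, e.g. kept in the DB
        interface DurableQueue {
            ChatRequest take() throws InterruptedException; // blocks until work arrives
            void acknowledge(ChatRequest r);                // remove only after successful routing
        }

        class RoutingWorker implements Runnable {
            private final DurableQueue queue;
            private final AgentDirectory agents;

            RoutingWorker(DurableQueue queue, AgentDirectory agents) {
                this.queue = queue;
                this.agents = agents;
            }

            public void run() {
                while (!Thread.currentThread().isInterrupted()) {
                    try {
                        ChatRequest request = queue.take();
                        Agent agent = agents.findAvailable(); // a real version would wait/retry if none free
                        agent.deliver(request);
                        queue.acknowledge(request); // if this worker crashes first, the request is redelivered
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
        }

    With the queued chats and the agent/online state moved out of any single process's memory, any routing instance can crash without losing work; the SQL Server itself then becomes the remaining single point of failure to address (mirroring/replication).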

  • Open Source Web-based CMS for writing and managing API documentation

    - by netcoder
    This is a question that has somewhat been asked before (i.e.: How to manage an open source project's documentation). However, my question is a little different because:
    - We're not developing open source software, but proprietary software
    - The documentation has to be hand-written, because we do not want to publish the actual software API documentation, only the public API documentation
    - I do want developers and project managers to write the documentation collaboratively
    Obviously, wikis are a solution, but they're very generic. I'm looking for a more specialized tool for this job. I've looked around and found a few options like Adobe RoboHelp, SaaS solutions and such, but I'd like to know if any open source software exists for this purpose. Do you know any open source web-based CMS for writing and managing API and software documentation?

  • How do I inject test objects when the real objects are created dynamically?

    - by JW01
    I want to make a class testable using dependency injection. But the class creates multiple objects at runtime, and passes different values to their constructor. Here's a simplified example:

        public abstract class Validator {
            private ErrorList errors;

            public abstract void validate();

            public void addError(String text) {
                errors.add(new ValidationError(text));
            }

            public int getNumErrors() {
                return errors.count();
            }
        }

        public class AgeValidator extends Validator {
            public void validate() {
                addError("first name invalid");
                addError("last name invalid");
            }
        }

    (There are many other subclasses of Validator.) What's the best way to change this, so I can inject a fake object instead of ValidationError? I can create an AbstractValidationErrorFactory, and inject the factory instead. This would work, but it seems like I'll end up creating tons of little factories and factory interfaces for every dependency of this sort. Is there a better way?
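
    For reference, a minimal sketch of the factory approach the question describes, reusing the question's own ErrorList and ValidationError types (the factory name is illustrative). The factory becomes the single injected seam, so a test can substitute a fake without touching any Validator subclass:

        interface ValidationErrorFactory {
            ValidationError create(String text);
        }

        public abstract class Validator {
            private final ErrorList errors;
            private final ValidationErrorFactory errorFactory;

            protected Validator(ErrorList errors, ValidationErrorFactory errorFactory) {
                this.errors = errors;
                this.errorFactory = errorFactory;
            }

            public abstract void validate();

            public void addError(String text) {
                // production wiring injects text -> new ValidationError(text);
                // tests inject a factory that returns a recording fake
                errors.add(errorFactory.create(text));
            }

            public int getNumErrors() {
                return errors.count();
            }
        }

    Subclasses like AgeValidator only pass the two constructor arguments through, so the proliferation the question worries about is limited to one small interface per dynamically created type.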

  • MonoGame: reliable enough to be accepted on iOS, Win 8 and Android stores?

    - by Serguei Fedorov
    I love XNA; it simplifies rendering code to the point where I don't have to deal with it, it runs on C#, and it has a fairly large community and documentation. I would love to be able to use it for games across many platforms. However, I am a little bit concerned about how well it will be received by platform owners; Apple has very tight rules about code bases, but Android does not. Microsoft's new Windows 8 platform seems to be pretty lenient, but I am not sure how they would respond to an XNA project being pushed to the app store (given that they suddenly decided to dump XNA and push developers toward C++/Direct3D). So the bottom line is: is it safe to invest time and energy into a project that runs on MonoGame? In the end, is it possible to see my game on multiple platforms and not be shot down with a useless product?

  • Benchmarking CPU processing power

    - by Federico Zancan
    Given that many benchmarking tools are already available, I'd like to write my own, starting with processing power measurement. I'd like to write it in C under Linux, but other language alternatives are welcome. I thought of starting from floating point operations per second, but that is just a hint. I also thought it would be sensible to keep track of the number of CPU cores, the amount of RAM and the like, to more consistently associate results with the CPU architecture. How would you proceed with the task of measuring CPU computing power? And on top of that: I worry about the background workload induced by concurrently running services; is it correct to run the benchmark as a standalone process (possibly isolated from the OS environment)?
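
    As a starting point, the core of a FLOPS measurement is just a timed arithmetic loop whose result is kept alive so the compiler cannot delete it as dead code. A rough sketch (shown in Java purely for illustration; the same structure applies in the asker's preferred C, with clock_gettime in place of System.nanoTime):

        public class FlopsBench {
            public static void main(String[] args) {
                final long iterations = 200_000_000L;
                double x = 1.0;
                final double y = 1.000000001;

                // Warm-up pass: lets a JIT-compiled language measure steady state;
                // in C it would still warm caches and lift the CPU out of idle states.
                for (long i = 0; i < iterations / 10; i++) x = x * y + 0.5 - 0.5;

                long start = System.nanoTime();
                for (long i = 0; i < iterations; i++) {
                    x = x * y + 0.5 - 0.5; // 3 floating-point ops per iteration
                }
                double seconds = (System.nanoTime() - start) / 1e9;

                // Printing x keeps the loop from being optimized away.
                System.out.printf("%.1f MFLOPS on %d cores (check value %f)%n",
                        (3.0 * iterations / seconds) / 1e6,
                        Runtime.getRuntime().availableProcessors(), x);
            }
        }

    This measures one dependent chain on one core; a real tool would add vectorized and multi-threaded variants. As for background services: pinning the process to a core (taskset on Linux) and benchmarking a quiet system reduces their interference, which is usually more practical than trying to run outside the OS entirely.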

  • How to explain to a layperson why a developer should not be interrupted while neck-deep in coding?

    - by András Szepesházi
    If you just consider the second part of my question, "why a developer should not be interrupted while neck-deep in coding", that has been discussed a number of times by smart people. Heck, even the co-founder of SO, Joel Spolsky, wrote a blog post about "getting in the zone" and "being knocked out of the zone", and why it takes an average of 15 minutes to regain productivity when engaged in complex software development tasks. So I think the why has been established. What I'm interested in is how to explain all that to somebody who doesn't know beans about Beans (ahem, I mean software development). How to tell the wife, or the funny guy from accounting at the workplace, or the long-time friend who pings you on Skype every 30 minutes with a "Wazzzzzzup?!", that all the interruptions have a much deeper impact on your work than the obvious 30 seconds they took from your time. Obviously you can't explain it with sentences like "I have to juggle a lot of variable names in my short-term memory", unless you want to be the target of blank stares or friendly abuse. I'd like to be able to explain all that to non-developers in a way that will make them clearly understand, without being offensive, elitist or too technical.

  • Dependency Checker/Installer with Java/Ant

    - by jsn
    I need some kind of software to easily roll out code on new servers. I use Apache Ant for builds. However, say I want to set up a new server fast and my Java program depends on GhostScript: is there any software that can automatically check the computer for it (and then maybe the PATH) and add it if it is not there? I have already looked at Maven and Apache Ivy; however, I think these only manage .jar files (from what I saw). Thanks for any help.
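
    The checking half (as opposed to installing, which is platform-specific) is simple enough to do directly in the Java program, or in a small tool invoked from an Ant target. A minimal sketch of the check:

        import java.io.File;

        // Scans the PATH for an executable before the program relies on it.
        public class DependencyCheck {
            static boolean onPath(String executable) {
                String path = System.getenv("PATH");
                if (path == null) return false;
                for (String dir : path.split(File.pathSeparator)) {
                    if (new File(dir, executable).canExecute()) return true;
                }
                return false;
            }

            public static void main(String[] args) {
                // The binary is "gs" on Unix; on Windows it is typically gswin64c.exe.
                if (!onPath("gs")) {
                    System.err.println("GhostScript not found on PATH - please install it.");
                    System.exit(1);
                }
            }
        }

    Automatically installing the missing dependency is the harder half, and in practice it usually gets delegated to the platform's package manager (e.g. apt-get install ghostscript) from a provisioning script rather than done by the build itself.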

  • Why is C++ predominant in programming contests and competitions?

    - by daniels
    I understand that C++ is a very fast language, but isn't C just as fast, or faster in some cases? Then you might say that C++ has OOP, but the amount of OOP you need for most programming puzzles is not that big, and in my opinion C could handle that. Here's why I am asking: I am very interested in programming contests and competitions, and I am used to coding in C for those. However, I noticed that the vast majority of people use C++ (e.g., 17 out of 25 finalists in Google Code Jam 2011 used it, while no one used C), so I am wondering if I am at a disadvantage going with C. Apart from the object orientation, what makes C++ a more suitable language for programming competitions? What features of the language should I learn and use to perform better in competitions? For background, I consider myself pretty proficient in C, but I am just starting to learn C++.

  • When does 'optimizing code' == 'structuring data'?

    - by NewAlexandria
    A recent article on ycombinator lists a comment with principles of a great programmer:
    #7. Good programmer: I optimize code. Better programmer: I structure data. Best programmer: What's the difference?
    Acknowledging that these are subjective and contentious concepts: does anyone have a position on what this means? I do, but I'd like to edit this question later with my thoughts, so as not to predispose the answers.
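
    One concrete way to read the quoted comment (a purely illustrative example, not a claim about its author's intent): many code-level optimizations evaporate once the data is shaped to fit the access pattern, at which point the two activities collapse into one.

        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;

        public class LookupExample {
            record User(int id, String name) {}

            // "Optimizing code": tune this loop all you like, it remains O(n) per call.
            static User findLinear(List<User> users, int id) {
                for (User u : users) {
                    if (u.id() == id) return u;
                }
                return null;
            }

            // "Structuring data": index once and every lookup becomes O(1);
            // the optimization now lives in the shape of the data, not the code.
            static Map<Integer, User> index(List<User> users) {
                Map<Integer, User> byId = new HashMap<>();
                for (User u : users) byId.put(u.id(), u);
                return byId;
            }
        }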

  • Problem with MVC3 application

    - by Pravin Patil
    I am working on an MVC3 application. I use Entity Framework, Ninject, FluentValidation and some more NuGet packages. I am using TortoiseSVN for versioning. Recently I changed the structure of my SVN repository, so my working copy of the MVC3 app was moved to a different folder in the repository. Now when I check out the copy from SVN, all the references that I had added through NuGet are lost (EF, Ninject and the rest of the NuGet packages show the yellow "missing" icon in References). This also happened to me before, when I tried to check out the app from SVN to some other folder. I had to manually add all the references again through NuGet. Am I doing anything wrong? Please guide. I hope I could explain my problem properly.

  • Can the overuse of custom taglibs disrupt the outsourcing of HTML designers?

    - by Renato Gama
    Yesterday a friend and I were talking about the overuse of custom taglibs. We create taglibs for everything! We create taglibs to wrap jQuery UI elements (tabs, buttons, etc.) and other plugins' elements as well. We often wrap them together in a single component. We use taglibs to the point that we have almost no pure HTML within the body tag. Our question is: is this a healthy habit? Imagine two situations: 1) We hire an HTML designer and bear the cost of a month for him to learn all this stuff. 2) We want to outsource the HTML development, but no company would learn our taglib library, or it would become more expensive. We love taglibs, as they've been a lovely shortcut for JavaScript development, since we write everything only once. What would be the best practices in this sense, and what would you suggest? We are looking for a future-proof solution (or an argument that agrees with ours).

  • Which is more valuable in product development: an action-oriented or visionary bent?

    - by Marc
    As a software development professional in a fairly conservative large firm, I always had a much more action-oriented bent, as my job was fairly stable and all that mattered was doing as I was told and completing tasks that were germane to the career of a benevolent dictator (i.e., my boss's boss). Now that I'm no longer working for "the Man", I find it just as important to use the left side of my brain and wrap my head around this whole "vision thing". Which do you think is more important for software product development in a small yet feisty start-up: knowing the path, or walking (or running) it?

  • What tools exist for assessing an organisation's development capability?

    - by Eric Smith
    I have a bit of a challenge at work at the moment. Presently (and in fact, for some time now), we have been experiencing the following problems with some in-house maintained applications:
    - Defects (sometimes quite serious) being released into production;
    - The customer (that is, the relevant business unit) perpetually changing their minds (or appearing to do so) about what issue to work on next;
    - A situation where everyone seems to be in "fire-fighting" mode a lot of the time;
    - Development staff responding to operational requests from business users ("operational" here means something that needs to be done in order to continue with business, or perhaps just to make a business user's life a little less painful, as opposed to fixing a bug in the application, or enhancing the application).
    Now I'm sure this doesn't sound particularly new or surprising to most of the participants on this Q&A site, and no prizes for identifying the "usual suspects" when it comes to root causes. My challenge is that I have to persuade the higher-ups to do uncomfortable things in order to address all of this. The folk I need to persuade come from a mixture of two cultures: accounting and IT infrastructure. I have therefore opted for a strategy that draws on things with which folk from such cultures would be most comfortable (at least, in my estimation), namely numbers and tangibles. Of course, modern development practitioners know all too well that this sort of thing isn't easily solved using an analytical mindset (some would argue that that mindset is, in fact, entirely inappropriate). Nevertheless, this is the dichotomy with which I am faced, so that's the stake I've put in the ground. I would like to be able to do research and use the outputs to present findings in the form of metrics and measures. I am finding it quite difficult, though, to find an agreed-upon methodology and set of templates for assessing an organisation's development capability; the only thing that seems applicable is the Software Engineering Institute's Capability Maturity Model, and even that seems dated and rather vague. So, the question is: do any tools or methodologies (free or commercial) exist that would assist me in completing this assessment?

  • Why do large IT projects tend to fail or have big cost/schedule overruns?

    - by Pratik
    I always read about large-scale transformation or integration projects that are total or almost total disasters. Even if they somehow manage to succeed, the cost and schedule blowout is enormous. What is the real reason behind large projects being more prone to failure? Can agile be used in this sort of project, or is the traditional approach still the best? One example from Australia is the Queensland Payroll project, where they changed the test success criteria in order to deliver the project. See some more failed projects in this SO question. Have you got any personal experience to share?

  • Going back to ASP.NET WebForms from ASP.NET MVC. Recommend patterns/architectures?

    - by jlnorsworthy
    To many of you this will sound like a ridiculous question, but I am asking because I have little to no experience with ASP.NET WebForms; I went straight to ASP.NET MVC. I am now working on a project where we are limited to .NET 2.0 and Visual Studio 2005. I liked the clean separation of concerns when working with ASP.NET MVC, and am looking for something to make WebForms less unbearable. Are there any recommended patterns or practices for people who prefer ASP.NET MVC but are stuck on .NET 2.0 and Visual Studio 2005?

  • Calendar like tool for managing multiple clients simultaneously?

    - by Yuji Tomita
    I've tried several systems to keep my clients' requests and work organized, but somehow things still fall through the cracks. I don't like the idea of a bug-tracker-like site for every client (I would not check them all). Ideally, it's all on one page, but separated by client. What systems do you use? I'm about to just use a spreadsheet with columns for clients. Just curious if there's something better out there :) I've seen Smartsheet in action, which is basically a really nice spreadsheet that shows bars of time between things that are due. That looks promising.

  • Syntax logic suggestions

    - by Anna
    This syntax will be used inside HTML attributes. Here are a few examples of what I have so far:

        <input name="a" conditions="!b, c" />
        <input name="b" />
        <input name="c" />

    This will make input "a" do something if b is not checked and c is checked (b and c are assumed to be checkboxes if they don't have a :value defined).

        <input name="a" conditions="!b:foo|bar, c:foo" />
        <input name="b" />
        <input name="c" />

    This will make input "a" do something if b doesn't have the foo or bar values, and if c has the foo value.

        <input name="a" conditions="!b:EMPTY" />
        <input name="b" />

    This makes input "a" do something if b has a value assigned. So, essentially, the comma (,) acts as logical AND, the colon (:) as equals (=), ! as NOT, and | as OR. The | (OR) is only needed between values (at least I think so), and AND is not needed between values for obvious reasons :) EMPTY means an empty value, like <input value="" />. Do you have any suggestions on improving this syntax, like making it more human-friendly? For example, I think the "EMPTY" keyword is not really appropriate and should be replaced with a character, but I don't know which one to choose.
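
    One quick sanity check for a mini-syntax like this is to write a throwaway evaluator and run the examples through it; ambiguities surface immediately. A sketch (in Java for illustration rather than the JavaScript that would presumably run in the page; input state is modelled as a name-to-value map, with a checked checkbox as a non-empty value):

        import java.util.Map;

        public class ConditionEvaluator {
            // ',' is AND, ':' is equals, '!' is NOT, '|' is OR between values,
            // and the EMPTY keyword stands for an empty value.
            static boolean evaluate(String conditions, Map<String, String> values) {
                for (String term : conditions.split(",")) {
                    term = term.trim();
                    boolean negated = term.startsWith("!");
                    if (negated) term = term.substring(1);

                    String[] parts = term.split(":", 2);
                    String actual = values.getOrDefault(parts[0], "");

                    boolean match;
                    if (parts.length == 1) {
                        match = !actual.isEmpty(); // bare name: checkbox is checked
                    } else {
                        match = false;
                        for (String candidate : parts[1].split("\\|")) {
                            String expected = candidate.equals("EMPTY") ? "" : candidate;
                            if (actual.equals(expected)) { match = true; break; }
                        }
                    }
                    if (match == negated) return false; // this AND term failed
                }
                return true;
            }

            public static void main(String[] args) {
                // "!b:foo|bar, c:foo": b must be neither foo nor bar, and c must be foo.
                System.out.println(evaluate("!b:foo|bar, c:foo", Map.of("b", "baz", "c", "foo"))); // true
                System.out.println(evaluate("!b:EMPTY", Map.of("b", "x")));                        // true: b has a value
            }
        }

    Under these semantics, writing the evaluator also suggests an answer to the EMPTY question: a bare name already means "has a (checked) value", so !b can express b:EMPTY and the keyword may be unnecessary.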

  • Why is prefixing column names considered bad practice?

    - by P.Brian.Mackey
    According to a popular SO post, it is considered a bad practice to prefix column names with the table name. At my company every column is prefixed by its table name. This is difficult for me to read. I'm not sure of the reason, but this naming is actually the company standard. I can't stand the naming convention, but I have no documentation to back up my reasoning. All I know is that reading AdventureWorks is much simpler. In our company DB you will see a table Person, and it might have a column named Person_First_Name, or maybe even Person_Person_First_Name (don't ask me why you see Person twice). Why is it considered a bad practice to prefix column names? Are underscores considered evil in SQL as well? Note: I own Pro SQL Server 2008 - Relational Database Design and Implementation. References to that book are welcome.

  • When is a Use Case layer needed?

    - by Meta-Knight
    In his blog post The Clean Architecture, Uncle Bob suggests a 4-layer architecture. I understand the separation between business rules, interfaces and infrastructure, but I wonder if/when it's necessary to have separate layers for domain objects and use cases. What added value does it bring, compared to just having the use cases as "domain services" in the domain layer? The only useful info I've found on the web about a use case layer is an article by Martin Fowler, who seems to contradict Uncle Bob about its necessity: "At some point I may run into the problems, and then I'll make a Use Case Controller - but only then. And even when I do that I rarely consider the Use Case Controllers to occupy a separate layer in the system architecture." Edit: I stumbled upon a video of Uncle Bob's Architecture: The Lost Years keynote, in which he explains this architecture in depth. Very informative.
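
    For readers unfamiliar with the term: a "use case" in Uncle Bob's sense is typically a small interactor that sequences entities and talks to the outer layers only through interfaces it owns. A minimal sketch of what the separate layer would contain (all names illustrative):

        // Entity layer: enterprise-wide business rules live here.
        class Order {
            void addLine(String productId, int quantity) { /* invariants enforced here */ }
        }

        // Use case layer: application-specific orchestration.
        public class PlaceOrderUseCase {
            // Boundary owned by the use case; implemented in the infrastructure layer.
            public interface OrderRepository {
                Order findOpenOrderFor(String customerId);
                void save(Order order);
            }

            private final OrderRepository orders;

            public PlaceOrderUseCase(OrderRepository orders) {
                this.orders = orders;
            }

            public void execute(String customerId, String productId, int quantity) {
                Order order = orders.findOpenOrderFor(customerId);
                order.addLine(productId, quantity); // the domain rule stays in the entity
                orders.save(order);
            }
        }

    What the extra layer buys is the dependency direction (use case depends on entity, never the reverse); whether that is worth a whole layer, versus a domain service inside the domain layer, is exactly the judgment call the question is asking about.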

  • How and when to ask for a pay raise?

    - by Nico
    When should one ask for a pay raise? Will I know when the time is right, or should I just think "I deserve a raise for X and Y"? What would be a good moment to ask? For instance, if you are in a company that outsources to others, could the right moment be when they move you to a different physical workplace? Maybe a few weeks/months after you started working as a consultant at the client? Should you ask for one after taking on new technologies or something you've never worked with before? In short, should you ask for a raise for a "business motive" (they move you, they assign you new responsibilities), a "professional motive" (you are required to learn new languages or technologies), or a "personal motive" (you are having twins, your mother died and you need to arrange the funeral), or are all of the above potentially valid motives? How should one ask for it? Asking for a pay raise can be difficult for some people; how do you deal with this? Do you just walk up to your manager and tell him "I need more money" or "I think I deserve a pay raise"? Do you suggest you might have other offers on the table? Couldn't this be counterproductive if you actually want to stay in the company you are in (because you like the environment, made a few friends, and like all the perks they give you besides your pay: free sodas, parties, after-office events that happen pretty often, a PS3 you can grab when you are tired or want to chill out, courses, English classes, football games, etc. [these would be my reasons not to leave])? I mean, how would you ask for a pay raise, effectively, but without appearing to threaten to leave the company if you don't get it? Because you don't actually want to. How would you deal with their answer? If they tell you they don't think you deserve a raise, would you ask for their reasons, or would you get furious and trash the room? If they give you their reasons why they think you don't deserve a raise yet, would you discuss this with them, or just take their opinion as fact? What if they ask you how much more you think you deserve to be paid? Should you have thought about this beforehand, or expect them to set the new figure? And if they do agree to a raise, should you expect extra work to be thrown your way, or should everything remain the same except your pay?
