Search Results

Search found 30414 results on 1217 pages for 'project failure'.

  • Technology/Techniques to prevent offensive images on a website

    - by Andreas Siegers
    I am planning to build a website whose main feature is user-uploaded pictures. I was wondering what existing techniques are used to prevent offensive pictures (e.g. pornography) from being uploaded by users. For example, what do Facebook or Pinterest use? I would also like to know what you would recommend for controlling offensive pictures being uploaded to the site (open-source tools, maybe), taking into consideration that this is a personal project that will be developed on a very small budget. Thank you.

  • Secure Web Apps from SQL Injection in ASP.Net

    In the first part of this two-part series you learned how SQL injection works in ASP.NET 3.5 against a MS SQL database. You were also shown, using a real web application that was not secured against SQL injection, how an attacker can use these attacks to delete sensitive information from your website, such as database tables. In this part you will learn how to start securing your web applications so they are not vulnerable to these kinds of exploits. A complete, corrected version of the insecure web application is provided at the end of this tutorial.
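
    The corrected application itself is in the linked article and is not reproduced here. The standard defence against this class of attack is to stop concatenating user input into SQL text and to use parameterized commands instead; a minimal sketch of that idea (the connection string, table and column names are placeholders, not taken from the tutorial):

        using System.Data.SqlClient;

        public static bool UserExists(string userName)
        {
            // Placeholder connection string - substitute your own.
            const string connStr = "Server=.;Database=AppDb;Integrated Security=true";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "SELECT COUNT(*) FROM Users WHERE UserName = @userName", conn))
            {
                // The user-supplied value is bound as a parameter, so it is
                // always treated as data and never as executable SQL.
                cmd.Parameters.AddWithValue("@userName", userName);
                conn.Open();
                return (int)cmd.ExecuteScalar() > 0;
            }
        }

    The same pattern applies to INSERT, UPDATE and DELETE statements and to stored procedure calls.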

  • Help Improve Oracle Products Usability at OOW

    - by Shay Shmeltzer
    We already wrote about all the great ADF-related activities at OOW, but we wanted to also let you know about an additional activity you can participate in at OpenWorld: the Oracle Middleware User Experience team will be conducting focus groups and customer feedback activities at Oracle OpenWorld 2012 (Oct. 1st - Oct. 3rd). Customer participation helps Oracle develop outstanding products and solutions. Professionals of all types are invited to participate: directors, project and product managers, finance, sales, human resources, marketing, recruiters, budget managers, and more. To participate in these sessions you do not have to be registered for Oracle OpenWorld. If you or someone you know is interested in participating, please email [email protected] with your name, company name, job title, email address, and phone number (work or mobile, including country code).

  • OpenWonderland goes to Europe

    - by user12652314
    Beyond my day job, I also have the privilege of being Vice President and a member of the board of the OpenWonderland Foundation (formerly Project Wonderland, for ex-Sun folks). It's great to see the progress Wonderland is making as the platform of choice for open-source collaborative 3D virtual worlds built in Java. The upcoming, first European IE Summit in Madrid will show a number of projects and innovations for virtual and game-based learning. I also dropped Ultan a note after learning about the gamification of Oracle Apps, to see whether collaborating with kids and schools is an area we can work together on to improve kids' engagement and learning experiences online. Stay tuned.

  • Freelancers, do you charge for going on-site to do work?

    - by user35072
    I'm new to freelancing as a programmer and need to work out what some of the "norms" are without making myself look like an amateur. I've already won some work from a local company doing C# development and have quoted an hourly rate for work that I am doing from my own office. However, on one upcoming project I've been asked to come on-site (to the client's office) to work for a full week. Is it reasonable to charge more than my regular hourly rate for working on-site? And how should I justify the extra charge?

  • How do I share my jQuery plugin with the world?

    - by Billy Moon
    I made a plugin for jQuery which I think is quite useful. It combines an animated-colours plugin with an easings plugin, and adds a completely new feature: colours can be referred to numerically (most usefully in hex notation, e.g. 0xff00cc) and can therefore be manipulated mathematically more easily. I created a repository on GitHub, and it just sits there; nobody looks at it. Mostly, I would like people to look at it, use it and improve it, so I can use the improvements and so on. I think this idea of numerically manipulating colours could be interesting: it makes it easy to change the hue without changing the saturation, for example. Combined with animated colours, I think something interesting could be done, but I don't know what exactly. How do I let people know it (or any other project) is there? I was going to post it on http://plugins.jquery.com, which is currently down. Are there any other outlets for this kind of code?

  • Links in my site have been hacked

    - by Funky
    In my site I prefix the images and links with the domain of the site for better SEO, using the code below:

        public static string GetHTTPHost()
        {
            string host = "";
            if (HttpContext.Current.Request["HTTP_HOST"] != null)
                host = HttpContext.Current.Request["HTTP_HOST"];

            if (host == "site.co.uk" || host == "site.com")
            {
                return "http://www." + host;
            }
            return "http://" + host;
        }

    This works great, but for some reason lots of links have now changed to http://www.baidu.com/... There is no sign of this anywhere in the code or the project, and the files on the server have a change date from when I last published at 11 yesterday, so all the files on there look fine. I am using ASP.NET and Umbraco 4.7.2. Does anyone have any ideas? Thanks
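
    One detail worth noting, independent of how the bad links got in: the method above reflects whatever Host header the client sends whenever it is not one of the two known domains. A more defensive sketch - an illustration only, assuming site.com is the canonical host, and not a diagnosis of the baidu.com links - never echoes an unknown host back:

        public static string GetHTTPHost()
        {
            string host = HttpContext.Current.Request["HTTP_HOST"] ?? "";

            // Only emit hosts that are explicitly trusted; fall back to the
            // canonical domain instead of reflecting the request header.
            if (host == "site.co.uk" || host == "site.com")
                return "http://www." + host;

            return "http://www.site.com";
        }

    This does not explain how the stored links changed, but it does keep a forged Host header out of your markup.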

  • Sync, share and backup policy using NAS

    - by Cue
    I'm trying to come up with a way to keep my music, photos and movies in sync while also sharing them and keeping a backup. Currently I have an iMac in Greece and a MBP with me in the UK. As a result I've ended up with two iPhoto and two iTunes libraries, not to mention documents scattered here and there, user settings, etc. I also like to have a backup in case of a drive failure or the need for a clean install. It seems that iPhoto and iTunes don't work really well with networked libraries. The way I think about it is to have a NAS where I keep my iTunes and iPhoto libraries, but also rsync daily to my MBP so I have a local copy. That way my files are shared across the network and also act as a backup, and I get to have my files wherever I take my MBP while keeping the ability to do a clean install. The tricky part is keeping the iMac, which is miles away, in sync. Again I'm considering a mirror setup (NAS, rsync to the iMac) as well as an rsync between the two NAS boxes. It pretty much resembles the way Dropbox works, minus the requirement to go through their servers, but I'm no "superuser" and don't really know if such a setup is even feasible. It looks like there are so many things that can go wrong.

  • XNA GameTime TotalGameTime slower than real time

    - by robasaurus
    I have set up an empty test project consisting of a System.Diagnostics.Stopwatch and this in the Draw method:

        spriteBatch.DrawString(font, gameTime.TotalGameTime.TotalSeconds.ToString(), new Vector2(100, 100), Color.White);
        spriteBatch.DrawString(font, stopwatch.Elapsed.TotalSeconds.ToString(), new Vector2(100, 200), Color.White);

    The GameTime.TotalGameTime displayed is slower than the stopwatch (by about 5 seconds per minute), even though GameTime.IsRunningSlowly is always false. Why is this? The reason it matters is that I have a server which uses a Stopwatch, and it runs faster than my client game. For instance, my client notifies the server that it has dropped a mine which explodes in one minute. Because the stopwatch is faster, the server's state explodes the mine before the client does and the two are out of sync. I don't want to have to notify the client when the server explodes it, as this would use unnecessary bandwidth.
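
    One way to sidestep the drift, rather than explain it, is to schedule gameplay deadlines against the same kind of wall-clock timer on both ends instead of against GameTime. A minimal client-side sketch, offered only as an assumption about how the mine could be timed (the MineTimer name is made up for illustration):

        using System;
        using System.Diagnostics;

        public class MineTimer
        {
            private readonly Stopwatch clock = Stopwatch.StartNew();
            private TimeSpan detonateAt;

            public void Arm(TimeSpan fuse)
            {
                // The deadline is measured against wall-clock time, the same
                // time base the server's Stopwatch uses, not against GameTime.
                detonateAt = clock.Elapsed + fuse;
            }

            public bool ShouldDetonate()
            {
                return clock.Elapsed >= detonateAt;
            }
        }

    Polling ShouldDetonate() from Update keeps the client's fuse within network latency of the server's, regardless of how GameTime accumulates.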

  • Can decoupling hurt maintainability in certain situations?

    - by Ceiling Gecko
    Can the fact that business logic is mapped to interfaces instead of implementations actually hinder the maintenance of an application in certain situations? A naive example with Java's Hibernate framework: suppose I don't have the whole code-base in my head, the project structure is a mess, and classes have arbitrary names. If I want to see what's going on in a certain DAO, to check that it actually does what it's supposed to do, then traversing backwards up the call tree from the point where the data service is invoked ends at an interface with no implementation details whatsoever apart from the signature. Before I can get to the actual implementation, I have to go and look for a configuration XML file to see which class is mapped to that interface as the implementation. Are there any situations where having loose coupling can actually hurt maintainability?

  • How do I justify upgrading to Windows Server 2008?

    - by thebunk
    We're just about to start a new greenfield project: a highly functional web application using ASP.NET MVC3, SQL Server, etc. We're also going to be using Windows Workflow Foundation (WF) for the first time. Our client only wants to use his existing Windows Server 2003 web servers. My main issue (other than the OS being 8 years old) is that we don't have much experience with WF development, but we understand that using AppFabric (Server 2008 only) will improve it. The upgrade is a significant cost to the client, as we need fail-over servers and a UAT environment as well. Am I correct in my understanding, and what arguments can I use to justify the cost of upgrading?

  • Nagios DNX plugins

    - by danneh3826
    I'm toying with the idea of multiple Nagios instances set up to monitor our infrastructure. I've looked at the various methods of distributed Nagios checks, and I think DNX comes out the closest. DNX handles failure of worker nodes, which is fine, but what happens if the main DNX server fails? Is there a way to replicate that server too? I'm using AWS EC2 primarily, so I can use Elastic Load Balancing for the web UI, but I need to be able to handle failure of the AZ where the monitoring server lives and have a second server pick up the checking load (active/passive or active/active, as long as the system doesn't fail completely). The other thing I'm trying to solve is an issue with routing. What I'd like is to have multiple nodes report a fault before Nagios confirms it as critical. Not for the NRPE checks, as they're pretty self-explanatory, but for things more like check_ping. I often have routing issues out of AWS to certain datacenters, so Nagios can report bad/no ping or a timeout as a critical issue even though the machine in question is working fine. Would it be possible to have a setup where one worker reports a service check as critical, and a second worker node (positioned in another datacenter/AZ) must also report the service as critical, before the Nagios central server issues a critical alert? I realise I might be asking a bit much (how far down the line do you go setting up failover systems before it starts to get ridiculous), but surely someone must have thought of this scenario when developing DNX?

  • How to convince a non-technical client that their application spec needs to be simplified?

    - by Ryan
    Often I am faced with the situation where a new client comes to me with an application spec that has literally hundreds of unnecessary features, and it is quite clear that things need to be drastically simplified for the project to have any chance of succeeding. How do you convince the client to take a more MVP approach and simplify? Edit: the current top answer is to provide the client with a time/cost estimate for the huge application. I'm not too fond of this answer because it doesn't address the real problem with the situation, which is that it's bad practice to spec out a massive application and then try to build it from the get-go. I feel much more comfortable initially building a small, simple MVP foundation and then adding small features to that foundation one by one. So how do I convince the client to approach building software in this way?

  • 150 TB and growing, but how to grow?

    - by seandavi
    My group currently has two largish storage servers, both NAS boxes running Debian Linux. The first is an all-in-one 24-disk (SATA) server that is several years old; we have two hardware RAIDs set up on it with LVM over those. The second server has 64 disks divided over 4 enclosures, each a hardware RAID 6, connected via external SAS; we use XFS over LVM on that to create 100 TB of usable storage. All of this works pretty well, but we are outgrowing these systems. Having built two such servers, and still growing, we want to build something that gives us more flexibility for future growth and backup options, behaves better under disk failure (checking the larger filesystem can take a day or more), and can stand up in a heavily concurrent environment (think small compute cluster). We do not have system administration support, so we administer all of this ourselves (we are a genomics lab). So, what we seek is a relatively low-cost, acceptable-performance storage solution that will allow future growth and flexible configuration (think ZFS with different pools having different operating characteristics). We are probably outside the realm of a single NAS. We have been thinking about a combination of ZFS (on OpenIndiana, for example) or btrfs per server, with GlusterFS running on top of that, if we do it ourselves. What we are weighing that against is simply biting the bullet and investing in Isilon or 3PAR storage solutions. Any suggestions or experiences are appreciated.

  • Configuring PAM with pam_mount; getting a dlopen() with an HX_Init error

    - by Jamie
    I'm trying to get automounting upon login working on Ubuntu 10.04 Beta 2. I didn't find a package for pam_mount, so I ended up downloading it and building it. That required:

        sudo apt-get install build-essential pkg-config libxml2-dev libssl-dev libpam-dev

    Additionally, libHX-dev is required, but as of yesterday (23/4/2010) the packaged version (3.2) wasn't up to snuff (3.4 is needed), so I downloaded, compiled and installed that too, then built pam_mount itself:

        cd ./pam_mount-1.36/ && ./configure && make && sudo make install

    When I tried pam_mount I got this in my auth log:

        Apr 23 12:18:02 ubuntu sshd[1195]: PAM unable to dlopen(/lib/security/pam_mount.so): /lib/security/pam_mount.so: undefined symbol: HX_init
        Apr 23 12:18:02 ubuntu sshd[1195]: PAM adding faulty module: /lib/security/pam_mount.so
        Apr 23 12:18:06 ubuntu sshd[1195]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.168.20.182 user=jrisk
        Apr 23 12:18:06 ubuntu sshd[1195]: pam_winbind(sshd:auth): getting password (0x00000388)
        Apr 23 12:18:06 ubuntu sshd[1195]: pam_winbind(sshd:auth): pam_get_item returned a password
        Apr 23 12:18:06 ubuntu sshd[1195]: pam_winbind(sshd:auth): user 'jrisk' granted access
        Apr 23 12:18:06 ubuntu sshd[1195]: Accepted password for jrisk from 192.168.20.182 port 4369 ssh2
        Apr 23 12:18:06 ubuntu sshd[1195]: pam_unix(sshd:session): session opened for user jrisk by (uid=0)

    What do I need to do to get HX_init resolved on the system? This is related to an answer I previously got here.

  • 100% uptime for a web application

    - by Chris Lively
    We received an interesting "requirement" from a client today. They want 100% uptime with off-site failover on a web application. From the web application's viewpoint this isn't an issue; it was designed to scale out across multiple database servers, etc. However, from a networking standpoint I just can't seem to figure out how to make it work. In a nutshell, the application will live on servers within the client's network. It is accessed by both internal and external people. They want us to maintain an off-site copy of the system that, in the event of a serious failure at their premises, would immediately pick up and take over. Now we know there is absolutely no way to resolve it for internal people (carrier pigeon?), but they want the external users to not even notice. Quite frankly, I haven't the foggiest idea how this might be possible. It seems that if they lose Internet connectivity then we would have to make a DNS change to forward traffic to the external machines... which, of course, takes time. Ideas? UPDATE: I had a discussion with the client today and they clarified the issue. They stuck by the 100% number, saying the application should stay up even in the event of a flood. However, that requirement only kicks in if we host it for them; they said they would handle the uptime requirement if the application lives entirely on their servers. You can guess my response.

  • 3D touch "Minority Report" style interface - what platform gets me there the fastest?

    - by Ross Braden
    I'm working on a project that requires a touch interface, though the use case is desktop more than mobile. I want to start out platform-agnostic, not with a mobile app. There will be gridwork-type 3D objects and diagramming being represented - think AutoCAD or Minority Report. I want to build a prototype that has hooks into a database to represent the data. Any advice on what tools to use, both for the design and for the development of the functionality, is greatly appreciated. Thanks!

  • LAN or workgroup with Ubuntu server and windows clients

    - by Kenneth Fernando
    I have 35 standalone Windows 7 computers right now and I'm planning to set up a LAN/workgroup with an Ubuntu machine as the server. Purpose of the network: files from client computers need to be accessible to me, and vice versa, through a shared folder on the server. The files are only Word documents and very small project files. The network won't have Internet access at the moment. What I would like to know: how would I configure the Ubuntu server to recognize the clients, and will the clients be able to view the shared folders from their Windows machines simultaneously? Would appreciate feedback. Thanks
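
    The usual way to share folders between an Ubuntu server and Windows clients is Samba. A minimal, hedged smb.conf sketch - the share name, path and guest access here are illustrative assumptions, not requirements of the question:

        [global]
           workgroup = WORKGROUP
           server string = Ubuntu file server
           security = user
           map to guest = Bad User

        [projects]
           path = /srv/projects
           browseable = yes
           read only = no
           guest ok = yes

    With something like this in /etc/samba/smb.conf and the samba package installed, Windows 7 machines in the same workgroup can open \\servername\projects concurrently; swapping guest access for per-user Samba accounts (smbpasswd -a) gives each client its own credentials.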

  • Best strategy to discover a web service in a local network?

    - by Ucodia
    I am currently doing some research for a project. The setup is simple: I have a computer running a service on my home network, and any device connected to that same network should be able to discover the service automatically and use it. I have no specific technology requirement, whether on the server or the client side. The client knows the service definition. Other than that, I have no idea what strategy to use, what technology to look at, or whether I should go for a SOAP or an HTTP-based service. I think going with HTTP and a REST API is the best way to target all devices, but I am open to any suggestions. Thanks.
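
    One common zero-configuration strategy on a home LAN is a simple UDP broadcast: clients send a probe to a well-known port and the server replies with the URL of its HTTP API. The sketch below is only an illustration of that idea; the port number, probe string and Discovery class are made-up names, not part of the question:

        using System;
        using System.Net;
        using System.Net.Sockets;
        using System.Text;

        static class Discovery
        {
            const int Port = 30303;                      // made-up well-known port
            const string Probe = "WHERE-IS-MYSERVICE";   // made-up probe message

            // Server: answer each probe with the base URL of the HTTP API.
            public static void Listen(string serviceUrl)
            {
                using (var udp = new UdpClient(Port))
                {
                    while (true)
                    {
                        var sender = new IPEndPoint(IPAddress.Any, 0);
                        byte[] data = udp.Receive(ref sender);
                        if (Encoding.UTF8.GetString(data) == Probe)
                        {
                            byte[] reply = Encoding.UTF8.GetBytes(serviceUrl);
                            udp.Send(reply, reply.Length, sender);
                        }
                    }
                }
            }

            // Client: broadcast the probe and wait briefly for an answer.
            public static string Find()
            {
                using (var udp = new UdpClient())
                {
                    udp.EnableBroadcast = true;
                    udp.Client.ReceiveTimeout = 2000;
                    byte[] data = Encoding.UTF8.GetBytes(Probe);
                    udp.Send(data, data.Length, new IPEndPoint(IPAddress.Broadcast, Port));
                    var server = new IPEndPoint(IPAddress.Any, 0);
                    return Encoding.UTF8.GetString(udp.Receive(ref server));
                }
            }
        }

    Standardised alternatives to a hand-rolled broadcast include UPnP/SSDP, mDNS/DNS-SD (Bonjour) and WS-Discovery, which are worth comparing if the client devices already support one of them.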

  • Agile Tools For Handling Multiple Projects

    - by f1dave
    Currently I'm leading our agile team in an iteration-manager role as well as doing my regular dev work. One of the difficulties I'm facing as an IM is tracking burn-down/burn-up - not because I can't produce graphs, but because this team is working on multiple projects at one time. At present I have an Excel workbook with sheets that contain a whole bunch of graphs, both at the overall team level and per project. It's clunky, and I spend more time tweaking formulas and double-checking calculations than I'd really like. As such, I'm interested to know if anyone has used a tool that can effectively produce these sorts of reports, burn-downs and predictions across multiple projects. I've seen http://www.pivotaltracker.com/ do some nice things, and of course there's JIRA/GreenHopper, but I'm not aware of those being used to track the progress of multiple projects within one team. If anyone knows of some tools, or has faced a similar problem before, I'd love to hear from you.

  • Is an 'if password == XXXXXXX' enough for minimum security?

    - by Prof Plum
    If I create a login for an app that has a middle-to-low security risk (in other words, it's not a banking app or anything), is it acceptable for me to verify a password entered by the user by just saying something like:

        if (enteredPassword == verifiedPassword)
            SendToRestrictedArea();
        else
            DisplayPasswordUnknownMessage();

    It seems too easy to be effective, but I certainly would not mind if that was all that was required. Is a simple check on the username/password combo enough? Update: the particular project happens to be a web service, the verification is entirely server-side, and it is not open source. Does the domain change how you would deal with this?
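
    For context on what "enough" usually means here: the comparison itself is not the weak point; holding verifiedPassword as plain text is. A common pattern - sketched below as an assumption about how this could be done, not as the only acceptable approach - is to store a salt and a PBKDF2 hash and compare those instead:

        using System;
        using System.Linq;
        using System.Security.Cryptography;

        public static class PasswordHasher
        {
            private const int Iterations = 10000;

            // Called once, when the user sets a password; persist salt and hash.
            public static void Hash(string password, out byte[] salt, out byte[] hash)
            {
                salt = new byte[16];
                new RNGCryptoServiceProvider().GetBytes(salt);   // random per-user salt
                hash = new Rfc2898DeriveBytes(password, salt, Iterations).GetBytes(32);
            }

            // Called at login: re-derive the hash and compare against the stored one.
            public static bool Verify(string password, byte[] salt, byte[] expectedHash)
            {
                byte[] hash = new Rfc2898DeriveBytes(password, salt, Iterations).GetBytes(32);
                return hash.SequenceEqual(expectedHash);
            }
        }

    From the caller's point of view the login check is still a simple comparison, Verify(enteredPassword, storedSalt, storedHash), ideally made over HTTPS since this is a web service.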

  • Does saving my progress on a U1-synced file/folder put unnecessary strain on the servers?

    - by Chauncellor
    I love Ubuntu One and I use it all the time. I have my documents and music-composition folders set to sync, and it's been a real boon. However, sometimes I feel that constantly saving my progress forces a file to sync dozens and dozens of times to the servers. It seems wasteful, so I've been disconnecting U1 until I'm finished working on a project. Is this precaution unnecessary? I know it's using Amazon's storage, but I'm still paranoid that I'm costing Canonical money when I constantly save my progress.

  • DotNetNuke 5.3.0 Released

    I am happy to announce that the DotNetNuke 5.3.0 release is now available for download. This release marks the fourth month in a row where we have hit our targeted release date. That is a huge accomplishment for the project as the DotNetNuke Corporation engineering team is really starting to gel. During this release cycle we also had a number of significant contributions by core team members. Over the past year, as our development methodology has undergone change and we have hired more members for...

  • Powershell Script Scheduled Task Stopped Running (Could not Start)

    - by Hatsune Yuki
    I'm running a scheduled task (for a PowerShell script) on Windows Server 2003. I believe the script itself works fine. The task is scheduled to run every 10 minutes from 7:00am to 11:50pm every day. However, it never keeps running for more than a day; it always stops some time in the afternoon (between 2pm and 6pm). I'm not sure exactly what happens, but I always get this error:

        The attempt to log on to the account associated with the task failed, therefore, the task did not run.
        The specific error is:
        0x80070569: Logon failure: the user has not been granted the requested logon type at this computer.
        Verify that the task's Run-as name and password are valid and try again.

    It seems like most people with this error say the account needs the "log on as a batch job" right. However, that option is greyed out for me. I've searched for other places where users have similar problems, but the solutions are not written up in detail (some of them have something to do with GPO). I've only used the basic features of Windows Server and have no clue how to get to the place they are referring to. Can someone please confirm whether "log on as a batch job" is indeed the solution and provide a detailed walkthrough of how to apply it? Thanks. P.S. Someone suggested http://technet.microsoft.com/en-us/library/cc755659(v=ws.10). I tried to follow the method for a web server with a domain, but got stuck on the 6th step, where it mentions a Group Policy Object; I don't know where that is.

  • Must all new features go through betatest?

    - by LTR
    Obviously, small usability fixes and bug fixes go directly into the stable product. What about small new features? Can you afford to just release them after internal testing, or do they have to be beta-tested by customers first? Situation: this is a young commercial project, produced by a one-person company. It has an existing user base and is at its second major version. Previous beta tests have produced some results; however, most feedback came from the stable product rather than from beta versions.
