Search Results

Search found 25440 results on 1018 pages for 'agent based modeling'.


  • Do you tend to write your own name or your company name in your code?

    - by Connell Watkins
    I've been working on various projects at home and at work, and over the years I've developed two main APIs that I use in almost all AJAX-based websites. I've compiled both of these into DLLs and called the namespaces Connell.Database and Connell.Json. My boss recently saw these namespaces in the software documentation for a company project and said I shouldn't be using my own name in the code. (But it's my code!)

    One thing to bear in mind is that we're not a software company. We're an IT support company, and I'm the only full-time software developer here, so there aren't really any procedures on how we should write software. Another thing to bear in mind is that I do intend to release these DLLs as open-source projects one day.

    How do other developers group their namespaces within their company? Does anyone use the same class libraries in personal and in work projects? And does this work the other way round: if I write a class library entirely at work, who owns that code? If I've seen the library through from start to finish, designed it and programmed it, can I use it for another project at home?

    Update: I've spoken to my boss about this issue and he agrees that they're my objects and he's fine with me open-sourcing them. Before this conversation I had started changing the objects anyway, which was actually quite productive, and the code now suits this specific project better than it did before. But thank you to everyone involved for a very interesting debate. I hope all this text isn't wasted and someone learns from it. I certainly did. Cheers.
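    For what it's worth, one common convention is to root shared code under the organisation or the product rather than the author, which keeps the author's name out of work code entirely. A minimal C# sketch (the company and library names here are invented for illustration):

        // Hypothetical layout: company-rooted namespaces for internal projects,
        // and a product-named namespace for the reusable library so it can be
        // open-sourced later without a rename.
        namespace AcmeSupport.Intranet.Reporting
        {
            public class TicketReport { }
        }

        namespace JsonToolkit
        {
            public class JsonParser { }
        }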

    Read the article

  • Managed Cloud Services Wins Another Prestigious Industry Award

    - by Dori DiMassimo-Oracle
    Over the last 90 days, Oracle Managed Cloud Services has been the proud recipient of TWO prestigious industry awards for service excellence and customer value leadership. The most recent is last month's 2014 Frost & Sullivan Best Practice Award - North America Managed Cloud Customer Value Leadership Award, which rated Oracle Managed Cloud Services as the clear leader versus other providers; Managed Cloud received an "exceptional" rating in 9 of 10 evaluation categories. The research report is an excellent look at our industry and at what is valued by cloud customers looking for a managed solution. In April, Managed Cloud was a repeat winner of the Outsourcing Excellence Award - 2014 Outsourcing Excellence Award - Best ITO Infrastructure (Sony Computer Entertainment America). Last year we won the award for Best Cloud: 2013 Outsourcing Excellence Award - Best Cloud (Take-Two Interactive). These awards are a great testament to the transformation of Managed Cloud Services into a true cloud-based business and a strategic and relevant part of the Oracle Cloud Solutions portfolio. Frost & Sullivan, in particular, recognizes our vision and our ability to successfully manage business transactions in the cloud.

    Read the article

  • HTTP Caching Server that supports POST

    - by Jeroen
    I am hosting a REST service which sends appropriate cache-control headers. I use Varnish as a caching server in front of my web server. However, a limitation of Varnish is that it doesn't support caching HTTP POST and HTTP PUT. Is there an alternative caching server that is able to cache these requests? I understand that caching POST is a bit tricky because you cannot just cache based on the URL as a key, as you can for GET; it needs to actually inspect the request body. In the case of multipart/form-data requests, there should probably be a limit on the size of the request body for it to be cached (so that big file uploads, etc. won't be cached). Nevertheless, I really want to be able to cache short HTTP POST requests, or at least the application/x-www-form-urlencoded ones.

    Read the article

  • Calculating RAM Performance? Example: DDR3-2133 CL9-11-10-28 1.65V vs DDR3-1600 CL10-10-10-30 1.5V

    - by user1131467
    How do you calculate the relative performance of PC RAM? For example, what is the relative performance of the following:

        G.Skill Ripjaws Z 8 x 4GB Kit, DDR3-2133, CL9-11-10-28 @ 1.65V
        G.Skill Ripjaws Z 4 x 8GB Kit, DDR3-1600, CL10-10-10-30 @ 1.5V

    If it's relevant, assume they are used in a top-of-the-line ASUS Rampage IV Extreme motherboard with an Intel i7 3960X. By performance, I mean relative:

        read latency
        write latency
        read bandwidth
        write bandwidth

    Please include your working (i.e. how you arrived at the figures from the timings and the DDR3 speed).
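    As a rough back-of-the-envelope comparison, the first-word latency and the theoretical peak bandwidth fall straight out of the transfer rate and the CAS latency. A minimal sketch in Python (peak figures only; it ignores the secondary timings and the memory controller):

        # For DDR3 the memory clock is half the transfer rate, and CAS latency is
        # counted in memory-clock cycles, so first-word latency = CL / (MT/s / 2).
        def first_word_latency_ns(transfer_rate_mts, cas_latency):
            clock_mhz = transfer_rate_mts / 2.0
            return cas_latency / clock_mhz * 1000.0  # cycles / MHz -> ns

        def peak_bandwidth_gbs(transfer_rate_mts, channels=1):
            return transfer_rate_mts * 8 * channels / 1000.0  # 64-bit (8-byte) bus per channel

        for name, rate, cl in [("DDR3-2133 CL9", 2133, 9), ("DDR3-1600 CL10", 1600, 10)]:
            print(name,
                  round(first_word_latency_ns(rate, cl), 2), "ns first word,",
                  round(peak_bandwidth_gbs(rate, channels=4), 1), "GB/s peak (quad channel)")

        # Roughly: DDR3-2133 CL9  ->  8.44 ns and 68.3 GB/s peak
        #          DDR3-1600 CL10 -> 12.50 ns and 51.2 GB/s peak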

    Read the article

  • Air Canada Will No Longer Be My Airline

    - by D'Arcy Lussier
    If the constant labour disputes at Air Canada (the most recent being a week ago, when pilots were locked out and mechanics and bag handlers were poised to strike) weren't enough to make me consider moving all my flights to WestJet, this latest twist definitely will. CBC reported that Aveos, a privately held company that has the contract to provide maintenance for Air Canada, suddenly and without notice shut its doors (read the story here). There's something missing from the stories currently online, though. Months ago, Air Canada gave their Winnipeg-based maintenance staff an ultimatum: stay with Air Canada but be forced to relocate to a different city, or switch from Air Canada to Aveos and stay in Winnipeg. So all of the staff that Air Canada pushed into Aveos just so they could stay in Winnipeg are now out of a job, with huge uncertainty around their future. Labour disputes that rise up continually and hamper personal travel and business, questionable timing of business decisions and the resulting impact on individuals... there's too much drama in that company for me to rely on it for my travel needs. WestJet it is moving forward, until Air Canada gets their act together - which probably means it's WestJet for the foreseeable future. D

    Read the article

  • Leveraging NuGet as a central repository for PowerShell modules

    - by cibrax
    We have been working a lot lately with PowerShell as part of our star product at Tellago Studios, "Moesion". One of the main features we provide in Moesion is the ability to execute PowerShell commands remotely on a given server using a mobile web interface (you can read more in my previous post about Moesion).

    One of the things we realized in all this time is that PowerShell lacks a central repository where IT guys, or we developers, can easily grab and reuse commands. All the commands or modules are basically spread across multiple places or websites - personal blogs, TechNet or CodePlex projects, to name a few - which makes finding them very hard. You are usually limited to using your favorite search engine and copying what you find. In addition, there is no easy way to reuse, extend or version these commands, which also limits any contribution you could make to the community.

    My friend Jose wrote a great post the other day about the importance of reusing PowerShell modules, and about a mechanism for reusing them. Jose, however, based his post on a custom implementation using a Git repository for storing the modules. We have NuGet on the .NET platform for sharing and reusing existing libraries or code, so why can't we just leverage it for reusing PowerShell modules as well? Some teams at Microsoft are already using NuGet for distributing libraries and binaries, so it would be a great thing for all of us if they also distributed their PowerShell scripting interfaces through NuGet. This applies to the .NET open-source community as well. In fact, it looks like Andrew Nurse had the same idea and implemented a project for this in BitBucket, PsGet.
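    Packaging a module this way is easy to prototype: a .psm1 file can be wrapped in an ordinary NuGet package and pushed to any feed. A minimal sketch of a hypothetical .nuspec (module and package names are invented for illustration; running nuget pack against it produces the .nupkg):

        <?xml version="1.0"?>
        <package>
          <metadata>
            <id>TellagoStudios.IisManagement</id>
            <version>1.0.0</version>
            <authors>cibrax</authors>
            <description>Sample PowerShell module packaged for a NuGet feed.</description>
          </metadata>
          <files>
            <!-- Ship the module itself; consumers unpack it and Import-Module the .psm1. -->
            <file src="IisManagement.psm1" target="tools" />
          </files>
        </package>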

    Read the article

  • How do I catalog files on several external hard drives that I want to store off-line? OSX

    - by raudi
    My partner, an artist, has more than 10 external hard disks, both USB and FireWire, and every 2-3 months a new one has to be added (she works with videos and pictures). Currently it's 10TB and growing, so too much for an affordable NAS. Right now the files are not indexed, and I think they cannot be searched with Spotlight because not all drives can be connected at the same time. So if she wants to search for a file, she has to guess which disk or disks hold it (based mostly on the date) and then search several drives.

    Now I'm looking for a solution to index/catalog the drives, something like GentibusCD, Cathy or Disclib (all of these are unfortunately Windows only). Is there any software for OS X that will catalog all the hard drives, so she can search the catalog, find the files, and get the ID or name of the disk that has the content? Preferably something with a GUI so my partner can also use it easily, and preferably with thumbnails for pictures/videos. (But even an equivalent of "tree /F /A" would be better than nothing.)
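    In the spirit of that last sentence, a plain-text catalog per disk already gets you offline searching. A minimal bash sketch (it assumes each drive mounts under /Volumes and is passed by name; no thumbnails, and the disk name is simply the catalog's file name):

        #!/bin/bash
        # Usage: ./catalog.sh "Video_Archive_07"
        # Writes one line per file (path, size, modification date) into a catalog
        # that can be searched with grep or Spotlight while the disk is offline.
        CATALOG_DIR="$HOME/DriveCatalogs"
        mkdir -p "$CATALOG_DIR"

        DISK="$1"
        find "/Volumes/$DISK" -type f \
            -exec stat -f "%N | %z bytes | %Sm" {} \; > "$CATALOG_DIR/$DISK.txt"

        # Search every catalog without plugging anything in:
        #   grep -i "interview_final" ~/DriveCatalogs/*.txt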

    Read the article

  • CodeStock 2012 Review: Eric Landes( @ericlandes ) - Automated Tests in to automated Builds! How to put the right type of automated tests in to the right automated builds.

    Speaker: Eric Landes
    Twitter: @ericlandes
    Blog: http://ericlandes.com/

    This was one of the first sessions I attended during CodeStock 2012. Eric's talk focused mostly on unit testing, and on the idea that the lack of proper unit testing can be compared to stealing from an employer. His point was that if you're not doing proper unit testing, then all of the time wasted on fixing issues that could have been detected with unit tests is like stealing money from the employer. He makes the assumption that the time spent fixing these issues could have been better spent developing new features that drive the business. To a point I can agree with Eric's argument regarding unit testing and stealing from a company's perspective: I can see how he relates resources being shifted from new development to bug fixes as stealing, given that the resources used to fix bugs are taken directly from other projects. He also states that boring/redundant and build/test tasks should be automated, because it reduces the chance of errors and frees up developers to do what they do best: DEVELOP!

    When he refers to testing, he breaks it down into four distinct types:

        Unit Test
        Acceptance Test (this also includes Integration Tests)
        Performance Test
        UI Test

    With this he also recommends that developers should not go buck wild striving for 100% code coverage, because some tests may not provide a great return on investment. In his experience, around 70% test coverage was a very acceptable rate.

    Read the article

  • Word 2010 - Styled paragraph separating into separate numbered lines

    - by chez
    USING WORD 2010. I have a style "Heading 4 Par", which is a numbered style based on Heading 4. My problem is that when I apply "Heading 4 Par" to, say, a 3-line paragraph, it splits the paragraph into separate lines and numbers each one. I always show the formatting characters, and as far as I can see there is only ONE paragraph mark, situated at the end of what is supposed to be a paragraph. For example:

        Original:
        7.4 Text.... text con't..... Text...

        After applying the format:
        7.4 Text...
        7.4 text con't...
        7.4 text con't.

    After I've applied the format it behaves as though each line should be the start of a new paragraph, but there is no paragraph mark to show this. This is driving me crazy! Help!

    Read the article

  • Using JCal Pro and Community Builder together for registration

    - by Kate
    Does anyone know if there is a way to use JCal Pro and Community Builder together in order to have users register for a specific event? When our new website was designed, JCal Pro was implemented with the idea that individuals could look at the calendar and see which events they wanted to sign up for, based on the spots open for that specific day. Now that we have more events, scheduling has become a major issue. For another project we installed CB in order to allow individuals to register and create profiles for an annual event. As I look at this nightmare of trying to organize various groupings of people, I am assuming that there has to be some way to connect the two up, so that those who have a user account through CB could also go in, look at the calendar and sign up for an event. There is a JCal plug-in installed in CB; however, I have not had much luck figuring out its functionality. I am running:

        Joomla 1.5.14
        JCal Pro version 2.2.7.441
        Community Builder 1.2

    Read the article

  • List<T>.AddRange is causing a brief Update/Draw delay

    - by Justin Skiles
    I have a list of entities which implement an ICollidable interface. This interface is used to resolve collisions between entities. My entities are:

        Players
        Enemies
        Projectiles
        Items
        Tiles

    On each game update (about 60 ticks per second), I am clearing the list and adding the current entities based on the game state. I am accomplishing this via:

        collidableEntities.Clear();
        collidableEntities.AddRange(players);
        collidableEntities.AddRange(enemies);
        collidableEntities.AddRange(projectiles);
        collidableEntities.AddRange(items);
        collidableEntities.AddRange(camera.VisibleTiles);

    Everything works fine until I add the visible tiles to the list. The first ~1-2 seconds of running the game loop causes a visible hiccup that delays drawing (so I can see a jitter in the rendering). I can literally remove or add the line that adds the tiles and see the jitter occur and not occur, so I have narrowed it down to that line. My question is: why? The list of VisibleTiles is about 450-500 tiles, so it's really not that much data. Each tile contains a Texture2D (image) and a Vector2 (position) to determine what is rendered and where. I'm going to keep looking, but off the top of my head I can't understand why only the first 1-2 seconds hiccup and it is then smooth from there on out. Any advice is appreciated.
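    One plausible explanation (an assumption, not something visible in the snippet) is that the list keeps growing its backing array during those first seconds, and the discarded arrays trigger garbage collections; Clear() keeps the capacity, so once the array has reached its final size the stutter disappears. A C# sketch that pays the allocation cost once up front, with invented type and field names:

        using System.Collections.Generic;

        public interface ICollidable { }

        public class CollisionWorld
        {
            private readonly List<ICollidable> collidableEntities;
            public readonly List<ICollidable> Players = new List<ICollidable>();
            public readonly List<ICollidable> Enemies = new List<ICollidable>();
            public readonly List<ICollidable> VisibleTiles = new List<ICollidable>();

            public CollisionWorld(int worstCaseEntityCount)
            {
                // Reserve the worst-case capacity once, so the per-frame
                // Clear/AddRange pattern never reallocates the backing array.
                collidableEntities = new List<ICollidable>(worstCaseEntityCount);
            }

            public void RebuildCollidables()
            {
                collidableEntities.Clear();                // keeps the capacity
                collidableEntities.AddRange(Players);
                collidableEntities.AddRange(Enemies);
                collidableEntities.AddRange(VisibleTiles); // ~500 tiles fit without growth
            }
        }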

    Read the article

  • Question regarding Readability vs Processing Time

    - by Jordy
    I am creating a flowchart for a program with multiple sequential steps. Every step should be performed only if the previous step succeeded. I use a C-based programming language, so the layout would be something like this:

        METHOD 1:
        if(step_one_succeeded())
        {
            if(step_two_succeeded())
            {
                if(step_three_succeeded())
                {
                    //etc. etc.
                }
            }
        }

    If my program had 15+ steps, the resulting code would be terribly unfriendly to read. So I changed my design and implemented a global error code that I keep passing by reference, to make everything more readable. The resulting code would be something like this:

        METHOD 2:
        int _no_error = 0;
        step_one(_no_error);
        if(_no_error == 0)
            step_two(_no_error);
        if(_no_error == 0)
            step_three(_no_error);
        if(_no_error == 0)
            step_four(_no_error);

    The cyclomatic complexity stays the same. Now let's say there are N steps, and let's assume that checking a condition costs 1 clock and performing a step takes no time. Method 1 performs anywhere between 1 and N checks, while Method 2 always performs N-1 checks, so Method 1 will be faster most of the time. Which brings me to my question: is it bad practice to sacrifice time in order to make the code more readable? And why (not)?
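    A third layout worth weighing (a sketch with stand-in step functions, not the code above) keeps the checks flat and still skips the remaining steps as soon as one fails, because && short-circuits:

        #include <stdbool.h>
        #include <stdio.h>

        /* Each step reports success; && stops evaluating at the first failure,
         * so there is no nesting and no error flag to re-test before each call. */
        static bool step_one(void)   { puts("step one");   return true; }
        static bool step_two(void)   { puts("step two");   return true; }
        static bool step_three(void) { puts("step three"); return false; }
        static bool step_four(void)  { puts("step four");  return true; }

        int main(void)
        {
            bool ok = step_one()
                   && step_two()
                   && step_three()   /* fails here: step_four() is never called */
                   && step_four();
            return ok ? 0 : 1;
        }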

    Read the article

  • Apache Virtual Host Configuration

    - by Carl
    I have been searching the internet for an hour now, and I was hoping for a quick hint here so that I can solve my problem a wee bit faster. My virtual server is so far only accessible through an IP address - no DNS entry yet, and so far none needed either. The problem I have is with Apache2: the virtual hosts are puzzling me. What I need is:

        access to my project (based on Symfony2) from the outside, via the IP address
        access to my project from localhost

    What I have got: access from the outside renders the sites in /var/www/vhosts/htdocs/default, while access from localhost renders the sites in /var/www. Why the difference? What is a recommended configuration for my use case?
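    For comparison, a minimal Apache 2.2-style sketch (the project path and ServerName are assumptions): one catch-all virtual host that serves the Symfony2 web/ directory for every hostname and for the bare IP, so outside and localhost requests end up in the same document root.

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName myproject.local
            ServerAlias *
            DocumentRoot /var/www/myproject/web

            <Directory /var/www/myproject/web>
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>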

    Read the article

  • Multiple URL's going to same page - Kosher for Google?

    - by Ashoka15
    I hear conflicting answers from people about this; I'm a developer by trade, and my SEO knowledge is not what it should be. Here's my situation: I run a website that lists hotels, restaurants, bars, shops, etc. for a small Asian beach town. Lots of establishments here are hotels with a restaurant and bar, as well as restaurants that are also bars - for example, a Mexican restaurant that also functions as a full cocktail bar.

    I first set it up so that each establishment has one page but can create additional pages for its other areas of business. This forces people to create TWO listings under the same name, and most just add exactly the same information to each page, making things redundant. I am re-arranging the database so that an establishment has only ONE listing (one unique page referenced by the unique code '12345ABCDEF') that is reachable from browsing under both "Restaurants" and "Bars", with the URL structures:

        site.com/dining/mexican/12345ABCDEF/business-name.html
        site.com/bars/cocktail_bars/12345ABCDEF/business-name.html

    I could easily simplify the URL to just the unique code and name:

        site.com/12345ABCDEF/business-name.html

    However, I found that Google has parsed my URL structure and shows breadcrumbs like this on its SERP: Home > Dining > Mexican, with each crumb pointing to the default page for the homepage, restaurants and Mexican restaurants. If I simplify the URL structure, will I lose these associations? Could Google also be picking up this structure from my breadcrumb trail at the top of the page? What is the best way to set up URLs on these pages so I am not penalized by Google for having identical information on two URLs, while still being able to have places show up as they did with the old system?
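    When the same listing stays reachable under two paths, the usual safeguard against a duplicate-content penalty is a canonical link served on both URLs, pointing at whichever one you prefer. A sketch reusing the example URLs above:

        <!-- Served in the <head> of BOTH
             /dining/mexican/12345ABCDEF/business-name.html and
             /bars/cocktail_bars/12345ABCDEF/business-name.html -->
        <link rel="canonical"
              href="http://site.com/dining/mexican/12345ABCDEF/business-name.html">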

    Read the article

  • Nginx proxy to Apache - resolve HTTP ORIGIN

    - by Fratyr
    I have a server set up with nginx serving static content and proxying all PHP/dynamic requests to Apache on 127.0.0.1. I'm building an API for my databases, and I need to allow clients by their origin (domain name), rather than just by IP, based on CORS rules. So when I send an HTTP header such as header("Access-Control-Allow-Origin: www.client-requesting.myapi.com"); from my API server, I have to tell it which origin I allow; otherwise client-side requests to my API won't work, due to the same-origin policy. The question is: how can I know which domain name (if any) called my API, and what should the nginx and Apache configuration be to pass the origin parameter? I tried to google this, and all I found was a possible solution with mod_rpaf, but I wanted to be sure. Thanks!
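    For what it's worth, a browser making a cross-origin request sends the calling site's domain in the Origin request header, and nginx's proxy_pass forwards client request headers to the backend by default, so on the PHP side it should already be readable. A sketch of checking it against an allow-list (the listed origins are invented examples):

        <?php
        // Echo the Origin back only if it is on the allow-list; browsers then
        // enforce the same-origin policy for everyone else.
        $allowed = array(
            'http://www.client-requesting.myapi.com',
            'http://other-client.example.com',
        );
        $origin = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';

        if (in_array($origin, $allowed, true)) {
            header('Access-Control-Allow-Origin: ' . $origin);
        }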

    Read the article

  • My Only Gripe With Programming

    - by David Espejo
    Is that I'm having trouble practicing problems. Even if I decide to practice the problems from my C++ book, they don't give any idea of what the solution (program) should look like, so that I could compare and see whether my program is similar in any way. My book gives me too many generic "Write a program to do 'this'" projects without really showing a concrete example of what "this" really is. In other words, how do I know that I actually did "that"? One problem in my book said to write a program that calculates the sales tax on a given item. First of all, sales tax differs by state (what's the state?), and what's the item (a house, a dog)? How can I check this to see if I'm right? Programming books don't have answer keys! I know that there is no ABSOLUTE answer - that's just silly, programs can be written in many ways - but a sample of what one would look like, matched to the difficulty of the problem, would really help. Is there a solution to this, maybe a book that has worked-out examples for the problems it gives, or online sources that do something similar? (Is there such a thing as a programming book with an answer key?)
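    For the sales-tax exercise specifically, here is one possible shape of a solution (only a sketch: it treats the tax rate as an input, which sidesteps the "which state?" problem entirely):

        #include <iostream>

        // One of many reasonable answers to "calculate the sales tax on an item":
        // read the price and the tax rate, print the tax and the total.
        int main()
        {
            double price = 0.0, taxRate = 0.0;

            std::cout << "Item price: ";
            std::cin >> price;
            std::cout << "Sales tax rate in percent (e.g. 7.5): ";
            std::cin >> taxRate;

            double tax = price * taxRate / 100.0;
            std::cout << "Sales tax: " << tax << "\n"
                      << "Total:     " << price + tax << "\n";
            return 0;
        }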

    Read the article

  • Apple file sharing: bind to a specific interface

    - by Cesar
    My customer has a small office with just a wifi router. He uses this router for internet connectivity and for file transfers between the desktops. Recently the file transfer activity between the desktops (all OS X based) has increased a lot, so he bought a switch (not connected to the router, which is too far away) to transfer the files over cable instead of over wifi. Problem: how do I bind the file sharing service to the Ethernet interface only and exclude the wifi interface? (Currently the service binds to the wifi automatically, and there are no options for interface binding.)

    Read the article

  • mounting vsphere 4.0 file system in ubuntu linux

    - by sravan
    Hi all, I am using vSphere 4.0 for my project work. I needed 4-5 servers for my project, which is database-based, and I felt virtualisation was a good way to get the 5 servers running at the same time. It was running fine until a few days ago. Yesterday it suddenly crashed and I had no idea of the reason; today it did not even boot up. Now I need to take a backup of the data from that system. In order to do so, I took the hard drive out of the machine and tried to mount it on a local Linux machine, but I was not successful - the disk was not even recognized by the Linux machine. Can someone please tell me how to mount it and get the required data? Thank you all.
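    If the drive holds VMFS datastores (which Linux does not recognise natively), one avenue worth trying - assuming an Ubuntu/Debian box, and with /dev/sdb1 only as an example device name - is the vmfs-tools package and its FUSE driver:

        # Read-only access to a VMFS datastore from Linux.
        sudo apt-get install vmfs-tools

        sudo fdisk -l                       # find the actual VMFS partition first
        sudo mkdir -p /mnt/vmfs
        sudo vmfs-fuse /dev/sdb1 /mnt/vmfs  # mount the datastore read-only

        ls /mnt/vmfs                        # VM folders with the .vmdk files to copy off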

    Read the article

  • 3D physics engine for accurate collision handling on desktop/laptop computers (non-console)

    - by Georges Oates Larsen
    What are your suggestions for a physics engine that satisfies the following criteria?

        Capable of calculating collisions between multiple concave mesh-based colliders.
        Handles many collisions going on at once (for instance, one mesh being wedged between two others, which themselves may be wedged between two other meshes).
        Does not allow colliders to pass through each other, even at high speeds. For instance, if I apply force to a programmatically hinged object that makes it spin, I do not want it to pass through another rigid body that it collides with while spinning. I have this problem using PhysX.
        As implied before, reacts well to hinged objects; preferably it has its own implementation of a hinge, but I am willing to program my own. The important part is that it has some sort of interface that guarantees accurate collision tracking even when dealing with these things.
        Platform independent: runs on Mac as well as PC, and is not tied to specific graphics cards.

    I think that's the best way to explain what I am looking for. Basically, I need SUPER reliable collisions - something that can't be accomplished with a simple ray-casting approach that sends a ray from the last position of the object to the current position (as the object may be large and colliding with small objects through rotation). Bonus points for an OPEN SOURCE engine.
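    One open-source candidate that at least speaks to the pass-through requirement is Bullet, which offers per-body continuous collision detection. A small C++ sketch (the threshold and radius values are illustrative, not tuned):

        #include <btBulletDynamicsCommon.h>

        // Enable Bullet's continuous collision detection (CCD) on a fast-moving
        // rigid body so it is swept between simulation steps instead of
        // tunnelling through thin colliders.
        void enableCcd(btRigidBody* body)
        {
            body->setCcdMotionThreshold(0.01f);    // engage CCD above ~1 cm per step
            body->setCcdSweptSphereRadius(0.05f);  // radius of the swept-sphere test
        }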

    Read the article

  • Installing latest version of R-base

    - by Student
    I have been unsuccessfully trying to install the latest version (2.15.2) of r-base. Apparently the R package "Rcpp" will not install under R version 2.14.1, which is the version that installs for me. I am not sure what, how or where to change things in the installation attempts that appear below. Please note that I am using ubuntu-12.04.1-server-i386.

    (1) The currently installed version is R version 2.14.1 (2011-12-22):

        sudo apt-get install r-base
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        r-base is already the newest version.

    (2) Including the version information doesn't help:

        sudo apt-get install r-base=2.15.1-5ubuntu1
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Version '2.15.1-5ubuntu1' for 'r-base' was not found

    (3) Changes based on the CRAN Ubuntu instructions (http://cran.r-project.org/bin/linux/ubuntu/README):

        3.1: Added to /etc/apt/sources.list:
             deb http://lib.stat.cmu.edu/R/CRAN/bin/linux/ubuntu quantal/
        3.2: sudo apt-get update
        3.3: sudo apt-get install r-base
             Reading package lists... Done
             Building dependency tree
             Reading state information... Done
             Some packages could not be installed. This may mean that you have requested
             an impossible situation or if you are using the unstable distribution that
             some required packages have not yet been created or been moved out of Incoming.
             The following information may help to resolve the situation:
             The following packages have unmet dependencies:
              r-base : Depends: r-base-core (= 2.15.2-1quantal2) but it is not going to be installed
                       Depends: r-recommended (= 2.15.2-1quantal2) but it is not going to be installed
                       Recommends: r-base-html but it is not going to be installed
             E: Unable to correct problems, you have held broken packages.
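    One likely culprit, judging only from the text above: the machine runs 12.04 ("precise"), yet the CRAN line added in step 3.1 points at "quantal", so apt offers packages built against the wrong release. A sketch of the matching setup (the key ID is the CRAN Ubuntu archive key listed in that README; verify it there):

        # Point the CRAN repository at the release that is actually installed
        # (precise for Ubuntu 12.04), trust the archive key, then install.
        echo "deb http://lib.stat.cmu.edu/R/CRAN/bin/linux/ubuntu precise/" | \
            sudo tee -a /etc/apt/sources.list

        sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E084DAB9
        sudo apt-get update
        sudo apt-get install r-base r-base-dev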

    Read the article

  • Why doesn't my PHP install see the MySQL extension?

    - by Evan Padd
    So I just set up PHP (version 5.2.17), MySQL (5.5.28), and Apache (2.2) on my Windows 7 computer. I want to test a MySQL connection, but the mysql extension is apparently not loaded (based on phpinfo()). Here's what I did:

        Changed the extension dir in php.ini (extension_dir = ".;c:\php\ext") and uncommented the extension=php_mysql.dll line
        Copied libmysql.dll to Apache's bin directory
        Added C:\php to the system's PATH
        Restarted the server, then the computer

    And it's still not working. What did I miss?

    EDIT: I'm looking through phpinfo()'s output and it says "Server API | Apache 2.0 Handler", but I'm running 2.2. Is that a problem?
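    A quick, generic way to narrow this down is to confirm which php.ini Apache is actually reading and whether the extension loads at all; the edited file only matters if it is the one reported as loaded. A sketch to drop in the web root and open through Apache:

        <?php
        // check.php - compare this output against the file you edited.
        echo 'Loaded php.ini: ' . php_ini_loaded_file() . "\n";
        echo 'extension_dir:  ' . ini_get('extension_dir') . "\n";
        echo 'mysql loaded:   ' . (extension_loaded('mysql') ? 'yes' : 'no') . "\n";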

    Read the article

  • How to limit access to Exchange 2003 Mobile Activesync server by user?

    - by micilin
    So I was asked to set up an Exchange ActiveSync mobile gateway. That's done: it's a separate Exchange 2003 front-end server configured for SSL, and I've put an off-domain ISA server in front of it. Now I'm being asked to limit which users can connect to it. By default, an Exchange front-end server allows any user who has a mail account to connect. So I'm looking at the permissions on the various IIS sites/apps on the server, but I know that it's easy to break Exchange front-end server permissions. I've got the following in IIS:

        Exadmin
        Exchange
        ExchWeb
        Microsoft-Server-ActiveSync
        MobileAdmin
        OMA

    plus a couple of others that I don't think are relevant. Can I change the permissions on one of these to restrict who can connect to ActiveSync? As a bonus: can I do it in a way that does not affect ordinary browser-based Exchange access? Thanks in advance!

    Read the article

  • Architecture of a "website generator" web application

    - by Resorath
    What is the most maintainable and efficient way to architect a web application whose purpose is to host and generate websites that can be customized to a certain degree? There are a lot of applications of this style in the wild, generating all kinds of sites, from World of Warcraft guild hosts like guildlaunch to sites like mywedding for wedding site hosting.

    My question is: what basic architecture do these sites operate on? I imagine there are two ways of thinking about it. Either a central set of code that all sites on the host run against, which behaves differently based on which site was visited - so when the base code is updated, all sites are updated simultaneously. Or the code for an individual site lives in its own silo and is simply replicated to a new directory each time a site is created; when an update needs to be applied, the code is pushed out to each site silo.

    In my case I am working in PHP with the CodeIgniter framework, however the answer need not be limited to this case. Which method (if any) produces a more maintainable and efficient architecture for this style of web application?
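    As a point of reference for the first approach, here is a minimal multi-tenant sketch in plain PHP (table, column and hostname values are invented; in CodeIgniter the same lookup would typically sit in a hook or a base controller):

        <?php
        // One shared codebase: the requesting hostname selects a per-site
        // configuration row, and every hosted site points its DNS at this app.
        $host = strtolower($_SERVER['HTTP_HOST']);   // e.g. bobs-wedding.example.com

        $pdo  = new PDO('mysql:host=localhost;dbname=generator', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT id, theme, title FROM sites WHERE hostname = ?');
        $stmt->execute(array($host));
        $site = $stmt->fetch(PDO::FETCH_ASSOC);

        if ($site === false) {
            header('HTTP/1.0 404 Not Found');
            exit('Unknown site');
        }

        // From here on, templates and content queries filter by $site['id'], so a
        // single deployment serves every site and one upgrade updates them all.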

    Read the article

  • Internet Explorer 9 is coming Monday to a web near you

    - by brian_ritchie
    Internet Explorer 9 is finally here... well, almost. Microsoft is releasing their new browser on March 14, 2011. IE9 has a number of improvements, including:

        Faster, Faster, Faster. Did I mention it is faster? With the new browsers coming out from Mozilla, Google, and Microsoft, there has been a flood of speed-test coverage. Chrome has long held the javascript speed crown, but according to Steven J. Vaughan-Nichols over at ZDNET, "for the moment at least IE9 is actually the fastest browser I've tested to date." He came to this conclusion after figuring out that the 32-bit version of IE9 has the new Chakra JIT (the 64-bit version doesn't). It also has a DirectX-based rendering engine, so it can do cool tricks once reserved for desktop applications.
        Windows 7 Desktop Integration. Read my post for more details. Unfortunately, they didn't integrate my ideas... at least not yet :)
        Hot new UI. Ok, they "borrowed" some ideas from Chrome... but that is the best form of flattery.
        Standards Compliance. A real focus on HTML5 and CSS3. Definite goodness for developers.

    So, go get yourself some IE9 on Monday and enjoy!

    Read the article

  • How to schedule time-of-day upgrades

    - by Richard
    Hello, I'm responsible for about 30 Ubuntu computers at a private K-8 school. We have only a 3Mbps internet connection serving the entire campus, and I would like to ensure that updates are done in the middle of the night, so that daytime tasks are not slowed down. I'm using Ubuntu 10.04 and have set all computers to download and install security updates via the update manager. I have also installed cron-apt and modified the config file to stagger the start times of the upgrades from about 10pm to 4am local time.

    HOWEVER: this morning I arrived at the school at 7:30am and all the computers were busy downloading a large security update. Needless to say, all internet activity was slowed to a crawl for the next 2 hours, and the computer users were very, very upset. This was exactly the event I'm trying so hard to prevent. It seems that my scheme to ensure middle-of-the-night downloads failed, and I'm not sure why. I've also tried some schemes using unattended-upgrades and crontab, but there always seemed to be something scheduling upgrades in addition to the ones I try to force in the middle of the night.

    Is there a surefire way to absolutely, positively guarantee that updates will occur only at one specific time? It would be nice if the update manager just had a drop-down menu to specify a designated time. Thanks in advance for any help you can give me.
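    One approach worth sketching (paths and keys follow stock Ubuntu conventions, but treat the exact values as something to verify on a 10.04 box): turn off every automatic trigger - APT's periodic scripts, unattended-upgrades and cron-apt - and let a single root cron entry own the whole upgrade at a fixed hour.

        # 1) Tell APT's periodic scripts to do nothing on their own.
        sudo tee /etc/apt/apt.conf.d/10periodic >/dev/null <<'EOF'
        APT::Periodic::Update-Package-Lists "0";
        APT::Periodic::Download-Upgradeable-Packages "0";
        APT::Periodic::Unattended-Upgrade "0";
        EOF

        # 2) One root cron entry does all the work at 02:30 every night.
        ( sudo crontab -l 2>/dev/null; \
          echo '30 2 * * * apt-get update && apt-get -y upgrade >> /var/log/nightly-upgrade.log 2>&1' ) \
          | sudo crontab -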

    Read the article
