Search Results

Search found 29133 results on 1166 pages for 'week number'.

Page 511/1166

  • What to do with my "unmounted drive"?

    - by Taylor Guistwite
    I just recently followed the tutorial at http://www.ubuntu.com/download/ubuntu/download for installing Ubuntu Server onto my 1TB Seagate external drive. I was planning to use this to install it on my MacBook, and the instructions state to perform this line of code: Run diskutil unmountDisk /dev/diskN (replace N with the disk number from the last command; in the previous example, N would be 2). Now my HD prompts "The disk you inserted was not readable by this computer". Would I just run diskutil mountDisk /dev/diskN in order to be able to access all my files again? Here is a screenshot of the instructions I followed: http://i17.photobucket.com/albums/b97/hello_screamo/Screenshot2011-11-11at113914AM.png

    Read the article

  • Synergy - easy share of keyboard and mouse between multiple computers

    Did you ever have the urge to share one set of keyboard and mouse between multiple machines? If so, please read on...

    Using multiple machines
    Honestly, as a software craftsman it is my daily business to run multiple machines - either physical or virtual - to be able to solve my customers' requirements. Recent hardware makes this very easy. For laptops it's a no-brainer to attach a second or even a third screen in order to extend your native display. This works quite well, and in my case I used to attach two additional screens - one via HD15 connector, the other via HDMI. But... as it's a laptop and therefore a mobile unit, there are slight restrictions. Detaching and re-attaching all cables when changing locations is one of them, and so are hardware limitations. After all, it's a laptop and not a workstation. I guess that anyone working in IT (or ICT) has more than one machine at their workplace or home office, and I at least find it quite annoying to have multiple sets of keyboard and mouse conquering the remaining space on my desk. Despite the ugly looks of all those cables and the general 'chaos of distraction', I prefer a cleaner solution and working environment. This allows me to actually focus on my work and tasks rather than worry about choosing the right combination of keyboard and mouse. My current workplace is a patchwork of various pieces of hardware (approx. 2-3 years old):
    - DIY desktop on Ubuntu 12.04 64-bit, Core2 Duo (E7400, 2.8GHz), 4GB RAM, 2x 250GB HDD, nVidia GPU 512MB
    - Dell Inspiron 1525 on Windows 8 64-bit, 4GB RAM, 200GB HDD
    - HP Compaq 6720s on Windows Vista 32-bit, Core2 Duo (T5670, 1.8GHz), 2GB RAM, 160GB HDD
    - Mac mini on Mac OS X 10.7, Core i5 (2.3 GHz), 2GB RAM, 500GB HDD
    I know... Not the latest and greatest, but a decent combination to work with. New systems are already on the shopping list, but I live in the 'wrong' country to buy computer hardware, so the next trip abroad will provide me with some new stuff.

    Using multiple operating systems
    The list of hardware above already names different operating systems, and actually I have only one preference: Linux. But my job as a software craftsman for Visual FoxPro and .NET development requires other OSes, too. Not a big deal, it's just like this. In addition to those physical machines, there is a bunch of virtual machines around, most of them running either Windows XP or Windows 7. For years I have kept the practice that development for each customer is isolated into its own virtual machine and environment. This keeps it clean and version-safe. But as you can easily imagine, with that setup there are a couple of constraints concerning keyboard and mouse. Usually, those systems require their own pieces of hardware attached. As stated, I don't like clutter on my desk's surface, so a cross-platform solution has to come in here. In the past I tried various applications, hardware and network protocols like X11, RDP, NX, TeamViewer, RAdmin, a KVM switch, etc., but the problem is that they either let you remotely connect to the other system or exclusively 'bind' your peripherals to the active system. Not optimal after all.

    Synergy to the rescue
    Quote from their website: "Synergy lets you easily share your mouse and keyboard between multiple computers on your desk, and it's Free and Open Source. Just move your mouse off the edge of one computer's screen on to another. You can even share all of your clipboards. All you need is a network connection. Synergy is cross-platform (works on Windows, Mac OS X and Linux)." Yep, that's it! All I need for my setup here... Actually, I couldn't believe it myself that I didn't stumble over Synergy earlier, but 'get over it' and there we go. And not only is it Open Source, it's also free. Donations for the developers are very welcome, and recently they introduced Synergy Premium - a possibility to buy so-called premium votes that can be used to put more weight / importance on specific issues or bugs that you would like the developers to look into.

    Installation and configuration
    Simply download the installation packages for your systems of choice, run the installer and enter some minor information about your network setup. I chose my desktop machine for the role of the Synergy server and configured my screen layout there. The screen setup currently allows you to connect up to 15 machines; the number of screens can be higher, as those machines might have multiple screens physically attached. Synergy takes this into account in the overall calculation and simply works as expected. For fun I tried it with a second monitor connected to each laptop, for a total of 6 active screens. No flaws at all - stunning! All the other machines are configured as clients. Side note: the screenshot was taken on Windows 8 and pasted via clipboard into Gimp running on Ubuntu.

    Resume
    Synergy is now definitely in my box of tools for my daily work, and amongst the first pieces of software I install after the operating system. It just simplifies my life and cleans my desk. Never again without Synergy! Now I'm only waiting for an Android version to integrate my Galaxy Tab 10.1, too. ;-) Please check out that superb product and enjoy sharing one keyboard, one mouse and one clipboard between your various machines and operating systems.

    Read the article

  • Finding Near-Earth Asteroids

    - by TATWORTH
    One of the puzzling aspects of hunting for Near-Earth Asteroids is that more has been spent on Hollywood films about potential disasters should one hit the Earth than on finding them in the first place. While there are a number of on-going asteroid search programs, these are all Earth-based at the moment. Their limitations are:
    - Each telescope can only observe for a maximum average of 12 hours per day.
    - As far as I am aware, all these programs work in visible light only. (Once an asteroid is found, radar tracking is possible when it is close.)
    - Being Earth-based, they cannot see inside the Earth's orbit.
    - The asteroids, being generally dark, do not show up well in visible light.
    A private group is proposing a radical alternative: orbiting an infra-red telescope in the orbit of Venus. In infra-red, the asteroids are more readily seen. Here are some details: Source: SPACE.com - All about our solar system, outer space and exploration

    Read the article

  • Consciousness and unconsciousness from an AI/Robotics POV

    - by Tim Huffam
    Just pondering the workings of the human mind from an AI/robotics point of view (either of which I know little about). If consciousness is when you're thinking about something (processing it in realtime), and unconsciousness is when you're not thinking about it (e.g. it's autonomous behaviour), would it be fair to say then that consciousness is software and unconsciousness is hardware? Considering that human learning is attributed to the number of neural connections made - and repetition is the key: the more the connections, the better one understands the subject, until it becomes a 'known' - could this be likened to forming hard connections? E.g. maybe learning would progress from an MCU to FPGAs, thereby offloading realtime processing to the hardware (an FPGA or some such device)?

    Read the article

  • The JavaFX Community Site on Java.net

    - by Tori Wieldt
    Community activity surrounding JavaFX has been steadily growing, with tweets, blog posts, and projects increasing in number. We are pleased to announce that there is now a JavaFX community site on Java.net at the following URL: javafxcommunity.com  This site is an aggregator of JavaFX information, where you can find links to JavaFX blog posts, tweets, and other resources.  Gerrit Grunwald and Jim Weaver are the community leaders for this site, and they welcome your feedback on how to make the JavaFX Community site more useful to you! Learn more on Jim Weaver’s Rich-Client Java Blog. 

    Read the article

  • Making a collision detection system

    - by Sri Harsha Chilakapati
    I'm very new to game development (just started 3 months ago) and I've been learning by creating a game engine. It's located here. In terms of collision, I only know brute-force detection, in which case the game slows down when there are a large number of objects. So my question is: how should I program the collisions? I want them to happen automatically for every object and call the object's collision(GObject other) method on each collision. Are there any algorithms which can make this fast? If so, can anybody shed some light on this topic? I'm thinking of making it work like Game Maker. Thanks
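
    Not code from the question or its engine - just a minimal sketch, in Python, of a uniform-grid (spatial hash) broad phase, which is the usual way to avoid testing every pair. The x/y/width/height attributes, the collision(other) callback and the cell size are assumptions modelled on the question's GObject idea.

        from collections import defaultdict

        CELL = 64  # assumed cell size; pick something close to a typical object's size

        def overlaps(a, b):
            """Plain AABB test used as the narrow phase."""
            return (a.x < b.x + b.width and b.x < a.x + a.width and
                    a.y < b.y + b.height and b.y < a.y + a.height)

        def check_collisions(objects):
            """Bucket objects into grid cells, then only test pairs sharing a cell."""
            grid = defaultdict(list)
            for obj in objects:
                for cx in range(int(obj.x) // CELL, int(obj.x + obj.width) // CELL + 1):
                    for cy in range(int(obj.y) // CELL, int(obj.y + obj.height) // CELL + 1):
                        grid[(cx, cy)].append(obj)

            tested = set()
            for cell_objects in grid.values():
                for i in range(len(cell_objects)):
                    for j in range(i + 1, len(cell_objects)):
                        a, b = cell_objects[i], cell_objects[j]
                        pair = (id(a), id(b))
                        if pair in tested:
                            continue          # the same pair can share several cells
                        tested.add(pair)
                        if overlaps(a, b):
                            a.collision(b)    # mirrors the collision(GObject other) idea
                            b.collision(a)

    With the objects spread over the world this does roughly linear bucketing plus a few local pair tests per cell, instead of testing all O(n^2) pairs as brute force does.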

    Read the article

  • Services or Shared Libraries?

    - by Royal
    I work in an environment where we have several different web applications, each of which has different features but still needs to do similar things: authentication, reading from common data sources, storing common data, etc. Is it better to build the shared functionality into a set of services, to be called by the web apps, or is it better to make a shared library which the web apps include? The services or libraries would need to access various databases, and it seems like keeping that access in a single place (a service) is a good idea. It would also reduce the number of database connections needed. A service would also keep the logic in a single place, but then it could be argued that a shared library can do the same thing. Are there other benefits to be gained from using services over shared libraries?
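
    Not from the question - a small Python sketch of one way to keep the decision reversible: let the web apps depend on a shared contract and back it with either an in-process library or a thin client for a central service. All names here (AuthProvider, verify_hash, the /authenticate endpoint and its response shape) are hypothetical.

        import json
        import urllib.request
        from abc import ABC, abstractmethod

        class AuthProvider(ABC):
            """Hypothetical shared contract the web apps code against."""
            @abstractmethod
            def authenticate(self, username: str, password: str) -> bool: ...

        class LibraryAuthProvider(AuthProvider):
            """Shared-library flavour: runs in-process against the common database."""
            def __init__(self, db):
                self.db = db  # e.g. a sqlite3-style connection whose execute() returns a cursor

            def authenticate(self, username, password):
                row = self.db.execute(
                    "SELECT password_hash FROM users WHERE username = ?", (username,)
                ).fetchone()
                return row is not None and verify_hash(password, row[0])  # verify_hash: hypothetical helper

        class ServiceAuthProvider(AuthProvider):
            """Service flavour: delegates to a central auth service over HTTP."""
            def __init__(self, base_url):
                self.base_url = base_url

            def authenticate(self, username, password):
                req = urllib.request.Request(
                    self.base_url + "/authenticate",
                    data=json.dumps({"username": username, "password": password}).encode(),
                    headers={"Content-Type": "application/json"},
                )
                with urllib.request.urlopen(req) as resp:
                    return json.load(resp).get("ok", False)

    Because both flavours satisfy the same contract, the apps don't care which one is wired in, which keeps the services-versus-library choice open while still centralising the logic.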

    Read the article

  • How much code should I be responsible for?

    - by Mick
    Through colleagues and exit interviews, I have heard that at my small company I am "responsible" for anywhere from 3-10 times more code than I would be at another job. I'm trying to look for some sort of fuzzy metric that I can use to compare my workload with others in my field. By "code responsibility", I don't mean "I'm the only one who knows area X of the code base" (though sadly, it's often true in a startup environment), but rather am referring to a number like "code_base_size/number_of_developers". Are there any resources I can use to help me more accurately measure my work load than just counting lines of code?
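
    Purely to illustrate the fuzzy metric mentioned above, here is the arithmetic with made-up numbers; real comparisons would also need to account for code churn, complexity and how the code is shared.

        # Made-up numbers, only to illustrate code_base_size / number_of_developers.
        code_base_size = 250_000                 # total lines of code in the product
        number_of_developers = 4
        lines_per_developer = code_base_size / number_of_developers
        print(f"{lines_per_developer:,.0f} lines 'owned' per developer")  # 62,500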

    Read the article

  • OpenGL Drawing textured model (OBJ) black texture

    - by andrepcg
    I'm using OpenGL, GLEW, GLFW and GLUT to create a simple game. I've been following some tutorials and I now have a good model importer with textures (from ogldev.atspace.co.uk), but I'm having an issue with the model textures. I have a skybox with a beautiful texture, as you can see in the picture. That weird texture behind the helicopter (the model) is the heli texture, which I've applied on purpose to that wall to demonstrate that this specific texture is working - just not on the helicopter. I'll include the files I'm working on so you can check them out. Mesh.cpp - http://pastebin.com/pxDuKyQa Texture.cpp - http://pastebin.com/AByWjwL6 Render function + skybox - http://pastebin.com/Vivc9qnT I'm just calling mesh->Render(); before the drawSkyBox function, in the render loop. Why is the heli black when I can perfectly apply its texture to another quad? I've debugged the code and the mesh->Render() call is correctly fetching the texture number and passing it to the texture->Bind() function.
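
    The excerpt has no answer attached; one common cause of an all-black model (not confirmed by the article) is that the texture unit and sampler uniform are never wired up for that draw call, or the mesh's UV coordinates aren't uploaded. A rough PyOpenGL-style sketch of the usual per-draw wiring; the uniform name "gSampler" and the surrounding variables are assumptions.

        from OpenGL.GL import (GL_TEXTURE0, GL_TEXTURE_2D, glActiveTexture,
                               glBindTexture, glGetUniformLocation, glUniform1i,
                               glUseProgram)

        def draw_textured_mesh(program, texture_id, render_mesh):
            """Typical texture wiring before a draw; skipping a step often yields a black model."""
            glUseProgram(program)
            glActiveTexture(GL_TEXTURE0)                         # select texture unit 0
            glBindTexture(GL_TEXTURE_2D, texture_id)             # bind the mesh's texture to it
            sampler = glGetUniformLocation(program, "gSampler")  # assumed sampler uniform name
            glUniform1i(sampler, 0)                              # point the sampler at unit 0
            render_mesh()                                        # issue the actual draw call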

    Read the article

  • Google Analytics Export API - nextPagePath data

    - by Btibert3
    I am probably missing something obvious, but I do not understand this: when I query start.date = DATE_START, end.date = DATE_END, dimensions = c("ga:pagePath","ga:previousPagePath"), metrics = c("ga:pageviews"), filters = mypageofinterest, table.id = "ga:mytable", max.results=RESULTS my data return as expected - all of the previous pages, including (entrance). However, when I modify the code to use nextPagePath, start.date = DATE_START, end.date = DATE_END, dimensions = c("ga:pagePath","ga:nextPagePath"), metrics = c("ga:pageviews"), filters = mypageofinterest, table.id = "ga:mytable", max.results=RESULTS only one line of data is returned; the pagePath and nextPagePath are identical. I replicated this result using the Query Explorer. What am I missing or doing wrong? I was expecting to see a large number of "next" pages, including (exit). Thanks in advance.

    Read the article

  • Tic-Tac-Toe game AI

    - by David Jones
    I'm looking into creating a simple tic-tac-toe/noughts-and-crosses game in ActionScript 3 and am trying to understand the ideas behind the AI used in a game like this. I've seen some simplistic examples online, but from what I've read, a game tree or something like minimax is the best way to go about this. Can anyone help explain or reference any good examples of this? I've seen that there is a library called as3ds (data structures for game developers) which has a number of classes that might help tie this together. Any info/examples or help is much appreciated.
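
    Not ActionScript or as3ds - just a minimal, language-agnostic sketch of minimax for tic-tac-toe, written in Python for brevity. The board is assumed to be a 9-element list holding 'X', 'O' or None.

        WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
                     (0, 3, 6), (1, 4, 7), (2, 5, 8),
                     (0, 4, 8), (2, 4, 6)]

        def winner(board):
            for a, b, c in WIN_LINES:
                if board[a] is not None and board[a] == board[b] == board[c]:
                    return board[a]
            return None

        def minimax(board, player):
            """Return (score, move) for `player`: +1 means X wins, -1 means O wins, 0 a draw."""
            w = winner(board)
            if w == 'X':
                return 1, None
            if w == 'O':
                return -1, None
            moves = [i for i, cell in enumerate(board) if cell is None]
            if not moves:
                return 0, None                      # board full: draw
            best = None
            for m in moves:
                board[m] = player                   # try the move...
                score, _ = minimax(board, 'O' if player == 'X' else 'X')
                board[m] = None                     # ...then undo it
                if (best is None
                        or (player == 'X' and score > best[0])
                        or (player == 'O' and score < best[0])):
                    best = (score, m)
            return best

        # Example: best reply for O after X takes the centre square.
        board = [None] * 9
        board[4] = 'X'
        print(minimax(board, 'O'))   # (0, 0): O grabs a corner and can force a draw

    Tic-tac-toe's game tree is small enough to search exhaustively; for anything bigger the same code usually gains alpha-beta pruning and a depth limit.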

    Read the article

  • Essbase Data precision unraveled

    - by THE
    (guest reference added by Nancy) Anyone who has been working with data import and export as well as the Essbase Excel Add-In has probably come across a phenomenon called data precision: lots of extra digits are added to a number that has been calculated by Essbase, and it gets displayed as "10.0000000000001" or "9.99999999999999" instead of a simple "10". This question is one of the recurring ones that Support gets asked over and over again, so we feel the need to explain it: I would like to point you to the note The Limits of Data Precision in Essbase (Doc ID 1311188.1), which explains in detail why these numbers show up and what to do about them.
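
    The underlying cause is ordinary IEEE 754 double-precision arithmetic rather than anything Essbase-specific; a quick illustration in Python, which uses the same 64-bit doubles:

        # Repeatedly adding 0.1 does not land exactly on the "obvious" total, because
        # 0.1 has no exact binary representation in a 64-bit float.
        exact = sum([1.0] * 10)      # integers this small are exact: prints 10.0
        drift = sum([0.1] * 100)     # accumulates tiny representation errors
        print(exact)                 # 10.0
        print(drift)                 # close to, but not exactly, 10.0 (e.g. 9.99999999999998)
        print(f"{drift:.2f}")        # 10.00 - rounding on display hides the drift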

    Read the article

  • How Do Search Engines Rank Combined Keywords?

    - by Itai
    Suppose: A site that ranks very well (1st result) for something like 'best blue widget'. It also ranks very well (1st page) for 'blue widget'. It ranks not so well (2nd page) for 'widget'. Obviously, the number of monthly searches is much higher for 'widget' than for 'blue widget', which is still higher than for 'best blue widget'. Now the actual question: when creating new external links, how does each of the following anchor texts affect SEO for each of these searches? widget, blue widget, best blue widget. [HINT: The answer should be a 3x3 table] [NOTE: Assume the site is relevant for all these keyword combinations]

    Read the article

  • Create a fast algorithm for a "weighted" median

    - by Hameer Abbasi
    Suppose we have a set S of k 2-dimensional vectors (x, n). What would be the most efficient algorithm to calculate the median of the weighted set? By "weighted set", I mean that the number x has a weight n. Here is an example (inefficient due to sorting) algorithm, where Sx is the x-part and Sn is the n-part. Assume that all co-ordinate pairs are already sorted by Sx, with the respective changes also being made in Sn, and that the sum of n is sumN: sum <= 0; i <= 0 while(sum < sumN) sum <= sum + Sn(i) ++i if(sum > sumN/2) return Sx(i) else return (Sx(i)*Sn(i) + Sx(i+1)*Sn(i+1))/(Sn(i) + Sn(i+1)) EDIT: Would this hold in two or more dimensions, if we were to calculate the median first in one dimension, then in another, with n being the sum along that dimension in the second pass?
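
    Not the asker's code - a small Python sketch of the same cumulative-weight idea: sort by x, then walk until the running weight passes half the total (O(n log n) because of the sort; selection-based approaches can reach expected O(n)). Ties are broken with a simple midpoint rather than the weighted average used in the pseudocode above.

        def weighted_median(pairs):
            """pairs: iterable of (x, n) with weights n > 0; returns the weighted median of x."""
            pairs = sorted(pairs)                        # sort by x
            total = sum(n for _, n in pairs)
            half = total / 2.0
            cum = 0.0
            for i, (x, n) in enumerate(pairs):
                cum += n
                if cum > half:
                    return x                             # this x carries the median weight
                if cum == half:
                    return (x + pairs[i + 1][0]) / 2.0   # median falls exactly between two x values
            raise ValueError("weighted_median() needs a non-empty input")

        print(weighted_median([(1, 1), (2, 1), (3, 2)]))
        # 2.5 - the same as the plain median of the expanded multiset [1, 2, 3, 3]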

    Read the article

  • Is Domain Driven Design useful / productive for not so complex domains?

    - by Elijah
    When assessing a potential project at work, I suggested that it might be advantageous to use a domain-driven design approach for its object model. The project does not have an excessively complex domain, so my coworker threw this at me: it has been said that DDD is favorable in instances where there is a complex domain model ("...It applies whenever we are operating in a complex, intricate domain" - Eric Evans). What I'm lost on is: how do you define the complexity of a domain? Can it be defined by the number of aggregate roots in the domain model? Is the complexity of a domain in the interaction of its objects? The domain that we are assessing is related to online publishing and content management.

    Read the article

  • Are unit tests really used as documentation?

    - by stijn
    I cannot count the number of times I have read statements in the vein of 'unit tests are a very important source of documentation of the code under test'. I do not deny they are true. But personally I haven't found myself using them as documentation, ever. For the typical frameworks I use, the method declarations document their behaviour and that's all I need. And I assume the unit tests back up everything stated in that documentation, plus likely some more internal stuff, so on one side they duplicate the documentation, while on the other they might add some more that is irrelevant. So the question is: when are unit tests used as documentation? When the comments do not cover everything? By developers extending the source? And what do they expose that can be useful and relevant that the documentation itself cannot expose?

    Read the article

  • Oracle VM Virtualbox 4.0 released !

    - by wim.coekaerts
    Another great day for the VirtualBox development team. As is custom, they churn out new features and enhancements at a record pace. You can find the changelog here and you can download your version of 4.0 here. Have at it. A bunch of changes: visually a new management console, a new install method with a base install plus an extension pack (for the add-on drivers and extra features), a number of bugfixes, multi-monitor support for Oracle Solaris and Linux - it's a long list. A great product with a great user base. Check out this survey!

    Read the article

  • Public Cloud, co-location and managed services ... what is the cloud?

    - by llaszews
    Recently I have had conversations with a number of people who are selling and implementing 'cloud' solutions. I put cloud in quotes, as implementations like co-location (aka co-lo) and managed services (sometimes referred to as 'your mess for less') have become popular options for companies moving to the cloud. These are obviously not pure public cloud offerings and are probably more like hybrid cloud implementations, as the infrastructure (PaaS and IaaS) is dedicated to a specific customer. This eliminates the security, multi-tenancy, performance and other concerns that companies have regarding the public cloud. Are co-location and managed services cloud to you? Are they something your company is considering when you think about cloud?

    Read the article

  • Internal error message pops up each time the system is rebooted

    - by Biju
    I had installed Ubuntu 12.04 using Wubi, but each time I boot the system an internal error message pops up, as shown below: ExecutablePath: /usr/share/apport/apport-gpu-error-intel.py, Package: xserver-xorg-video-intel 2:2.17.0-1ubuntu4, ProblemType: Crash, ApportVersion: 2.0.1-0ubuntu7, and so on. I had earlier upgraded to Ubuntu 12.04 from Ubuntu 11.10 and encountered the same issue, hence I uninstalled the OS and reinstalled using Wubi. I posted the same query on ubuntu.com/support (Question Number: 195525) but couldn't find a solution. I am using a Dell Inspiron with an Intel Pentium. I need your help in resolving this issue. Thanking you, Biju

    Read the article

  • Is it considered good SQL practice to use GUIDs to link multiple tables to the same Id field?

    - by Mallow
    I want to link several tables to a many-to-many (m2m) table. One table would be called location, and this table would always be on one side of the m2m table. But I will have a list of several tables, for example: Cards, Photographs, Illustrations, Vectors. Would using GUIDs to link these tables to a single column in another table be considered 'good practice'? Will MySQL let me have it automatically cascade updates and deletes? If so, would multiple cascades lead to any issues? UPDATE: I've read that a GUID (a hex number) generally takes up more space in a database and slows queries down. However, I could still generate 'unique' ids by just having the table initials as part of the id, so that the Cards table's ids would be c0001 and Illustrations' would be I001. Regardless of this change, the question still stands.
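
    For illustration only (not a recommendation from the question), here are the two id styles being weighed, generated in Python - a random GUID via the standard uuid module versus the table-initial counter scheme:

        import uuid

        # Style 1: a GUID/UUID - globally unique with no coordination, but 32 hex
        # characters, so indexes and foreign keys are larger than with integer keys.
        card_guid = uuid.uuid4().hex              # e.g. '0d8f2c4e9a6b4f1c8e3d5a7b9c1d2e3f'

        # Style 2: the table-initial scheme from the question - compact, but uniqueness
        # now depends on never reusing a counter within a table.
        def prefixed_id(table_initial, counter):
            return f"{table_initial}{counter:04d}"

        print(card_guid)
        print(prefixed_id("c", 1))                # 'c0001' for Cards
        print(prefixed_id("I", 1))                # 'I0001' for Illustrations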

    Read the article

  • Trying to make a universe [on hold]

    - by caters
    I want to program a universe so that it starts with a big bang, atoms form, then molecules, then stars, then planets around those stars, and then moons around those planets. I have a few questions. If 400 IPMUs (In-Program Mass Units) = 1 solar mass, then how would I calculate the number of IPMUs for a spectral class of star, given the range of solar masses for a main-sequence star in that spectral class? How can I have planets not look like stars? Since whether a star is a subdwarf, main-sequence star, subgiant, giant, bright giant, supergiant or hypergiant depends mainly on the radius and luminosity, how can I have the radius and luminosity be independent of the mass?
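
    A tiny sketch of the first sub-question in Python: convert a spectral class's solar-mass range into IPMUs using the stated 400 IPMU = 1 solar mass ratio. The mass ranges below are rough textbook values for main-sequence stars, not figures from the question.

        IPMU_PER_SOLAR_MASS = 400            # from the question: 400 IPMUs = 1 solar mass

        # Approximate main-sequence mass ranges in solar masses (rough reference values).
        SPECTRAL_MASS_RANGE = {
            "O": (16.0, 90.0),
            "B": (2.1, 16.0),
            "A": (1.4, 2.1),
            "F": (1.04, 1.4),
            "G": (0.8, 1.04),
            "K": (0.45, 0.8),
            "M": (0.08, 0.45),
        }

        def ipmu_range(spectral_class):
            low, high = SPECTRAL_MASS_RANGE[spectral_class]
            return low * IPMU_PER_SOLAR_MASS, high * IPMU_PER_SOLAR_MASS

        print(ipmu_range("G"))   # (320.0, 416.0): a Sun-like star is roughly 320-416 IPMUs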

    Read the article

  • Switching to HTTPS - redirect question

    - by seengee
    Following the recent Google announcements about improved ranking for sites running on https, we have a number of clients asking about this. Is it safe to just 301 redirect all pages to their SSL equivalent, for example in a common PHP include file: if($_SERVER['HTTPS']!="on"){ $redirect= "https://".$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']; header("Location:$redirect",true,301); exit(); } Obviously I'm aware this is also possible within a .htaccess file, but that cannot be modified in our case. All internal links would be switched to https:// links, but we still need to sort out incoming links from Google and elsewhere. Is this a sound approach? Are there any other gotchas to be aware of?

    Read the article

  • Verify uniqueness of new content

    - by rogerkk
    I'm working on a review site where there is a minor issue with near-duplicate reviews across items - just a few words are changed. It would be very nice to be able to uncover these duplicates before they are approved by a moderator, and I'm hoping someone could chime in on the best strategy to get there. The site is running Ruby on Rails on a Postgres database and uses Thinking Sphinx for search (all on Heroku), and so far the best option I see is to pull all the reviews out of the db and use a module like amatch to compare the strings. Not very efficient, so in this case I guess I'll have to limit the number/age of reviews to scan for dupes. Anyone got a better idea?
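
    The post mentions the Ruby amatch gem; here is the same brute-force idea sketched in Python with the standard library's difflib, just to illustrate the shape of it - compare an incoming review against recent ones and flag anything above a similarity threshold (the 0.9 cutoff is an arbitrary assumption).

        from difflib import SequenceMatcher

        SIMILARITY_THRESHOLD = 0.9   # assumed cutoff; tune it against reviews you know are dupes

        def near_duplicates(new_review, existing_reviews):
            """Return (ratio, review) pairs that look like lightly edited copies of new_review."""
            flagged = []
            for old in existing_reviews:
                ratio = SequenceMatcher(None, new_review.lower(), old.lower()).ratio()
                if ratio >= SIMILARITY_THRESHOLD:
                    flagged.append((ratio, old))
            return sorted(flagged, reverse=True)

        existing = ["Great hotel, lovely staff and a clean room.",
                    "Terrible food, would not come back."]
        print(near_duplicates("Great hotel, lovely staff and a very clean room.", existing))

    Limiting the comparison to recent reviews of the same item, as the post suggests, keeps the pairwise cost manageable.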

    Read the article

  • Why is Google Analytics displaying the wrong landing pages?

    - by Salman
    I see all of my pages as Landing Pages in Google Analytics, which cannot be true, as I did not post those pages anywhere and I don't see any traffic hitting those pages directly. Also, I am using virtual page views on a few buttons and I see those virtual pages as Landing Pages too. For example, /click/request-a-quote - 35000 views. 35000 is too big a number to be ignored. Even if I ignore the virtual page views, I see a lot of pages as Landing Pages that I am 100% sure visitors (at least not so many users) are NOT hitting directly. Any advice on how to debug this? PS: I'm using the following code: var _gaq = _gaq || []; _gaq.push(['_setAccount', '<']); _gaq.push(['_setDomainName', 'none']); _gaq.push(['setLocalGifPath', '/images/_utm.gif']); _gaq.push(['_setAllowLinker', true]); _gaq.push(['_trackPageview','account/phase1']);

    Read the article

  • Store VOD WMI data in a database directly or use CQRS?

    - by JD01
    I need to collect Video on Demand bandwidth usage every few minutes (or maybe every few seconds) and store it in a database so users can produce graphs of bandwidth usage over a period of time (a few hours, days, weeks or even possibly months). The sort of data that will be stored includes the number of users watching videos, current server bandwidth (Mb/s), multicast bit rate, etc. I am wondering whether using CQRS with event sourcing would be a good approach, as I could then rebuild my objects to create different projections (i.e. different graphs/reports, etc.), but then again it seems like I am introducing complexity which might not be needed. Or would it be best to just put the data directly into a database (currently Postgres) and query off that? Having thought about it, my table is a form of audit log anyway, so I don't think I need event sourcing at all. Any thoughts?

    Read the article
