Search Results

Search found 68825 results on 2753 pages for 'problem'.


  • Constant game speed independent of variable FPS in OpenGL with GLUT?

    - by Nazgulled
    I've been reading Koen Witters' detailed article about different game loop solutions, but I'm having some problems implementing the last one, which is the recommended one, with GLUT. After reading a couple of articles, tutorials and code from other people on how to achieve a constant game speed, I think that what I currently have implemented (I'll post the code below) is what Koen Witters called "Game Speed dependent on Variable FPS", the second one in his article.

    First, in my searching experience, there are a couple of people who probably have the knowledge to help out with this but don't know what GLUT is, so I'm going to try to explain (feel free to correct me) the functions of this OpenGL toolkit that are relevant to my problem. Skip this section if you know what GLUT is and how to play with it.

    GLUT Toolkit: GLUT is an OpenGL toolkit that helps with common tasks in OpenGL. glutDisplayFunc(renderScene) takes a pointer to a renderScene() callback, which will be responsible for rendering everything; renderScene() will only be called once after the callback registration. glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0) takes the number of milliseconds to wait before calling the callback processAnimationTimer(); the last argument is just a value to pass to the timer callback, and processAnimationTimer() will not be called every TIMER_MILLISECONDS, just once. glutPostRedisplay() requests that GLUT render a new frame, so we need to call it every time we change something in the scene. glutIdleFunc(renderScene) could be used to register a callback to renderScene() (this does not make glutDisplayFunc() irrelevant), but this function should be avoided because the idle callback is called continuously whenever events are not being received, increasing the CPU load. glutGet(GLUT_ELAPSED_TIME) returns the number of milliseconds since glutInit was called (or since the first call to glutGet(GLUT_ELAPSED_TIME)). That's the timer we have with GLUT. I know there are better alternatives for high-resolution timers, but let's stick with this one for now.

    I think this is enough information on how GLUT renders frames, so people who didn't know about it can also pitch in on this question and try to help if they feel like it.

    Current Implementation: Now, I'm not sure I have correctly implemented the second solution proposed by Koen, "Game Speed dependent on Variable FPS". The relevant code goes like this:

        #define TICKS_PER_SECOND 30
        #define MOVEMENT_SPEED 2.0f

        const int TIMER_MILLISECONDS = 1000 / TICKS_PER_SECOND;

        int previousTime;
        int currentTime;
        int elapsedTime;

        void renderScene(void) {
            (...)
            // Setup the camera position and looking point
            SceneCamera.LookAt();
            // Do all drawing below...
            (...)
        }

        void processAnimationTimer(int value) {
            // Set up the timer to be called again
            glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0);
            // Get the time when the previous frame was rendered
            previousTime = currentTime;
            // Get the current time (in milliseconds) and calculate the elapsed time
            currentTime = glutGet(GLUT_ELAPSED_TIME);
            elapsedTime = currentTime - previousTime;
            /* Multiply the camera direction vector by constant speed, then by the
               elapsed time (in seconds), and then move the camera */
            SceneCamera.Move(cameraDirection * MOVEMENT_SPEED * (elapsedTime / 1000.0f));
            // Request a new frame (this will call my renderScene() once)
            glutPostRedisplay();
        }

        int main(int argc, char **argv) {
            glutInit(&argc, argv);
            (...)
            glutDisplayFunc(renderScene);
            (...)
            // Set up the timer to be called a first time
            glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0);
            // Read the current time since glutInit was called
            currentTime = glutGet(GLUT_ELAPSED_TIME);
            glutMainLoop();
        }

    This implementation doesn't feel right. It works in the sense that it keeps the game speed constant regardless of the FPS, so moving from point A to point B takes the same time no matter how high or low the framerate is. However, I believe I'm limiting the game framerate with this approach. Each frame will only be rendered when the timer callback is called, which means the framerate will be roughly TICKS_PER_SECOND frames per second. This doesn't feel right; you shouldn't limit your powerful hardware like that. It's my understanding, though, that I still need to calculate elapsedTime: just because I'm telling GLUT to call the timer callback every TIMER_MILLISECONDS doesn't mean it will always do so on time.

    I'm not sure how to fix this, and to be completely honest, I have no idea what the game loop in GLUT is; you know, the while(game_is_running) loop in Koen's article. My understanding is that GLUT is event-driven and that the game loop starts when I call glutMainLoop() (which never returns), yes? I thought I could register an idle callback with glutIdleFunc() and use it as a replacement for glutTimerFunc(), rendering only when necessary (instead of all the time as usual), but when I tested this with an empty callback (like void gameLoop() {}) that was basically doing nothing, only a black screen, the CPU spiked to 25% and remained there until I killed the game, after which it went back to normal. So I don't think that's the path to follow. Using glutTimerFunc() is definitely not a good approach for driving all movements/animations, as it limits my game to a constant FPS, which is not cool. Or maybe I'm using it wrong and my implementation is not right?

    How exactly can I have a constant game speed with variable FPS? More exactly, how do I correctly implement Koen's "Constant Game Speed with Maximum FPS" solution (the fourth one in his article) with GLUT? Maybe this is not possible at all with GLUT? If not, what are my alternatives? What is the best approach to this problem (constant game speed) with GLUT? I originally posted this question on Stack Overflow before being pointed to this site. The following is a different approach I tried after creating the question on SO, so I'm posting it here too.

    Another Approach: I've been experimenting, and here's what I was able to achieve now. Instead of calculating the elapsed time in a timed function (which limits my game's framerate), I'm now doing it in renderScene(). Whenever changes to the scene happen (i.e.: the camera moving, some object animating, etc.), I call glutPostRedisplay(), which will make a call to renderScene(). I can use the elapsed time in this function to move my camera, for instance. My code has now turned into this:

        int previousTime;
        int currentTime;
        int elapsedTime;

        void renderScene(void) {
            (...)
            // Get the time when the previous frame was rendered
            previousTime = currentTime;
            // Get the current time (in milliseconds) and calculate the elapsed time
            currentTime = glutGet(GLUT_ELAPSED_TIME);
            elapsedTime = currentTime - previousTime;
            /* Multiply the camera direction vector by constant speed, then by the
               elapsed time (in seconds), and then move the camera */
            SceneCamera.Move(cameraDirection * MOVEMENT_SPEED * (elapsedTime / 1000.0f));
            // Setup the camera position and looking point
            SceneCamera.LookAt();
            // All drawing code goes inside this function
            drawCompleteScene();
            glutSwapBuffers();
            /* Redraw the frame ONLY if the user is moving the camera (similar code
               will be needed to redraw the frame for other events) */
            if(!IsTupleEmpty(cameraDirection)) {
                glutPostRedisplay();
            }
        }

        int main(int argc, char **argv) {
            glutInit(&argc, argv);
            (...)
            glutDisplayFunc(renderScene);
            (...)
            currentTime = glutGet(GLUT_ELAPSED_TIME);
            glutMainLoop();
        }

    Conclusion: it's working, or so it seems. If I don't move the camera, CPU usage is low and nothing is being rendered (for testing purposes I only have a grid extending for 4000.0f, while zFar is set to 1000.0f). When I start moving the camera, the scene starts redrawing itself. If I keep pressing the move keys, CPU usage increases; this is normal behavior. It drops back when I stop moving. Unless I'm missing something, it seems like a good approach for now. I did find this interesting article on iDevGames, and this implementation is probably affected by the problem described in that article. What are your thoughts on that?

    Please note that I'm just doing this for fun; I have no intention of creating a game to distribute or anything like that, not in the near future at least. If I did, I would probably go with something other than GLUT. But since I'm using GLUT, and setting aside the problem described on iDevGames, do you think this latest implementation is sufficient for GLUT? The only real issue I can think of right now is that I'll need to keep calling glutPostRedisplay() every time the scene changes, and keep calling it until there's nothing new to redraw. A little complexity added to the code for a better cause, I think. What do you think?
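
    For context, the loop shape being asked about (Koen's fourth solution, "Constant Game Speed with Maximum FPS") looks roughly like the sketch below. It is written here as a runnable Java stand-in rather than C/GLUT, because GLUT owns the while-loop and never exposes it; inside GLUT the usual approximation is to do this same bookkeeping in an idle or timer callback and then call glutPostRedisplay(). All names here are made up for illustration.

        // A sketch of "Constant Game Speed with Maximum FPS": update at a fixed
        // rate, render as often as possible, and pass render an interpolation
        // factor. Java stand-in for the C loop GLUT doesn't expose.
        class FixedTimestepLoop {
            static final int TICKS_PER_SECOND = 30;
            static final int SKIP_TICKS = 1000 / TICKS_PER_SECOND;
            static final int MAX_FRAMESKIP = 5;

            static boolean gameIsRunning = true;

            public static void main(String[] args) {
                long nextGameTick = System.currentTimeMillis();
                while (gameIsRunning) {
                    int loops = 0;
                    // Catch the game state up in fixed-size steps...
                    while (System.currentTimeMillis() > nextGameTick && loops < MAX_FRAMESKIP) {
                        updateGame();                  // fixed-step movement/physics
                        nextGameTick += SKIP_TICKS;
                        loops++;
                    }
                    // ...then render as fast as the hardware allows, interpolating
                    // between the last two states so motion stays smooth.
                    float interpolation =
                            (System.currentTimeMillis() + SKIP_TICKS - nextGameTick)
                                    / (float) SKIP_TICKS;
                    render(interpolation);
                }
            }

            static void updateGame() { /* move the camera by a fixed step here */ }
            static void render(float interpolation) { /* draw at interpolated positions */ }
        }

    The key design point is that updateGame() always advances the simulation by the same fixed step, while render() is free to run as often as the hardware allows, using the interpolation factor to draw between simulation states.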

    Read the article

  • Inheritance, commands and event sourcing

    - by Arthis
    In order not to redo things several times, I wanted to factor out common stuff. For instance, let's say we have a cow and a horse. The cow produces milk, the horse runs fast, but both eat grass.

        public class Herbivorous
        {
            public void EatGrass(int quantity)
            {
                var evt = Build.GrassEaten.WithQuantity(quantity);
                RaiseEvent(evt);
            }
        }

        public class Horse : Herbivorous
        {
            public void RunFast()
            {
                var evt = Build.FastRun;
                RaiseEvent(evt);
            }
        }

        public class Cow : Herbivorous
        {
            public void ProduceMilk()
            {
                var evt = Build.MilkProduced;
                RaiseEvent(evt);
            }
        }

    To eat grass, the command handler should be:

        public class EatGrassHandler : CommandHandler<EatGrass>
        {
            public override CommandValidation Execute(EatGrass cmd)
            {
                Contract.Requires<ArgumentNullException>(cmd != null);
                var herbivorous = EventRepository.GetById<Herbivorous>(cmd.Id);
                if (herbivorous.IsNull())
                    throw new AggregateRootInstanceNotFoundException();
                herbivorous.EatGrass(cmd.Quantity);
                EventRepository.Save(herbivorous, cmd.CommitId);
            }
        }

    So far so good. I get a Herbivorous object and I have access to its EatGrass function; whether it is a horse or a cow doesn't really matter. The only problem is here: EventRepository.GetById<Herbivorous>(cmd.Id). Indeed, let's imagine we have a cow that produced milk during the morning and now wants to eat grass. The EventRepository contains a MilkProduced event, and then comes the EatGrass command. In the command handler we are no longer in the presence of a cow, and the Herbivorous doesn't know anything about producing milk. What should it do? Ignore the event and continue, thus allowing the inheritance and "general" commands? Or throw an exception to forbid execution, which would mean only specific commands like CowEatGrass and HorseEatGrass may exist? Thanks for your help. I am just beginning with these kinds of problems, and I would be glad to hear from someone more experienced.
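
    For what it's worth, one common answer is the first option: during replay, an aggregate applies only the events it has handlers for and silently skips the rest, so a Herbivorous rehydrated from a Cow's stream simply ignores MilkProduced. A rough sketch in Java rather than the question's C#, with every name hypothetical:

        // Minimal replay sketch: events with no matching handler are ignored,
        // so a Herbivorous view of a Cow just skips MilkProduced.
        import java.lang.reflect.Method;
        import java.util.List;

        abstract class AggregateRoot {
            void loadFromHistory(List<Object> events) {
                for (Object evt : events) {
                    try {
                        // Dispatch to a public apply(SpecificEvent) if one exists.
                        Method m = getClass().getMethod("apply", evt.getClass());
                        m.invoke(this, evt);
                    } catch (NoSuchMethodException e) {
                        // Unknown event for this aggregate type: ignore and continue.
                    } catch (Exception e) {
                        throw new RuntimeException(e);
                    }
                }
            }
        }

    Whether skipping is acceptable is still a design decision: if EatGrass could ever depend on the full Cow state, the specific-command route (CowEatGrass, HorseEatGrass) is the safer one.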

    Read the article

  • Try out Ubuntu on ThinkPad X40 (via USB)?

    - by Oliver
    I am trying out Ubuntu (i.e. installing it within Windows) on my ThinkPad X40. I followed the instructions on how to create a bootable USB (http://www.ubuntu.com/download/ubuntu/download). No problem. The issue I have now is that the USB ports on my X40 were burned out a while ago (a common problem with this machine), i.e. they do not work. So I got a USB notebook card from Belkin. However, the BIOS does not seem to recognize the USB stick through it, so I cannot boot from it. I also do not have a CD-ROM. I then tried to run Wubi from the USB stick: it briefly appears in Task Manager, then no further action. So I tried it with wubi.exe downloaded to the desktop and ran it from there; it briefly appears in Task Manager under Processes, then no further action. Any ideas? I have enough memory and enough free HD space. Thanks. Oliver

    Read the article

  • Only 1080p YouTube videos work properly

    - by oshirowanen
    I have Ubuntu 12.04 64-bit installed and fully updated, and have noticed that I can only play 1080p YouTube videos properly in full screen. All other resolutions (480p, 720p, 1440p, etc.) do not go into full screen properly. Please have a look at the attached images. You will notice that only the 1080p video has the video control bar right at the bottom with a respectable text size. All other videos don't have the video control bar right at the bottom of the screen, and the control bar text is too small. I've had this problem for ages, but finally decided to ask about it. 480p: the bottom control bar is not at the bottom of the screen and the video has borders on the sides. 720p: the bottom control bar is not at the bottom of the screen, the video has borders on the sides, and the text is tiny. 1080p: this is the only one that works as expected, i.e. controls right at the bottom, a good font size, and no borders on the sides. 1440p: the bottom control bar is not at the bottom of the screen and the video has borders on the sides. How do I correct this problem so I can play all video resolutions in full screen? The Flash version I have installed in Chromium is Adobe Flash Player - Version: 11.2 r202 - Shockwave Flash 11.2 r202.

    Read the article

  • How to reset display settings in XFCE / Ubuntu 12.04, and restore fglrx drivers

    - by Agent24
    I recently upgraded to Ubuntu 12.04, and since I hate Unity I installed the Xubuntu package and am using XFCE instead. Since I have a Radeon HD5770 I also installed the fglrx drivers. This all went fine (aside from the fact that the post-release update fglrx drivers have an error on installation, and Ubuntu thinks they're not installed when they actually are). I configured my display settings (dual monitors, a 17" CRT on VGA and a 17" LCD on DVI) in the amdcccle program and everything was perfect. Then, 2 days ago, I accidentally clicked on the "Display" settings in the XFCE settings manager. After that, everything got screwed up. Normally I run the CRT at 1152x864 and the LCD at 1280x1024, with the CRT as my primary monitor (with panel) and the LCD without panels etc., just to display other windows when I want to drag them over there. The problem is that now, if I set my CRT to 1152x864, it stays at 1280x1024 virtually and half the stuff falls off the screen. It also puts the LCD at 1280x1024, but then overlays the CRT's display on top, with different wallpaper, in an L shape down the right-hand and bottom edges. In short, nothing makes sense and everything is FUBAR. I tried uninstalling fglrx through Synaptic, and renaming xorg.conf and also the XFCE XML file that holds the monitor settings, but it still won't behave. Unity, on the other hand, can currently set everything normally, so the problem appears to be only with XFCE. In any case, I can't even get the fglrx drivers back: when I reinstalled them, I could no longer run amdcccle, as it says the driver isn't installed! Can someone help me reset my XFCE settings so the monitors aren't stuck with some incorrect virtual desktop size, and also help me get the fglrx drivers back and working? I really don't want to have to format and reinstall and go through all the hassle, but it looks like I may have to :(

    Read the article

  • Keeping Screen Aspect Ratio While Staying in Center

    - by David Dimalanta
    I saw and tried the suggestion on PISTACHIO BRAINSTORMING* on how to make a good, adaptive screen ratio. For every different screen size, let's say I put a perfect circle as a Texture in LibGDX and play it on screen. Here's the blueberry image example, and it's perfectly rounded: When I played it on the Google Nexus 7, the circle turned into a slightly oblong shape, as if it had been flattened a bit. Please observe the snapshot below and you can see the blueberry is almost, but not quite, perfectly rounded: Now, when I tried the suggested code for the aspect ratio, the perfect circle was retained, but another problem occurred: I expected the view to be centered, but instead it is offset to the right, leaving half the screen black. It looks like this: Here is my code using the suggested screen aspect ratio approach. Class fields:

        // Ingredients needed for the screen aspect ratio
        private static final int VIRTUAL_WIDTH = 720;
        private static final int VIRTUAL_HEIGHT = 1280;
        private static final float ASPECT_RATIO =
                ((float) VIRTUAL_WIDTH) / ((float) VIRTUAL_HEIGHT);

        private Camera Mother_Camera;
        private Rectangle Viewport;

    render():

        // Camera updating...
        Mother_Camera.update();
        Mother_Camera.apply(Gdx.gl10);

        // Resetting viewport...
        Gdx.gl.glViewport((int) Viewport.x, (int) Viewport.y,
                (int) Viewport.width, (int) Viewport.height);

        // Clear previous frame.
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);

    show():

        Mother_Camera = new OrthographicCamera(VIRTUAL_WIDTH, VIRTUAL_HEIGHT);

    Was this code useful for fixing the screen aspect ratio proportions, or is it statically dependent on the actual device's width and height?

    *see http://blog.acamara.es/2012/02/05/keep-screen-aspect-ratio-with-different-resolutions-using-libgdx/#comment-317
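
    For reference, below is a sketch of the resize() computation from the linked post, as I understand it. The crop offsets are what center the viewport; if the Rectangle is built with zero offsets (or resize() never recomputes it on the device), the image ends up pushed to one side, exactly as in the screenshots. Field names follow the question's code; this is a method meant to live in the same Screen class.

        // LibGDX (Java); uses com.badlogic.gdx.math.Rectangle and Vector2.
        // Letterbox/pillarbox the virtual screen and center it with crop offsets.
        @Override
        public void resize(int width, int height) {
            float aspectRatio = (float) width / (float) height;
            float scale;
            Vector2 crop = new Vector2(0f, 0f);
            if (aspectRatio > ASPECT_RATIO) {
                // Physical screen is wider than the virtual one: bars on the sides.
                scale = (float) height / (float) VIRTUAL_HEIGHT;
                crop.x = (width - VIRTUAL_WIDTH * scale) / 2f;
            } else if (aspectRatio < ASPECT_RATIO) {
                // Physical screen is taller: bars on top and bottom.
                scale = (float) width / (float) VIRTUAL_WIDTH;
                crop.y = (height - VIRTUAL_HEIGHT * scale) / 2f;
            } else {
                scale = (float) width / (float) VIRTUAL_WIDTH;
            }
            Viewport = new Rectangle(crop.x, crop.y,
                    VIRTUAL_WIDTH * scale, VIRTUAL_HEIGHT * scale);
        }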

    Read the article

  • Unity3D: calculate the result of a transform without modifying the Transform object itself

    - by Heisenbug
    I'm in the following situation: I need to move an object in some way, basically rotating it around its parent's local position or translating it in its parent's local space (I know how to do this). The amount of rotation and translation is known at runtime (it depends on several factors: the speed of the object, environment factors, etc.). The problem is the following: I can perform this transformation only if the resulting position of the transformed object fits certain criteria. An example could be this: the distance between the position before and after the transformation must be less than a given threshold. (Actually the conditions could be more numerous and more complex.) The problem is that if I use the Transform.Rotate and Transform.Translate methods of my GameObject, I will lose the original Transform values. I think I can't copy the original Transform using Instantiate, for performance reasons. How can I perform such a task? I think I have more or less two possibilities. First: don't modify the GameObject's position through Transform; calculate what the position would be after the transform, and only if that position is legal, modify the transform through the Translate and Rotate methods. Second: store the original transform somewhere, transform the object using Translate and Rotate, and if the transformed position is illegal, restore the original one.
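
    The first option (compute the would-be position, validate it, and only then commit) needs nothing from Transform itself; it is just the math for rotating a point around a pivot. Below is a minimal sketch in plain Java as a stand-in for Unity's vector types, with all names hypothetical. In Unity itself the same candidate can be computed as parent.position + Quaternion.AngleAxis(angle, axis) * (transform.position - parent.position), without ever touching the Transform.

        // Preview a rotation around a pivot, then commit only if it passes the test.
        final class Vec3 {
            final double x, y, z;
            Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
            Vec3 sub(Vec3 o) { return new Vec3(x - o.x, y - o.y, z - o.z); }
            Vec3 add(Vec3 o) { return new Vec3(x + o.x, y + o.y, z + o.z); }
            Vec3 scale(double s) { return new Vec3(x * s, y * s, z * s); }
            Vec3 cross(Vec3 o) {
                return new Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x);
            }
            double dot(Vec3 o) { return x * o.x + y * o.y + z * o.z; }
            double dist(Vec3 o) { Vec3 d = sub(o); return Math.sqrt(d.dot(d)); }
        }

        class TransformPreview {
            // Rodrigues' rotation: rotate v around the unit axis k by angle (radians).
            static Vec3 rotate(Vec3 v, Vec3 k, double angle) {
                double c = Math.cos(angle), s = Math.sin(angle);
                return v.scale(c)
                        .add(k.cross(v).scale(s))
                        .add(k.scale(k.dot(v) * (1 - c)));
            }

            // Candidate position after rotating 'pos' around the parent's position.
            static Vec3 candidate(Vec3 pos, Vec3 parentPos, Vec3 axis, double angle) {
                return parentPos.add(rotate(pos.sub(parentPos), axis, angle));
            }
        }

        // Usage sketch: commit the move only if the criteria hold.
        // Vec3 next = TransformPreview.candidate(objPos, parentPos, upAxis, delta);
        // if (next.dist(objPos) < threshold) { /* apply via Transform here */ }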

    Read the article

  • Flash player stops working in Firefox

    - by Alin
    First of all, please be merciful, as I am a 2-day Ubuntu user; I've only used Windows OSs before. Hardware configuration: Quad Core Q6600 CPU, 4 GB RAM, GeForce 8800GT with 512 MB RAM, Seagate 320 GB hard disk. I installed Ubuntu x64 and everything went smoothly. I had Adobe Flash Player installed from the Ubuntu Software Center. The problem was that the Flash videos on YouTube (basically on all pages) looked pixelated. I use Firefox 3.6, which came with Ubuntu by default. After reading around here I saw something about overriding hardware acceleration, but that didn't work either; or maybe I just didn't do it correctly. The problem was that if I opened 2-3 YouTube tabs, then when closing one of them, in 95% of cases the Flash plugin would crash in the remaining tabs and a black screen appeared instead of the video. After doing some more searching I discovered the FlashAid extension for Firefox. I installed it and it updated the Flash plugin. Now the videos look good, but each time I watch a video and switch to another one, the Flash plugin crashes, and instead of the video I see the Firefox message saying the plugin has crashed and asking me to refresh the page. Basically this is all I'm missing for a completely pleasant Ubuntu experience. Any ideas how to fix this? Thank you

    Read the article

  • Ubuntu security with services running from /opt

    - by thejartender
    It took me a while to understand what's going on here (I think), but can someone explain whether there are security risks in my reasoning below? I am trying to set up a home web server as a developer with some decent Linux knowledge. Ubuntu is not like other systems, as it has a restricted root user account: you cannot log in as root or su to root. This was a problem for me, as I have had to install numerous applications and services to /opt as per their documentation (XAMPP for Linux is a good example). The problem here is that this directory is owned by root:root. I notice that my admin user account does not belong to the root group (checked with the command "groups username"), so my understanding is that even though the files and services I place in /opt belong to root, executing them by means of sudo (as required) does not mean they are run as root? I imagine that the sudo command is somehow tied to the root user, with something like 775 permissions? So the question I have is: does running a service like Tomcat, Apache, etc. expose my system the way it would on other systems? Obviously I need to secure their configurations, but isn't the golden rule never to run anything as root? And what happens, with regard to a compromised server, if I have multiple services running under the same user/group?

    Read the article

  • Ubuntu 12.10 installation failure

    - by Eidelmaim
    Here I am asking this question again, because someone deemed it a duplicate of another topic which I looked over, and NOTHING AT ALL in that topic pertained to my problem. If you're going to close a topic believing it is a duplicate, at least do some research into WHY you think it's a duplicate and provide a link to a better source. How do I get past this installation username and password issue? I downloaded Ubuntu 12.10 directly from Ubuntu.com and created a bootable USB with LinuxLive. After loading the boot drive, Ubuntu goes directly from the purple Ubuntu startup screen to a black DOS-like prompt asking for an Ubuntu login. This is COMPLETELY before any installation begins. I need some help with this. FYI, this is what it says after it reaches the login area on the DOS-like (full black) screen: "Ubuntu 12.10 ubuntu tty1 / ubuntu login:". Now I will provide a few images of the problem I am having, and because I DON'T HAVE ANY OS on the computer (because Ubuntu WON'T go PAST this), I have to snap these pictures with a cell phone and upload them from another PC. These links are in chronological order, from pressing the power button to being presented with the login screen: image 1, image 2 (I can only post 2 links in messages; I will post additional links in the comments). So, again: how do I get past this? This is entirely before Ubuntu is installed on my system. My PC specs (home-built computer): an Asus Sabertooth X58 motherboard with an Intel Core i7 processor, Mushkin memory at 12 GB, 4 Seagate 150 GB hard drives, and an Nvidia GTX 260 graphics card. I initially attempted to install to RAID 5; it failed. I broke down the RAID and attempted to install to a single drive with all other drives disconnected from the PC; again, it failed. Thanks in advance for any assistance.

    Read the article

  • Is aspect oriented programming a misnomer?

    - by glenviewjeff
    From everything I have learned about "Aspect Oriented Programming" or "Aspect Oriented Software Development," labeling it as a programming paradigm or methodology appears to be inaccurate. From what I can tell, it is not a fundamental technique for programming. To nail down what is meant by "paradigm" and "methodology," please refer to the following definitions from the American Heritage Dictionary, and compare how well or poorly "Object-Oriented Programming" applies to each vs. how well AOP fits. Paradigm: a set of assumptions, concepts, values, and practices that constitutes a way of viewing reality for the community that shares them, especially in an intellectual discipline. Methodology: a body of practices, procedures, and rules used by those who work in a discipline or engage in an inquiry; a set of working methods. "Evidence-based medicine" satisfies the definition of a paradigm, but "hysterectomy-based medicine" would be a misnomer because the problem space is too narrow. I am getting the impression that AOP may be misnamed, because based on the "oriented programming" suffix, AOP claims to be both a paradigm and a methodology in the same way "Object-Oriented Programming" is. Both of these terms (paradigm and methodology) indicate a fundamental technique, whereas what I understand about aspects is that they are a technology for solving a narrow problem scope, maybe comparable in magnitude to the static variable feature of Java. If it's true that aspects solve a narrow set of problems, and AOP isn't a misnomer, then why shouldn't all programming techniques be given the "oriented programming" suffix, such as "inheritance-oriented programming," "dependency-oriented programming," or "scope-oriented programming"?

    Read the article

  • Setting different default applications for different Desktop Environment

    - by Anwar
    I am using Ubuntu 12.04 with the default Unity interface. I later installed the KDE desktop, XFCE, LXDE, GNOME Shell and Cinnamon. KDE comes with different default applications than Unity, such as KWrite for text editing, Konsole as the virtual terminal, KFontView for viewing and installing fonts, Dolphin as the file browser, etc. Other DEs come with other default applications. The problem arises when you want to open a file, such as a text file, which can be opened by both gedit and KWrite: I want to use KWrite in KDE and gedit in Unity or GNOME, but there is no way to set things up like this. I can set the default application for text files by changing the respective settings in both KDE and Unity, but it becomes the default for both DEs. For example, if I set KFontView as the default font-viewing application in KDE, it also opens fonts when I am in Unity or GNOME, and vice versa. This is a problem because loading another DE's program takes longer than loading the default one for the DE in use. My question is: can I use different default applications for different DEs? How?

    Read the article

  • Drawing particles as a smooth blob

    - by Nömmik
    I'm new to game/graphics development and I'm playing around with particles (in 2D). I want to draw particles close to each other as a single blob, like liquid/water. I do not want to draw big overlapping circles, as the blob won't be smooth (and will be too big). I don't really know physics, but I assume what I want is something that looks similar to surface tension. I haven't been able to find anything on Stack Exchange or on Google (maybe I don't know the correct keywords?). So far I have found two possible solutions, but I am unable to find any concrete information about the algorithms. One of them is to calculate the concave hull of the particles I consider to be one blob. I can find the blob by creating an equivalence class (on the relation "close to each other"). Strangely enough, I haven't been able to find any algorithm explaining how to calculate the concave hull. Many posts (including on Stack Exchange) link to libraries or commercial products that do this (I need a library that works in C#), but never give any algorithm. Also, this solution might have a problem with a ring of particles, as it would not detect the empty space in the middle. While researching concave hulls I stumbled upon something called alpha shapes, which seem to be exactly what I want to do. However, just as with the concave hull, I haven't found any source explaining how they actually work. I have found some presentation materials, but not enough to go on. It's like a big secret everyone knows except me :-/ After calculating the concave hull or alpha shape, I want to turn it into a Bézier curve to make it smooth and nice. Although I do find my approach a bit too complex, maybe I am trying to solve this the wrong way? If you can either suggest another solution to my problem or explain the pieces I am missing, I would be very happy and grateful :-) Thanks.
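
    Besides concave hulls and alpha shapes, a commonly used alternative for liquid-looking particle blobs is metaballs: each particle contributes a falloff field, the fields are summed, and the blob outline is the iso-contour where the sum crosses a threshold (traced with marching squares, or done on the GPU by blurring sprites and thresholding). This is a different technique from the ones named in the question; below is a sketch in Java with hypothetical names (the few lines port directly to C#).

        // Metaballs sketch: sample a scalar field that is the sum of each
        // particle's falloff; the blob surface is where field(p) == THRESHOLD.
        // The contour itself would come from marching squares or a shader pass.
        import java.util.List;

        class Blob {
            static final float THRESHOLD = 1.0f;

            // particles: list of {x, y} centers; radius: per-particle influence.
            static float field(float px, float py, List<float[]> particles, float radius) {
                float sum = 0f;
                for (float[] c : particles) {
                    float dx = px - c[0], dy = py - c[1];
                    float d2 = dx * dx + dy * dy;
                    sum += (radius * radius) / Math.max(d2, 1e-6f); // classic r^2/d^2 falloff
                }
                return sum;
            }

            // true if the sample point lies inside the blob surface
            static boolean inside(float px, float py, List<float[]> ps, float r) {
                return field(px, py, ps, r) >= THRESHOLD;
            }
        }

    Because nearby particles' fields add up, the iso-contour bulges and merges smoothly between them, which gives the surface-tension look without ever computing a hull.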

    Read the article

  • Installation hangs at "Retrieving file 43 of 105" (VirtualBox, OS X 10.7 host, Ubuntu 12.04 x86 guest)

    - by goodcop
    This is my second attempt at installing Ubuntu. In my first attempt, I selected "download updates" and "install third-party components"; in my second attempt, I deselected both. I'm still experiencing the same problem. In my first attempt, after the installation got stuck at "Retrieving file 43 of 105", I selected "skip" and the installation completed. Once I started to run the OS, I received a notification that language support was incomplete, and when I tried to update it, the Ubuntu Software Center update process hung on "waiting for jockey-backend to exit", seemingly indefinitely. At that point I decided to reinstall the system (since the whole process is only supposed to take 45 minutes or less), but as I mentioned above, my results were the same. I'm new to Ubuntu. Any advice? Where are the files (including file 43) being retrieved from: online, or from the Ubuntu installation ISO? I have searched many forums for an answer to this problem, and have seen others with the same issue, but I haven't found a solution. Thanks.

    Read the article

  • I have ESXi 5.0 installed, and when I install Ubuntu Server 12.04 LTS it gives an error saying grub installation failed

    - by Rishee
    I have ESXi 5.0 installed, and when I install Ubuntu Server 12.04 LTS (32-bit) it gives an error saying the grub installation failed. Please check the screenshot of the error below. I have other Ubuntu servers running fine on this ESXi server, so I don't think the problem is with ESXi. I have 32 GB of spare RAM on this ESXi host, and I have given 2 GB of RAM and 2 processor cores to this 12.04 LTS VM. I have tried supplying a different ISO image to this VM, as I thought the first image I downloaded had errors, but that's definitely not the case: all 3 different ISO images of Ubuntu Server 12.04 LTS (32-bit) that I downloaded can't be corrupt! Just to make sure the image doesn't have a problem, I used it to install on a standalone system for testing; it works fine there. This is a production ESXi server which I can't play with; however, I can play with the Ubuntu Server 12.04 LTS (32-bit) VM that we have created on that ESXi. I need help on this as soon as possible; the go-live date for this server is really close. (This question is also posted on Super User and Server Fault.)

    Read the article

  • Does an SSH key need to be named id_rsa?

    - by dustyprogrammer
    I have come across this problem a couple of times when creating build servers with key-based authentication, and I was wondering if anyone else has experienced it. I have a couple of keys for my current user that may connect to different machines; let's say machine1 and machine2. I have pasted my public keys into their respective authorized_keys files. I named the first key id_rsa and the second key bender. When I try to connect to bender I get the following output from my verbose ssh connection:

        debug1: SSH2_MSG_NEWKEYS sent
        debug1: expecting SSH2_MSG_NEWKEYS
        debug1: SSH2_MSG_NEWKEYS received
        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey
        debug1: Next authentication method: publickey
        debug1: Trying private key: /home/bozo/.ssh/identity
        debug1: Trying private key: /home/bozo/.ssh/id_rsa
        debug1: Trying private key: /home/bozo/.ssh/id_dsa
        debug1: No more authentication methods to try.
        Permission denied (publickey).

    It only offers the default key names, as you can see above. Is this correct? If so, why? How do I get it to offer more keys? I know it is a problem I see only intermittently, because at home I have multiple keys without much trouble. I would also appreciate an overview of how the public and private keys interact between client and server. I thought I had a pretty decent idea, but apparently I am missing something. Please and thank you.
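
    For reference, the debug output above matches ssh's default behavior: unless told otherwise, the client only tries the standard file names (identity, id_rsa, id_dsa). A key with a non-default name like bender has to be pointed at explicitly, either on the command line with "ssh -i ~/.ssh/bender machine2" or per-host in ~/.ssh/config. A minimal sketch, with placeholder host names:

        # ~/.ssh/config -- tell ssh which key to offer for each machine
        Host machine2
            HostName machine2.example.com
            IdentityFile ~/.ssh/bender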

    Read the article

  • How to evaluate a user against optimal performance?

    - by Alex K
    I am having trouble coming up with a system for rating a player's performance. Well, technically there is a trivial rating system, but I don't like it because it would mean assigning negative scores, which I think most players would be discouraged by. The problem is that I only know the ideal number of actions to get the desired result. The worst case is an infinite number of actions, so there is no obvious scale. The trivial way I referred to above is to take score = (#optimal-moves - #player-moves), with the ideal score being zero. However, psychologically, people like big numbers; no one wants to win by getting a mark of 0. I wonder if there is a system that someone else has come up with before to solve this problem? Essentially I wish to score the players based on how close they've come to the ideal solution. Different challenges will have different optimal numbers of actions, so the scoring system needs to take that into account, e.g. Challenge 1: max 10 points, Challenge 2: max 20 points. I don't mind giving players negative scores if they've performed exceptionally badly; I just don't want all scores to be <= 0.
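
    One formula that fits these constraints is to scale each challenge's maximum by the ratio of optimal to actual moves: optimal play earns the full maximum, extra moves decay the score smoothly, and the result never goes negative. A minimal sketch, with all names hypothetical:

        // Ratio-based score: full marks at the optimum, smooth decay after it.
        final class Scoring {
            static int score(int maxPoints, int optimalMoves, int playerMoves) {
                if (playerMoves <= optimalMoves) {
                    return maxPoints;                 // at or under the optimum
                }
                double ratio = (double) optimalMoves / (double) playerMoves;
                return (int) Math.round(maxPoints * ratio);
            }
        }

        // score(10, 5, 5) -> 10; score(10, 5, 10) -> 5; score(20, 8, 80) -> 2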

    Read the article

  • How to check Early Z efficiency on AMD GPU with Windows 7

    - by Suma
    I have a game using DirectX 9 and a development station running Windows 7 x64. I still have access to another station with Vista x64, dual-booted with Windows XP x86. I wanted to check early-Z efficiency in the game, and to my sadness all the tools I have tried seem unable to perform this task. AMD GPUPerfStudio 2 does not support DirectX 9 at all. AMD GPUPerfStudio 1.2 does not install correctly on Windows 7; when I tweaked the MSI package (a simple OS version check adjustment was needed), it complained that my drivers do not provide the needed instrumentation, and drivers old enough to support GPUPerfStudio would most likely not work with my Radeon 5750 card (though I am not 100% sure of this; I did not attempt any older drivers, not knowing which I should look for). PIX does not seem to contain any counters like this. It offers some ATI-specific counters, but when I try to activate them, PIX reports "PIX encountered a problem while attaching to the target program." I do not want to upgrade to DX 10/11 just to be able to profile the game, but it seems that without that step I am locked into a toolset which is no longer supported. I see only one obvious option which would probably work: using the WinXP (or, with a little luck, Vista) station, perhaps with an older AMD card, to make sure GPUPerfStudio 1.2 works. Other than that, can anyone recommend other options for checking GPU HW counters (HiZ/EarlyZ in particular, but others enabled as well would be a nice bonus) for a DirectX 9 game on Windows 7, preferably on an AMD GPU? (If that is not possible, I would definitely prefer switching the GPU over switching the OS, but before I do so I would like to know whether I would just hit the same problem with nVidia again.)

    Read the article

  • A new CAPTCHA using sentences?

    - by Xeoncross
    I was just thinking about how reCAPTCHA is getting harder when I thought of another possible solution. Images won't last forever, so some day we will need something else, like human logic or emotion. Google and others are trying grouping images by category (find the image that doesn't belong), but that requires a large number of images and doesn't work for the blind. Anyway, what if a massive collection of text were gathered (public-domain books from each language) and a sentence was shown to the user with 1 (or 2) words replaced by a select box of choices? Only computers that knew correct English/Spanish/German grammar would be able to tell which of the words belonged in the sentence. Would there be any problems with this approach? I would assume that anyone who knew the language the sentence was displayed in could figure out the answer more easily than by trying to read the reCAPTCHA text. Plus, storing an insane number of sentences would only take a couple of gigabytes of space and wouldn't take anywhere near the CPU time that creating images/audio takes. In other words, anyone could host their own CAPTCHA system with minimal impact on system performance. Is there a problem with this approach? More specifically, I'm looking for the main problem with this approach.
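
    A minimal sketch of the generation side of such a scheme, with entirely hypothetical names: take a corpus sentence, blank out one word, and mix the removed word with distractors sampled from the corpus. (One weakness this makes visible: unless the distractors share the blanked word's part of speech, a program with a simple statistical language model could filter them without any real grammar understanding.)

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;
        import java.util.Random;

        // Sentence CAPTCHA sketch: the correct word plus corpus-sampled
        // distractors; assumes corpusWords holds at least 4 distinct words.
        class SentenceCaptcha {
            final String prompt;        // sentence with one word replaced by "____"
            final String answer;        // the removed word
            final List<String> choices; // shuffled options shown to the user

            SentenceCaptcha(String sentence, List<String> corpusWords, Random rng) {
                String[] words = sentence.split("\\s+");
                int i = rng.nextInt(words.length);
                answer = words[i];
                words[i] = "____";
                prompt = String.join(" ", words);
                choices = new ArrayList<>();
                choices.add(answer);
                while (choices.size() < 4) {
                    String w = corpusWords.get(rng.nextInt(corpusWords.size()));
                    if (!choices.contains(w)) choices.add(w);
                }
                Collections.shuffle(choices, rng);
            }
        }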

    Read the article

  • Sorting versus hashing

    - by Paul Siegel
    My problem is as follows. I have an array of n strings, with m < n of them distinct. I want to create a one-to-one function which assigns each of the m distinct strings to the numbers 0 ... m-1. For example, if my strings are: Bob, Amy, Bob, Charlie, Amy, then the function: Bob -> 0, Amy -> 1, Charlie -> 2 would meet my needs. I have thought of three possible approaches: (1) sort the list of strings, remove duplicates, and construct the function using a search algorithm; (2) create a hash table and check each string to see if it is already in the table before inserting it; (3) sort the list of strings, remove duplicates, and put the resulting list into a hash table. My code will be written in Java, and I will likely use standard Java algorithms: merge sort for sorting, binary search for searching, and whatever the standard Java hash table algorithm is. Question: assume that after creating the function I will have to evaluate it on each of the n original strings. Which of the three approaches is fastest? Is there a better way? Part of the problem is that I don't really know what's going on "under the hood" in standard hashing algorithms. Any help would be appreciated.
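
    For approach (2), note that in Java the whole mapping can be built in a single pass, since each string can be assigned the next free number the first time it is seen, and evaluation on the original array happens in the same loop. A minimal sketch:

        import java.util.HashMap;
        import java.util.Map;

        // One pass: the first time a string is seen it gets the next id (0..m-1),
        // and every occurrence is resolved through the same map.
        class Labeler {
            static int[] assignIds(String[] names) {
                Map<String, Integer> ids = new HashMap<>();
                int[] out = new int[names.length];
                for (int i = 0; i < names.length; i++) {
                    Integer id = ids.get(names[i]);
                    if (id == null) {
                        id = ids.size();          // next unused number
                        ids.put(names[i], id);
                    }
                    out[i] = id;
                }
                return out;
            }
        }

        // assignIds({"Bob","Amy","Bob","Charlie","Amy"}) -> {0, 1, 0, 2, 1}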

    Read the article

  • How do you keep from running into the same problems over and over?

    - by Stephen Furlani
    I keep running into the same problems. The specific problem is irrelevant; the fact that I keep running into it is what's completely frustrating. The problem only happens once every 3-6 months or so, as I stub out a new iteration of the project. I keep a journal every time, but I still spend at least a day or two each iteration trying to get the issue resolved. How do you keep from making the same mistakes over and over? I've tried a journal, but it apparently doesn't work for me. [Edit] A few more details about the issue: each time I make a new project to hold the files, I import a particular library. The library is a C++ library which imports glew.h and glx.h. GLX redefines BOOL, and that's not kosher, since BOOL is a keyword for Objective-C. I had a fix the last time I went through this: I used an #ifndef in the library's header to exclude GLEW and GLX, and everything worked hunky-dory. This time, however, I do the same thing and use the same #ifndef block, but now it throws a bunch of errors. I go back to the old project, and it works; new project, no-worky. It seems to do this every time, and my solution is new each time for some reason. I know #defines and #includes are one of the trickiest areas of C++ (and of mixing it cross-language with Objective-C), but I had this working and now it's not.

    Read the article

  • Internet sharing over WiFi between Ubuntu 11.10 and Windows 7

    - by Vivek Pradhan
    I have looked through a lot of forums for a solution to this, but haven't really found one that works. I have a laptop running Ubuntu 11.10 connected to the Ethernet cord at home, and I want to share the internet over WiFi with my friend's laptop running Windows 7. I did not think it would be so difficult; I have tried a lot of things. I went to Networks, then Wireless, and set up a WiFi hotspot; my laptop connected to it and IPv4 was set to "shared to other computers". Thankfully the network was discoverable, and after some tries my friend was able to connect to it, but it still showed that there was no internet access. I tried configuring the TCP addresses of the WiFi connection on his laptop. Bottom line: it did not work. I then went to the network manager, clicked on "create a new wireless network", and created one with WEP 40/128-bit passphrase security (I tried the other 2 options later), and did the same thing as with the hotspot. This network was also not discoverable initially; after some tries we could connect to it, but then ICS was not working, although there was working internet on my Ubuntu laptop. I would really appreciate it if someone who has faced a similar problem and got it fixed could give a step-by-step solution for how to get this working, because I have noticed this is a pretty common problem with Ubuntu distros. Note: this does seem to work for Ubuntu-to-Ubuntu connections, but cross-OS clients, like Windows or my phone running Android ICS, are either not able to see the network or, if they connect, not able to use my laptop's shared internet.

    Read the article

  • "Failed to create swap space" error during installation

    - by Welsh Heron
    I've been trying to install Ubuntu for the past two days or so, but I've been running into a problem: every time I run the installation program on the LiveCD, I always get the same (or a very similar) error: "Failed to create swap space. The creation of swap space in partition #3 of SCSI5 (0,0,0) (sda) failed." So far, I've run DBAN (Darik's Boot and Nuke) on my HDD once, to make absolutely sure that everything on it had been erased. Then I simply put in the LiveCD and let it run the automated install. I get the above error directly after I tell it to automatically partition the HDD (it works for a second or so, then this pops up), forcing me back to the screen that lets me choose whether I want to automatically or manually partition the HDD. Well, after failing to install the software automatically, I did a little research and learned enough about partitioning Linux to use the manual partitioning option. I partitioned the HDD (a 1 TB drive) as follows: /home - (the rest) - ext2, / - 20 GB - ext2, /boot - 100 MB - ext2, /swap - 8 GB, /EFIboot - 40 MB. The only difference when I tried this method was that I got THIS message: "Failed to create swap space. The creation of swap space in partition #2 of SCSI5 (0,0,0) (sda) failed." Basically, the only difference was a '2' instead of a '3'. If I may ask, what exactly am I doing wrong? I've tried looking around the internet (that's basically all I've done for the last two days), but no one seems to have the same problem that I have, and I've tried most of the solutions for similar problems (DBAN, formatting partitions in ext2 format, etc.). The only thing I haven't tried is using the terminal to manually partition the HDD; I actually DID try to do this, but I wasn't able to get past the 'su' password prompt, so I wasn't able to use the terminal. Thank you for your help in advance. ~Welsh

    Read the article

  • How to fix "[Errno 13] Permission denied" in Mailman mailing lists

    - by Michael
    After migrating domains from one Plesk server to another, I get several of these mails every day (the target mailbox does not exist, so I receive them as undeliverable mail bounces):

        Return-Path: <[email protected]>
        Received: (qmail 26460 invoked by uid 38); 26 May 2012 12:00:02 +0200
        Date: 26 May 2012 12:00:02 +0200
        Message-ID: <20120526100002.xyzxx.qmail@lvpsxxx-xx-xx-xx.dedicated.hosteurope.de>
        From: [email protected] (Cron Daemon)
        To: [email protected]
        Subject: Cron <list@lvpsxxx-xx-xx-xx> [ -x /usr/lib/mailman/cron/senddigests ] && /usr/lib/mailman/cron/senddigests
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        X-Cron-Env: <SHELL=/bin/sh>
        X-Cron-Env: <HOME=/var/list>
        X-Cron-Env: <PATH=/usr/bin:/bin>
        X-Cron-Env: <LOGNAME=list>

        List: xyzxyz: problem processing /var/lib/mailman/lists/xyzxyz/digest.mbox:
        [Errno 13] Permission denied: '/var/lib/mailman/archives/private/xyzxyz'

    I tried to fix the permissions myself, but the problem still exists.
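
    (A fix that often works for this class of error after a Plesk migration is to hand the private archives back to Mailman's user and then let Mailman's own permission checker repair the rest. The paths and the list user below are a sketch for a Debian/Ubuntu-style Mailman 2 layout and may differ on your system:)

        chown -R list:list /var/lib/mailman/archives/private
        /usr/lib/mailman/bin/check_perms -f    # -f fixes the problems it reports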

    Read the article
