Search Results

Search found 16809 results on 673 pages for 'nothing 2 lose'.


  • Google crawl speed change doesn't seem to work. Why?

    - by Olivier Pons
    Three days ago I changed Google's crawl rate for my website in Webmaster Tools to two requests per second, and I got the message there that the speed change had been taken into account. But after more than three days, nothing has happened: Google is still making one request every ten seconds. My web server is very fast and can handle up to twenty simultaneous connections, and my website is brand new, so Google is almost the only one crawling it. After more than 30,000 successful requests (no 404s), I think there's something going on... or maybe this is just a bug? Has anyone ever had this problem?

    Read the article

  • Plymouth did not install properly

    - by David Starkey
    I was installing Plymouth Manager in hopes of making a custom loading screen. While the terminal was working, my computer unexpectedly powered off. I can open the manager and it appears to do what it is supposed to (except that I can't make my own theme), and the boot screen only shows when powering down. All of the advice I have seen so far has resulted in errors and nothing getting fixed. For some reason I do not have permission to simply select the folder and delete it, and I have not been able to find out how to grant myself that permission. My question, then, is: how do I get rid of Plymouth Manager so I can reinstall it properly? (One hedged possibility is sketched after the list below.) Already tried:
    - Installation - http://www.noobslab.com/2011/11/install-plymouth-manager-and-change.html
    - Removal - How to remove Plymouth Boot Animation manager and keep the default boot screen
    - Permissions - How do I change my user permissions to edit /etc/apt/sources.list?
    - Theming guide - http://brej.org/blog/?p=158
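
    A minimal removal sketch, with loud caveats: the package name below is an assumption (it guesses that the NoobsLab guide installed a package called plymouth-manager from a PPA), and the purge/reinstall flow is a guess at what "properly" requires, not a confirmed fix. sudo supplies the permissions the file manager lacks:

        # assumption: the guide installed a package named plymouth-manager from a PPA
        sudo apt-get purge plymouth-manager     # purge removes its config files too
        sudo apt-get install plymouth-manager   # clean reinstall after the purge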

    Read the article

  • How to reduce tearing while watching videos?

    - by Leonardo TM
    I'm using Ubuntu 10.10 with both VLC and MPlayer, and I have already installed the ATI drivers. I watched the same videos on Windows 7 and on Ubuntu, but on Ubuntu the image has a lot of tearing. I tried some newbie-level settings in my ATI config tool, but nothing changed. I tried videos in mkv, avi, rmvb... and in all kinds of resolutions. I would love some tips or maybe a solution to this problem. (I have searched for similar questions but didn't find any.) English is not my primary language, so sorry for my mistakes. Thanks in advance! []'s Leonardo
    My hardware (Acer Aspire 5740G-6979):
    - ATI Mobility Radeon 5650
    - Intel Core i5-430M
    - 4 GB RAM

    Read the article

  • wakeonlan from remote host

    - by takeshin
    I have set up wake-on-LAN on my server. Everything works fine on the local network:

        root@server$ poweroff
        user@local$ wakeonlan AA:BB:CC:DD:EE:FF

    and the server wakes up. AA:BB:CC:DD:EE:FF is the MAC address of my server, which has IP 192.168.1.2 and hostname example.com. It is connected to the router, which has IP 192.168.1.1 (public: xxx.xxx.xxx.xxx). When the server is up, I can ping it (ping example.com) or log in via ssh (ssh [email protected]). So far, so good. I'm able to wake the server from the local network, but how do I wake it from a remote location? I tried user@local$ wakeonlan -i xxx.xxx.xxx.xxx AA:BB:CC:DD:EE:FF, but it does not work (nothing happens). Do I have to configure my router somehow to forward magic packets? How?
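
    A hedged sketch of the usual remote recipe, under one big assumption: that the router can forward to a broadcast address at all (many consumer routers refuse to). Magic packets are UDP datagrams, commonly sent to port 7 or 9, so forward that port on the router to the LAN broadcast address 192.168.1.255 rather than to 192.168.1.2 (a sleeping machine holds no IP, so a per-host forward stops working once its ARP entry expires), then aim wakeonlan at the public IP and that port:

        user@local$ wakeonlan -i xxx.xxx.xxx.xxx -p 9 AA:BB:CC:DD:EE:FF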

    Read the article

  • Cannot install shell-themes using gnome-tweak-tool

    - by Chris
    How do I fix this? I will attempt to upload a screenshot of the problem. Notice the warning triangle next to the shell themes, and that nothing is selectable. Also, there are no shell extensions under the Shell Extensions tab. I have come across many postings on how to fix this, but none worked for me. I currently have 12.04 LTS, on a custom Phenom quad-core machine with Radeon HD 5770 graphics, if that helps.
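
    One hedged thing to try, not a confirmed fix: gnome-tweak-tool's theme switcher relies on the "User Themes" shell extension, and the sketch below assumes that extension ships in a gnome-shell-extensions package available in your enabled 12.04 repositories:

        # assumption: the gnome-shell-extensions package exists in your repositories
        sudo apt-get install gnome-shell-extensions
        # then log out and back in, and enable "User Themes" under Shell Extensions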

    Read the article

  • How will Quantum computing affect us?

    - by CiscoIPPhone
    I am interested in quantum computing, but have not studied it in depth. Things like Shor's algorithm intrigue me. My question is: If quantum computing took off in a big way (i.e. functional quantum home computers were available) how would it affect us programmers and software developers? Would we have to learn how to make use of superposition and entanglement - would it change how we write algorithms? Would more mathematical programmers be required/would we need new skills? Would it change nothing at all from our perspective (i.e. would it be abstracted)? Your opinion is welcome.

    Read the article

  • Game Maker Studio Gravity Problems

    - by Dusty
    I've started messing around with GameMaker: Studio. The problem I'm having is getting a gravity code for orbiting. Here's how I did it in XNA:

        foreach (GravItem Item in StarSystem.ActiveItems.OfType<GravItem>())
        {
            if (this != Item)
            {
                Velocity += 10 * Vector2.Normalize(Item.Position - this.Position)
                    * (this.Mass * Item.Mass)
                    / Vector2.DistanceSquared(this.Position, Item.Position)
                    / this.Mass;
            }
        }

    Simple, and it works well: things orbit and everything is nice. But in Game Maker I don't have the luxury of Vector2s or a foreach loop to loop through all the objects that have a mass. I've tried a few different things, but nothing seems to work:

        distance = distance_to_object(obj_moon);
        //--Gravity
        hspeed += (0.5 * distance * (Mass * obj_moon.Mass) / sqr(distance) / Mass);
        vspeed += (0.5 * distance * (Mass * obj_moon.Mass) / sqr(distance) / Mass);

    Thanks for the help.
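
    A hedged GML sketch of what the XNA loop translates to, with two assumptions flagged: grav_const is a made-up tuning constant, and the attempt above is missing a direction, so it pushes hspeed and vspeed by the same amount instead of toward the moon. point_direction with lengthdir_x/lengthdir_y supplies the normalized direction that Vector2.Normalize provided in XNA, and a with() block stands in for the foreach:

        // in the orbiting object's Step event; grav_const is hypothetical
        with (obj_moon) // runs once per obj_moon instance, like the foreach
        {
            var dist = point_distance(other.x, other.y, x, y);
            var dir = point_direction(other.x, other.y, x, y); // toward the moon
            var accel = grav_const * Mass / sqr(dist);         // a = G*m/r^2
            other.hspeed += lengthdir_x(accel, dir);
            other.vspeed += lengthdir_y(accel, dir);
        }

    Note that the XNA expression multiplies by both masses and then divides by this.Mass, so only the attracting body's mass survives; the sketch does the same with the moon's Mass.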

    Read the article

  • Polishing the MonologFX API

    - by HecklerMark
    Earlier this week, I released "into the wild" a new JavaFX 2.x dialog library, MonologFX, that incorporated some elements of DialogFX and new features I'd been working on over time. While I did try to get the API to a point of reasonable completion (nothing is ever truly "finished", of course!), there was one bit of functionality that I'd included without providing any real "polish": that of the button icons. Good friend and fellow JFXtras teammate José Pereda Llamas suggested I fix that oversight and provide an update (thanks much, José!), thus this post. If you'd like to take a peek at the new streamlined syntax, I've updated the earlier post; please click here if you'd like to review it. If you want to give MonologFX a try, just point your browser to GitHub to download the updated code and/or .jar. All the best, Mark

    Read the article

  • Long pause in PXE/preseed install

    - by Bo Kersey
    I'm encountering a long pause during an unattended (preseeded) install of Precise server via PXE. The install eventually works, but it appears to just sit there for 20 or 30 minutes after it authenticates the mirror. There is nothing in the logs during this time, even with full debug. Log excerpt:

        Aug 2 13:29:59 net-retriever: Signature made Fri Aug 2 06:05:28 2013 UTC using DSA key ID 437D05B5
        Aug 2 13:29:59 net-retriever: gpgv:
        Aug 2 13:29:59 net-retriever: Good signature from "Ubuntu Archive Automatic Signing Key <[email protected]>"
        Aug 2 13:29:59 net-retriever:
        Aug 2 13:50:10 anna[5072]: DEBUG: resolver (ext2-modules): package doesn't exist (ignored)
        Aug 2 13:50:10 anna[5072]: DEBUG: resolver (efi-modules): package doesn't exist (ignored)

    There is a bug related to this problem, but it is not being worked. I'm surprised that no one else is complaining about this. Or is there some other way to do an unattended install that everyone else is using?

    Read the article

  • Using Hadoop (HDInsight) with Microsoft - Two (OK, Three) Options

    - by BuckWoody
    Microsoft has many tools for “Big Data”. In fact, you need many tools – there’s no product called “Big Data Solution” in a shrink-wrapped box – if you find one, you probably shouldn’t buy it. It’s tempting to want a single tool that handles everything in a problem domain, but with large, complex data, that isn’t a reality. You’ll mix and match several systems, open and closed source, to solve a given problem. But there are tools that help with handling data at large, complex scales. Normally the best way to do this is to break up the data into parts, and then put the calculation engines for that chunk of data right on the node where the data is stored. These systems are in a family called “Distributed File and Compute”. Microsoft has a couple of these, including the High Performance Computing edition of Windows Server. Recently we partnered with Hortonworks to bring the Apache Foundation’s release of Hadoop to Windows. And as it turns out, there are actually two (technically three) ways you can use it. (There’s a more detailed set of information here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx; I’ll cover the options at a general level below.)

    First Option: Windows Azure HDInsight Service

    Your first option is that you can simply log on to a Hadoop control node and begin to run Pig or Hive statements against data that you have stored in Windows Azure. There’s nothing to set up (although you can configure things where needed), and you can send the commands, get the output of the job(s), and stop using the service when you are done – and repeat the process later if you wish. (There are also connectors to run jobs from Microsoft Excel, but that’s another post.) This option is useful when you have a periodic burst of work for a Hadoop workload, or the data collection has been happening into Windows Azure storage anyway. That might be from a web application, the logs from a web application, telemetrics (remote sensor input), and other modes of constant collection. You can read more about this option here: http://blogs.msdn.com/b/windowsazure/archive/2012/10/24/getting-started-with-windows-azure-hdinsight-service.aspx

    Second Option: Microsoft HDInsight Server

    Your second option is to use the Hadoop distribution for on-premises Windows called Microsoft HDInsight Server. You set up the Name Node(s), Job Tracker(s), and Data Node(s), among other components, and you have control over the entire ecostructure. This option is useful if you want to have complete control over the system, leave it running all the time, or you have a huge quantity of data that you have to bulk-load constantly – something that isn’t going to be practical with a network transfer or disk-mailing scheme. You can read more about this option here: http://www.microsoft.com/sqlserver/en/us/solutions-technologies/business-intelligence/big-data.aspx

    Third Option (unsupported): Installation on Windows Azure Virtual Machines

    Although unsupported, you could simply use a Windows Azure Virtual Machine (we support both Windows and Linux servers) and install Hadoop yourself – it’s open-source, so there’s nothing preventing you from doing that. Aside from being unsupported, there are other issues you’ll run into with this approach – primarily involving performance and the amount of configuration you’ll need to do to access the data nodes properly. But for a single-node installation (where all components run on one system) such as learning, demos, training and the like, this isn’t a bad option. Did I mention that’s unsupported? :) You can learn more about Windows Azure Virtual Machines here: http://www.windowsazure.com/en-us/home/scenarios/virtual-machines/ And more about Hadoop and the installation/configuration (on Linux) here: http://en.wikipedia.org/wiki/Apache_Hadoop And more about the HDInsight installation here: http://www.microsoft.com/web/gallery/install.aspx?appid=HDINSIGHT-PREVIEW

    Choosing the right option

    Since you have two or three routes you can go, the best thing to do is evaluate the need you have, and place the workload where it makes the most sense. My suggestion is to install HDInsight Server locally on a test system and play around with it. Read up on the best ways to use Hadoop for a given workload, understand the parts, write a little Pig and Hive, and get your feet wet. Then sign up for a test account on HDInsight Service, and see how that leverages what you know. If you're a true tinkerer, go ahead and try the VM route as well. Oh – there’s another great reference on Windows Azure HDInsight that just came out, here: http://blogs.msdn.com/b/brunoterkaly/archive/2012/11/16/hadoop-on-azure-introduction.aspx

    Read the article

  • Agile Manifesto, Revisited

    - by GeekAgilistMercenary
    Again, conversations give me a zillion things to write about. The recent conversation that has cropped up again is my various viewpoints of the Agile Manifesto. Not all the processes that came after the manifesto was written, but just the core manifesto itself. Just for context, here is the manifesto in all its glory:

        We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

        Individuals and interactions over processes and tools
        Working software over comprehensive documentation
        Customer collaboration over contract negotiation
        Responding to change over following a plan

        That is, while there is value in the items on the right, we value the items on the left more.

    Several of the key signatories at the time went on to write some of the core books that really gave Agile Software Development traction. If you check out the Agile Manifesto site and search for any of those people, you will find a treasure trove of software development information.

    My 2 Cents

    First off, I agree with a few people out there. Agile is not Scrum, for instance. Do NOT get these things confused when checking out Agile, or pushing forward with Scrum. As David Starr points out in his blog entry, "About 35 minutes into this discussion, I realized I hadn't heard a question or comment that wasn't related to Scrum. I asked the room, 'How many people are on an agile team that is NOT using Scrum?' 5 hands. Seriously, out of about 150 people or so. 5 hands." So know, as this is one of my biggest pet peeves these days, that Scrum is not Agile. Another quote David writes: "I assure you, dear reader, 2 week time boxes does not an agile team make." This is the exact problem. Take a look at the actual manifesto above. The first ideal: "Individuals and interactions over processes and tools". There are a couple of meanings in this ideal, just as there are in the other written ideals. But this one has a lot of contention with a set practice such as Scrum. There are other formulas – XP (eXtreme) and Kanban are two that come to mind often. But none of these are Agile; each is instead a process based on the ideals of Agile.

    Some of you may be thinking, "that's the same thing". Well, no, it is not. This type of differentiation is vitally important. Agile is a set of ideals. Processes are nice, but they can change; they may work for some and not others. The Agile Manifesto covers the ideals behind what is intended, that intention being to learn and find new ways to build better software. Ideals, not processes. Definition versus implementation. Class versus object. The ideals are of utmost importance, the processes are secondary; the first ideal is what really lays this out for me: "Individuals and interactions over processes and tools". Yes, we need tools, but we need the individuals and their interactions more.

    For those coming into a development team, I hope you take this to mind. It is of utmost importance that this differentiation is known and fought for. The second the process becomes more important than the individuals and interactions, the team will effectively lose the advantages of the Agile ideals.

    This is just one of my first thoughts on the topic of Agile. I will be writing more in the near future about each of the ideals. I will make a point to outline more of my thoughts, my opinions, and experience with the ideals of Agile and the various processes that are out there. Maybe I may stumble upon something new with the help of my readers? It would be a grand overture to the ideals I hold. For the original entry, check out my personal blog with other juicy tech tidbits, rants, raves, and the like. Agilist Mercenary

    Read the article

  • The enterprise vendor con - connecting SSDs using SATA 2 (3Gbits), thus limiting their performance

    - by tonyrogerson
    When comparing SSD against hard drive performance it really makes me cross when folk think comparing an array of SSDs running on 3Gbits/sec to hard drives running on 6Gbits/sec is somehow valid. In a paper from DELL (http://www.dell.com/downloads/global/products/pvaul/en/PowerEdge-PowerVaultH800-CacheCade-final.pdf) on increasing database performance using the DELL PERC H800 with Solid State Drives, they compare four SSDs connected at 3Gbits/sec against ten 10Krpm drives connected at 6Gbits/sec [Tony slaps forehead while shouting DOH!].

    It is true that in the case of hard drives it probably doesn’t make much difference whether it is 3Gbit or 6Gbit, because SAS and SATA are both end-to-end protocols rather than a shared bus architecture like SCSI, so the hard drive doesn’t share bandwidth and probably can’t get near the 600MiBytes/second throughput that 6Gbit gives unless you are doing contiguous reads. In my own tests on a single 15Krpm SAS disk using IOMeter (8 worker threads, queue depth of 16, a stripe size of 64KiB, an 8KiB transfer size, on a drive formatted with an allocation size of 8KiB, for a 100% sequential read test) I only get 347MiBytes per second sustained throughput at an average latency of 2.87ms per IO, equating to 44.5K IOps; ok, if that was 3Gbits it would be less – around 280MiBytes per second. Oh, but wait a minute [...fingers tap desk]: you’ll struggle to find in the commodity space an SSD that doesn’t have the SATA 3 (6Gbits) interface. SSDs are fast – not only low latency and high IOps, they also offer a very large sustained transfer rate. Consider the OCZ Agility 3: it so happens that in my masters dissertation I did the same test, but on a different box, and got 374MiBytes per second at an average latency of 2.67ms per IO, equating to 47.9K IOps. The cost of a 240GB Agility 3 is £174.24 (http://www.scan.co.uk/products/240gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-500mb-s-85k-iops), but that same drive set in a box connected with SATA 2 (3Gbits) would only yield around 280MiBytes per second, thus losing almost 100MiBytes per second of throughput, and a ton of IOps too.

    So why the hell are “enterprise” vendors still only connecting SSDs at 3Gbits? Well, my conspiracy theory states that they have no interest in you moving to SSD because they’ll lose so much money. The argument that they use SATA 2 doesn’t wash; SATA 3 has been out for some time now and all the commodity stuff you buy uses it. Consider the cost not in terms of price per GB but price per IOps: SSDs absolutely thrash hard drives on that. It used to be true that the opposite also held – that hard drives thrashed SSDs on price per GB – but is that true now? I’m not so sure. A 300GByte 2.5” 15Krpm SAS drive costs £329.76 ex VAT (http://www.scan.co.uk/products/300gb-seagate-st9300653ss-savvio-15k3-25-hdd-sas-6gb-s-15000rpm-64mb-cache-27ms), which equates to £1.09 per GB, compared to a 480GB OCZ Agility 3 costing £422.10 ex VAT (http://www.scan.co.uk/products/480gb-ocz-agility-3-ssd-25-sata-6gb-s-sandforce-2281-read-525mb-s-write-410mb-s-30k-iops), which equates to £0.88 per GB. Ok, I compared an “enterprise” hard drive with a “commodity” SSD, and things get a little more complicated here: most “enterprise” SSDs are SLC and most commodity ones are MLC; SLC gives more performance and wear. I’ll talk about that another day.

    For now though, don’t get sucked in by vendor marketing: SATA 2 (3Gbit) just doesn’t cut it; SSDs need 6Gbit to breathe, and even that SSDs are pushing. Alas, SSDs are connected using SATA, so all the controllers I’ve seen thus far from HP and DELL only do SATA 2 – deliberate? Well, I’ll let you decide on that one.
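
    As a sanity check on the IOps arithmetic quoted above (my numbers, nothing from the DELL paper): at an 8KiB transfer size,

        IOps ≈ throughput / transfer size = (347 × 1024 KiB/s) / 8 KiB ≈ 44,400

    which matches the quoted 44.5K IOps after rounding, and 374 × 1024 / 8 ≈ 47,900 matches the 47.9K figure for the Agility 3.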

    Read the article

  • Installing 12.04 Ubuntu Studio on VMware Workstation 7, won't install VMware Tools

    - by Chase Kelley
    I'm attempting to install Ubuntu Studio 12.04 on my laptop using VMware Workstation 7.1.5, and I've encountered a problem. The install goes well until the installation of Ubuntu has completed and the installation of VMware Tools starts; after that it just stops. I have waited about an hour and a half and nothing has changed. The installation uses VMware Easy Install, and I am running Windows Vista 32-bit with 3 GB of system RAM and 2 GB of RAM on the virtual machine. Any help is greatly appreciated, thank you!

    Read the article

  • Black login screen after installing updates (12.04)

    - by general_guts
    I love my Linux 12.04... till yesterday, when it installed new updates and I clicked restart. The system went to GRUB and loaded as normal, but then, instead of my pretty desktop (auto-login turned on), I have a black screen asking for login and password. Why? How do I get my desktop back to how it was before? Please help a new noob! This is due to the updates, I am sure; nothing in hardware has changed and no other display settings changed. I am using the AMD driver from their site (the Linux one for my Radeon 6850) and had the Catalyst driver working fine. I have tried typing in weird commands like sudo start lightdm and sudo startx, but they didn't do much – didn't do anything, just froze. I don't exactly know what these commands do, but they have something to do with the black screen, so I thought I'd try them. Thanks

    Read the article

  • How to change metamode in xorg.conf so that my game in FHD will be displayed only on the external monitor?

    - by Patryk
    I would like to launch my game only on the external monitor which I have attached to my laptop with an HDMI cable. This is my current xorg.conf:

        # nvidia-xconfig: X configuration file generated by nvidia-xconfig
        # nvidia-xconfig: version 319.60 (buildmeister@swio-display-x64-rhel04-15) Wed Sep 25 15:17:31 PDT 2013

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0"
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Unknown"
            HorizSync 28.0 - 33.0
            VertRefresh 43.0 - 72.0
            Option "DPMS"
        EndSection

        Section "Device"
            Identifier "Device0"
            Driver "nvidia"
            VendorName "NVIDIA Corporation"
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device "Device0"
            Monitor "Monitor0"
            DefaultDepth 24
            Option "metamodes" "1024x786,NULL;1280x720,NULL;NULL,1680x1050;NULL,1920x1080"
            SubSection "Display"
                Depth 24
            EndSubSection
        EndSection

    With this config nothing changes: the game is displayed on the laptop screen, exceeding a little onto the external monitor since I set it to 1920x1080. I have read https://help.ubuntu.com/community/XineramaHowTo but had no luck solving this issue. The only temporary workaround for now is to manually switch off the laptop display and then launch the game.
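
    A hedged sketch of switching to the external-only layout at game time rather than by editing xorg.conf (the output names below are assumptions; list the real ones with xrandr -q):

        # assumed names: LVDS-0 = laptop panel, HDMI-0 = external monitor
        xrandr --output LVDS-0 --off --output HDMI-0 --mode 1920x1080
        # ...launch the game, then restore the panel afterwards:
        xrandr --output LVDS-0 --auto

    The NULL,1920x1080 entry already present in the metamodes list describes exactly that layout (laptop off, external at 1080p), so selecting it through nvidia-settings should be equivalent, assuming the driver exposes it.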

    Read the article

  • Ubuntu 12.04 Bootloader failed to install

    - by Chris
    Sorry about the excessively long question, but I figured giving more information would be better. I recently bought a new desktop for myself, running Windows 7. It has two hard drives, and I wanted to install Ubuntu on a small partition on the second hard drive. I created 25GB "free space" in Windows and ran a LiveCD install. I wanted to select the install options myself but accidentally selected "Install alongside Windows 7"; it seemed to pick up the free space and installed itself there as I wanted it to. However, I was told that the bootloader installation had failed. I chose to "Cancel installation," leaving my computer unable to boot. I wiped my computer and reinstalled Windows.

    After that, I tried installing Ubuntu through Windows using WUBI, once using files from my LiveCD and once downloading everything again. Both times the install succeeded, but both times when I restarted and tried to load Ubuntu, it gave me an error – wubildr.mbr was corrupt or missing. I checked in Windows – it was indeed present on the C:\ drive.

    I went back to the LiveCD installation, this time going the custom options route. I assigned 16GB to an Ext4 journaling file system and 10GB to a swap file. I got the same bootloader error as before. Being prompted to select a different partition to install the bootloader to, I first tried the partition Ubuntu was installed on. A window came up saying that the install had succeeded, but a second window gave me the same error and choices as before. I went through every single option it gave me, including the Windows partition and the hard drives themselves (dev/sda, dev/sdb). Same result. I then chose to not install a bootloader. Windows still works fine, and I assume Ubuntu has installed but is unbootable.

    Knowing that my computer could potentially brick itself again – and, this time around, with a lot of data to lose and hassle to go through if I mess it up – I really don't want to do anything without some advice. So I'll ask this:

    a) Why did the bootloader fail to install? Can I fix the error and install Ubuntu fresh?
    b) Is there any way to get around the error, install the bootloader, and point it towards an existing installation of Ubuntu?
    c) Is there a quicker and easier solution I might have missed?

    EDIT: Thanks for the tip, AthloX. After testing the LiveCD in VirtualBox with no installation problems, I looked around for some alternate bootloaders but had no success. I attempted another install, which installed the bootloader and Ubuntu just fine but bricked Windows 7. I wiped both hard disks clean, including some "System Reserved" partitions I hadn't noticed before, before re-installing Windows 7 on one hard drive and immediately afterwards installing Ubuntu on the other. Now the computer boots into Windows, but I can pop into the BIOS at startup to boot into Ubuntu via its bootloader, and I'm guessing it'll only take a bit of poking at the BIOS to swap the load order. Many thanks!

    Read the article

  • Set up IIS 7.5 to deny connections outside of LAN for certain folder [migrated]

    - by Darkcat Studios
    I'm setting up a combined website and extranet currently; they both read from the same database on the same server the site is hosted on, the reason being that the website is fed from the data that the staff plug into the extranet interface. It also links into AD for authorising access to the extranet. I have the extranet in a folder within the website folder. What I want to do is only allow the extranet to be accessed from computers within our LAN, but allow the main website to be freely accessible to internet users. I have it set up as a generic web server currently, so anyone can view anything (well, up to the point where the user is asked to log into the extranet, of course!). I have read a lot on this, but nothing I read applies to, or works in, IIS 7.5.
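
    A minimal sketch of the usual IIS 7.5 approach, with assumptions flagged: it uses the built-in IP and Domain Restrictions feature (which must be installed, and the ipSecurity section unlocked for delegation at server level), and 192.168.0.0/255.255.255.0 stands in for whatever your LAN subnet really is. Dropped into a web.config inside the extranet folder, it leaves the parent website untouched:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <security>
              <!-- deny every address not explicitly listed -->
              <ipSecurity allowUnlisted="false">
                <!-- assumption: the LAN is 192.168.0.0/24 - substitute yours -->
                <add ipAddress="192.168.0.0" subnetMask="255.255.255.0" allowed="true" />
                <!-- loopback, for testing on the server itself -->
                <add ipAddress="127.0.0.1" allowed="true" />
              </ipSecurity>
            </security>
          </system.webServer>
        </configuration>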

    Read the article

  • How to control in the vertex shader where a pixel ends up in the render target?

    - by cubrman
    What if I have an arbitrary render target that is smaller than the screen (say it is 1x1 pixel) and I want to make sure in the VertexShaderFunction that all my pixels end up exactly in that 1-pixel region? No matter what I do, they all seem to get culled at some point, though GraphicsDevice.Clear() works OK. Where is the top left corner of the render target, vertex-shader-wise? I tried output.Position = (0,0,0,0) / (0,0,0,1) / (1,1,1,1) / (-0.5,0.5,0,1) – NOTHING works! A fullscreen quad is not an option, because I actually need to process geometry in the shaders to get the results I need.
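
    A hedged recap of the coordinate mapping involved, assuming the XNA/Direct3D 9 pipeline the snippet implies: after the divide by w, visible positions have x and y in [-1, 1], with (-1, 1) at the top-left of the render target, so for a 1x1 target the lone pixel sits around the origin:

        // HLSL vertex shader fragment (a sketch, not a confirmed fix):
        // clip-space center of the target, at mid depth, w = 1
        output.Position = float4(0.0, 0.0, 0.5, 1.0);

    The catch, and a likely reason everything seems to get culled: if every vertex of a triangle lands on the same point, the triangle has zero area and the rasterizer emits no pixels at all. A primitive must still span the pixel's center to be shaded, which is why collapsing all geometry onto one position produces nothing even when Clear() works.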

    Read the article

  • Speeding up procedural texture generation

    - by FalconNL
    Recently I've begun working on a game that takes place in a procedurally generated solar system. After a bit of a learning curve (having neither worked with Scala, OpenGL 2 ES or Libgdx before), I have a basic tech demo going where you spin around a single procedurally textured planet. The problem I'm running into is the performance of the texture generation.

    A quick overview of what I'm doing: a planet is a cube that has been deformed to a sphere. To each side, an n x n (e.g. 256 x 256) texture is applied, and these are bundled in one 8n x n texture that is sent to the fragment shader. The last two spaces are not used; they're only there to make sure the width is a power of 2. The texture is currently generated on the CPU, using the updated 2012 version of the simplex noise algorithm linked to in the paper 'Simplex noise demystified'.

    The scene I'm using to test the algorithm contains two spheres: the planet and the background. Both use a greyscale texture consisting of six octaves of 3D simplex noise, so for example if we choose 128x128 as the texture size there are 128 x 128 x 6 x 2 x 6 = about 1.2 million calls to the noise function. The closest you will get to the planet is about what's shown in the screenshot, and since the game's target resolution is 1280x720 that means I'd prefer to use 512x512 textures. Combine that with the fact that the actual textures will of course be more complicated than basic noise (there will be a day and a night texture, blended in the fragment shader based on sunlight, and a specular mask; I need noise for continents, terrain color variation, clouds, city lights, etc.) and we're looking at something like 512 x 512 x 6 x 3 x 15 = 70 million noise calls for the planet alone. In the final game, there will be activities when traveling between planets, so a wait of 5 or 10 seconds, possibly 20, would be acceptable, since I can calculate the texture in the background while traveling, though obviously the faster the better.

    Getting back to our test scene, performance on my PC isn't too terrible, though still too slow considering the final result is going to be about 60 times worse:

        128x128 : 0.1s
        256x256 : 0.4s
        512x512 : 1.7s

    This is after I moved all performance-critical code to Java, since trying to do so in Scala was a lot worse. Running this on my phone (a Samsung Galaxy S3), however, produces a more problematic result:

        128x128 : 2s
        256x256 : 7s
        512x512 : 29s

    Already far too long, and that's not even factoring in the fact that it'll be minutes instead of seconds in the final version. Clearly something needs to be done. Personally, I see a few potential avenues, though I'm not particularly keen on any of them yet:

    - Don't precalculate the textures, but let the fragment shader calculate everything. Probably not feasible, because at one point I had the background as a fullscreen quad with a pixel shader and I got about 1 fps on my phone.
    - Use the GPU to render the texture once, store it and use the stored texture from then on. Upside: might be faster than doing it on the CPU since the GPU is supposed to be faster at floating point calculations. Downside: effects that cannot (easily) be expressed as functions of simplex noise (e.g. gas planet vortices, moon craters, etc.) are a lot more difficult to code in GLSL than in Scala/Java.
    - Calculate a large amount of noise textures and ship them with the application. I'd like to avoid this if at all possible.
    - Lower the resolution. Buys me a 4x performance gain, which isn't really enough, plus I lose a lot of quality.
    - Find a faster noise algorithm. If anyone has one I'm all ears, but simplex is already supposed to be faster than Perlin.
    - Adopt a pixel art style, allowing for lower resolution textures and fewer noise octaves. While I originally envisioned the game in this style, I've come to prefer the realistic approach.
    - I'm doing something wrong and the performance should already be one or two orders of magnitude better. If this is the case, please let me know.

    If anyone has any suggestions, tips, workarounds, or other comments regarding this problem I'd love to hear them.

    Read the article

  • Ubuntu 12.04 doesn't start after installing with LiveUSB

    - by Kevin Arutyunyan
    I need help! I installed Ubuntu with my live USB, and it said to restart to use the system, so I restarted. Now I'm just stuck at a completely black screen with a blinking _ and I've already been waiting 10 minutes for something to happen, but there's nothing. I don't know what the problem could be, but maybe the message I got during installation that said 'APT couldn't be configured, so no additional apps will be installed' was the problem? I'm still waiting... Fast answers please :) thanks.

    Read the article

  • How do I get a Canon imageClass MF4350d printer working?

    - by Dan
    I have an imageClass MF4350d printer/scanner/fax and I've tried to install the drivers. The printer is recognized in the system settings, but nothing prints. The scanner is working in Simple Scan. I tried following all of the troubleshooting suggestions in this thread with no success. I downloaded the Linux_UFRII_PrinterDriver_V230_uk_EN driver from Canon.

    Installation, 1st attempt: I installed the CNCUPSMF4350ZK.ppd file in the printer settings and moved the pstoufr2cpca file to /usr/lib/cups/filter.

    2nd attempt: I followed forum advice of installing a fake gs-esp to tell the system that "gs-esp" is PROVIDED by the package "fake-gs-esp". I then converted the RPMs:

        sudo apt-get install alien
        sudo alien -k cndrvcups-common-2.20-1.x86_64.rpm
        sudo alien -k cndrvcups-ufr2-uk-2.20-1.x86_64.rpm

    and installed the resulting .deb packages. Since I'm new to Linux, as much detail as possible in your suggestions would be very appreciated. I am still learning how to use the Terminal. Thank you very much!

    Read the article

  • Low level Linux graphics

    - by math4tots
    For educational purposes, I'd like to write an application in a Linux environment that can process keyboard events and draw graphics without huge dependencies like X or SDL. I presume this must be possible, because X and SDL are just programs themselves, so they must rely on other methods inherent to the environment. Is this understanding correct? If so, where might I learn to write such a program? My limited experience tells me that it would involve making calls to the kernel and/or writing to special files; however, I haven't been able to find any tutorials on the matter (I am not even sure what to Google). Also, in case it is relevant, I am running Debian Squeeze in VirtualBox. I used a netinst CD without networking, so there isn't much installed on it currently. I will install gcc, but I am hoping I can get by with nothing more.
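
    The "special files" hunch is right: the kernel exposes the console framebuffer as /dev/fb0 and raw input devices under /dev/input/event*. A minimal sketch of the drawing half, assuming a framebuffer console with a 32-bit pixel format (run it from a text VT, not from under X):

        /* fbfill.c - fill /dev/fb0 with one color; build: gcc fbfill.c -o fbfill */
        #include <fcntl.h>
        #include <linux/fb.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <sys/ioctl.h>
        #include <sys/mman.h>
        #include <unistd.h>

        int main(void)
        {
            int fd = open("/dev/fb0", O_RDWR);        /* the framebuffer special file */
            if (fd < 0) { perror("open /dev/fb0"); return 1; }

            struct fb_var_screeninfo v;               /* ask the kernel for the mode */
            if (ioctl(fd, FBIOGET_VSCREENINFO, &v) < 0) { perror("ioctl"); return 1; }

            struct fb_fix_screeninfo f;               /* line_length = bytes per row */
            if (ioctl(fd, FBIOGET_FSCREENINFO, &f) < 0) { perror("ioctl"); return 1; }

            size_t len = (size_t)f.line_length * v.yres;
            uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
            if (fb == MAP_FAILED) { perror("mmap"); return 1; }

            /* assumes 32 bits per pixel; check v.bits_per_pixel before trusting this */
            for (uint32_t y = 0; y < v.yres; y++) {
                uint32_t *row = (uint32_t *)(fb + (size_t)y * f.line_length);
                for (uint32_t x = 0; x < v.xres; x++)
                    row[x] = 0x00336699;              /* XRGB: a solid blue-grey */
            }

            sleep(3);                                 /* admire the result, then exit */
            munmap(fb, len);
            close(fd);
            return 0;
        }

    Keyboard events work the same way: read struct input_event records from the appropriate /dev/input/eventN (see linux/input.h). These interfaces are what console programs like SDL's fbdev backend sit on, so searching for "Linux framebuffer tutorial" and "evdev" should turn up material.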

    Read the article

  • Problem with Python3 picking Python2 package

    - by zetah
    I installed the python3-numpy package, but trying to import it in the Python 3 interpreter I get this:

        $ python3
        Python 3.2.3 (default, May 3 2012, 15:54:42)
        [GCC 4.6.3] on linux2
        Type "help", "copyright", "credits" or "license" for more information.
        >>> import numpy
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
          File "/home/zetah/.local/lib/python2.7/site-packages/numpy/__init__.py", line 128, in <module>
            from version import git_revision as __git_revision__
        ImportError: No module named version
        >>>

    Looking in Synaptic I see python3-numpy is installed in /usr/lib/python3/dist-packages/numpy/. Why is it picking the wrong package, and what can I do to remedy this?

    Update: OK, in my ~/.profile I have this line:

        PYTHONPATH=$PYTHONPATH:$HOME/.local/lib/python2.7/site-packages

    but if I remove this line then my Python 2.7 local packages (which I built from source) won't work.

    Update 2: Everything seems to work perfectly without $PYTHONPATH. I guess it was in my .profile for nothing. Please close this question.
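
    What the two updates add up to, as a hedged explanation: PYTHONPATH is honored by every Python interpreter, so that .profile line pushed the Python 2.7 site-packages directory onto python3's sys.path too, where the Python 2 numpy shadowed /usr/lib/python3/dist-packages/numpy and died on a Python-2-only import. The line was also redundant for Python 2.7 itself, because per-user site-packages (~/.local/lib/python2.7/site-packages) is on sys.path by default (PEP 370). A quick way to spot this kind of shadowing:

        $ python3 -c "import sys; print([p for p in sys.path if 'python2.7' in p])"
        # anything printed here is a Python 2 directory leaking into Python 3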

    Read the article

  • Is the HL7 membership model normal?

    - by Peter Turner
    To me, it's a little odd that HL7 requires you to be a member in order to distribute the standard within your organization – and, in that sense, to implement the standard and tell others who have implemented it what parts you'll be implementing – especially when it's nothing classier than a few pipes and carets for 2.x and some sort of XML for 3.0. I can understand paying money to use a library that utilizes HL7, or even for the source code to build that library. But what's the point of requiring membership just to see the spec, to write the source code, to build the library, to utilize HL7?

    Read the article

  • Time for the yearly Microsoft Pilgrimage known as the MVP Summit

    - by drsql
    One of the most fun events of the year is the MVP Summit, if perhaps for no other reason than that my preparation for it is packing a suitcase. No presentations to write or prepare for – nothing. No activities that require me to do anything more than show up with my 2 shovels: one for the amazing amounts of knowledge that will be flowing from the Microsoft folks to us, and the other – actually more of a fork – to get all of the great food they serve us in with. The whole event is a lot like any other conference,...(read more)

    Read the article
