Search Results

Search found 28957 results on 1159 pages for 'single instance'.


  • Rendering multiple squares fast?

    - by Sam
    So I'm taking my first steps with OpenGL development on Android and I'm stuck on some serious performance issues... What I'm trying to do is render a whole grid of single-colored squares onto the screen, and I'm getting framerates of ~7 FPS. The squares are 9 px in size right now with a one-pixel border in between, so I get a few thousand of them. I have a class "Square", and the Renderer iterates over all Squares every frame and calls the draw() method of each (the iteration itself is fast enough; with no OpenGL code the whole thing runs smoothly at 60 FPS). Right now the draw() method looks like this:
        // Prepare the square coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
        // Set color for drawing the square
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // Draw the square
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
    So it's actually only three OpenGL calls. Everything else (loading shaders, filling buffers, getting the appropriate handles, etc.) is done in the constructor, and things like the program and the handles are static attributes. What am I missing here; why is it rendering so slow? I've also tried loading the buffer data into VBOs, but that was actually slower... maybe I did something wrong there, though. Any help greatly appreciated! :)
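
    One likely culprit here is the number of draw calls rather than the calls themselves: issuing a few thousand glDrawElements calls per frame, each with its own attribute and uniform setup, overwhelms the driver long before the GPU runs out of fill rate. Below is a minimal batching sketch (an illustration, not the article's answer): it assumes the same linked program and handle setup as in the question, packs every square's two triangles into one FloatBuffer, and draws the whole grid with a single call. COORDS_PER_VERTEX is set to 2 here for brevity.

        // SquareBatch: all squares in one buffer, one draw call per frame.
        import java.nio.ByteBuffer;
        import java.nio.ByteOrder;
        import java.nio.FloatBuffer;
        import android.opengl.GLES20;

        public class SquareBatch {
            private static final int COORDS_PER_VERTEX = 2;
            private final FloatBuffer vertexBuffer;
            private final int vertexCount;

            // allSquareTriangles holds 6 vertices (2 triangles) per square,
            // already positioned by the caller.
            public SquareBatch(float[] allSquareTriangles) {
                vertexCount = allSquareTriangles.length / COORDS_PER_VERTEX;
                vertexBuffer = ByteBuffer.allocateDirect(allSquareTriangles.length * 4)
                        .order(ByteOrder.nativeOrder())
                        .asFloatBuffer();
                vertexBuffer.put(allSquareTriangles).position(0);
            }

            public void draw(int positionHandle, int colorHandle, float[] color) {
                GLES20.glEnableVertexAttribArray(positionHandle);
                GLES20.glVertexAttribPointer(positionHandle, COORDS_PER_VERTEX,
                        GLES20.GL_FLOAT, false, 0, vertexBuffer);
                GLES20.glUniform4fv(colorHandle, 1, color, 0);
                // One draw call for every square in the grid.
                GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
                GLES20.glDisableVertexAttribArray(positionHandle);
            }
        }

    If the squares need individual colors, the single uniform would be replaced by a per-vertex color attribute packed into the same (or a second) buffer; the point is that the number of GL calls per frame stays constant no matter how many squares there are.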

    Read the article

  • Moviebarcodes Showcases Entire Movies as Frame-based Barcodes

    - by Jason Fitzpatrick
    If you’ve ever wanted to take in an entire movie at a single glance, here’s your chance. Moviebarcodes shares mock barcodes generated by turning each frame of a movie into a thin stripe, offering a glimpse into the color choices and shot lengths in popular movies. The barcode seen above was generated from The Matrix; you can see where the green indicates scenes that were shot inside the Matrix and thus given a subtle green tint. In the barcode below, generated from the movie Pleasantville, you can see the transition between the color and black-and-white scenes. In the case of Pleasantville, elements of the black-and-white world turning to color represent pivotal moments in the plot, which are now neatly mapped out below. Check out the hundreds of barcodes at the link below; you can even order prints of your favorite movies. Find a great rendering in the mix? Share a link in the comments below. Moviebarcodes [via Cool Infographics]

    Read the article

  • Remote I/O costs with a Content Delivery Network

    - by x711Li
    As far as I know, the time complexity of scanning a directory is correlated with the number of files in that directory, due to I/O costs. Would the administrative cost of placing the files in a hashed directory tree for uploading/downloading through a CDN API be worth it for the added efficiency? For instance, given the filename foo.mp3, the MD5 hash is 10ebb1120767e9de166e0f5905077cb1. Storing foo.mp3 as ./10/eb/foo.mp3 would allow for fewer files per directory (since MD5 output is hexadecimal, this gives 16^2 possible root directories with 16^2 subdirectories each, and little chance of collision in the layout). Considering the directories themselves are not loaded, would the I/O costs of directory scanning still exist with direct uploading/downloading?
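
    For concreteness, here is a small hypothetical sketch (plain Java, illustrative names) of the sharding scheme described above: hash the filename, then use the first two pairs of hex characters as two directory levels, so each level fans out into at most 256 entries.

        import java.nio.charset.StandardCharsets;
        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public class ShardedPath {
            // Returns a path of the form ab/cd/<fileName>, where abcd are the
            // first four hex digits of the MD5 of the file name.
            public static String shard(String fileName) throws NoSuchAlgorithmException {
                MessageDigest md5 = MessageDigest.getInstance("MD5");
                byte[] digest = md5.digest(fileName.getBytes(StandardCharsets.UTF_8));
                StringBuilder hex = new StringBuilder();
                for (byte b : digest) {
                    hex.append(String.format("%02x", b));
                }
                return hex.substring(0, 2) + "/" + hex.substring(2, 4) + "/" + fileName;
            }

            public static void main(String[] args) throws NoSuchAlgorithmException {
                // With the hash quoted in the question, foo.mp3 would land under 10/eb/.
                System.out.println(shard("foo.mp3"));
            }
        }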

    Read the article

  • How do I develop database-utilizing application in an agile/test-driven-development way?

    - by user39019
    I want to add databases (traditional client/server RDBMSs like MySQL/PostgreSQL, as opposed to NoSQL or embedded databases) to my toolbox as a developer. I've been using SQLite for simpler projects with only one client, but now I want to do more complicated things (i.e., db-backed web development). I usually like following agile and/or test-driven-development principles. I generally code in Perl or Python. Questions:
    1. How do I test my code so that each run of the test suite starts from a 'pristine' state? Do I run a separate instance of the database server for every test? Do I use a temporary database?
    2. How do I design my tables/schema so that they are flexible with respect to changing requirements? Do I start with an ORM for my language, or do I stick to hand-written SQL? One thing I don't find appealing is having to change more than one thing (say, the CREATE TABLE statement and the associated CRUD statements) for a single change, because that's error-prone. On the other hand, I expect ORMs to be a lot slower and harder to debug than raw SQL.
    3. What is the general strategy for migrating data between one version of the program and a newer one? Do I carefully write ALTER TABLE statements between each version, or do I dump the data and import it fresh into the new version?
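
    As one possible answer to the "pristine state" question, here is a hedged sketch of the per-test throwaway database pattern, written in Java/JUnit and assuming the H2 in-memory driver is on the classpath (the same pattern applies in Python with sqlite3 or a temporary PostgreSQL schema): every test gets a freshly created schema, and nothing survives between tests.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        import org.junit.After;
        import org.junit.Before;
        import org.junit.Test;
        import static org.junit.Assert.assertTrue;

        public class UserStoreTest {
            private Connection db;

            @Before
            public void freshDatabase() throws Exception {
                // A brand-new in-memory database per test: pristine by construction.
                db = DriverManager.getConnection("jdbc:h2:mem:test");
                try (Statement s = db.createStatement()) {
                    s.execute("CREATE TABLE users (id INT PRIMARY KEY, name VARCHAR(100))");
                }
            }

            @After
            public void tearDown() throws Exception {
                db.close(); // the in-memory database is discarded here
            }

            @Test
            public void insertsAndReadsBackAUser() throws Exception {
                try (Statement s = db.createStatement()) {
                    s.execute("INSERT INTO users VALUES (1, 'alice')");
                    ResultSet rs = s.executeQuery("SELECT name FROM users WHERE id = 1");
                    assertTrue(rs.next() && "alice".equals(rs.getString(1)));
                }
            }
        }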

    Read the article

  • getting started as a web developer [closed]

    - by kmote
    I have over 10 years of programming experience building (Windows-based) desktop applications and utilities (VC++, C#, Python). My goal over the next year is to start transitioning to web application development. I want to teach myself the fundamental tools and technologies considered essential for building professional, online, interactive, visually stunning, data-driven web apps -- the kind described in Google's recently released "Field Guide: Building Great Web Applications". So my question is: what are the primary, most commonly used technologies that seasoned professionals will need in their tool belt in the coming years? My plan was to come up to speed on JavaScript, HTML5, and CSS, and then to do a deep dive into ASP.NET and Ajax, along with SQL databases. (I was surprised not to be able to find a single book at Amazon with a broad, general scope like this, which caused me to start second-guessing this approach.) So, seasoned professionals: am I on the right track? Are there glaring omissions in my list, or unnecessary inclusions? I would welcome any book suggestions along these lines as well.

    Read the article

  • WordPress 3 multi-site install

    - by mike
    Hello, trying to figure out if this is possible... My company has a CMS product that was written in Java, and we decided to use WordPress to run blogs for our clients. Obviously, WordPress does not run on Tomcat (at least not by default), so we installed Pound (http://www.apsis.ch/pound/) on our server and set up Apache and Tomcat on different ports. When "/blog/" is requested, the request is directed to Apache. This works fine, but we would like to use WordPress multisite so that we can manage all the blogs from a single interface. We would also like the URL for every site to be "/blog/", for example: http://www.site1.com/blog/ and http://www.site2.com/blog/. I'm thinking it would have to be done with Apache? Is it even possible? Thanks!

    Read the article

  • fastcgi-mono-server with Nginx is much slower than xsp4

    - by marxin
    We started testing our MVC4 app on the XSP4 server compiled with Mono 3.0.3; the speed was good enough, and we decided to set up a production fastcgi-mono-server4 (version 2.11.0.0) with Nginx (1.2.6-r1). A single query that loads some JSON took ~200 ms on XSP4, but through Nginx the same query takes about 1.2 s, and I am wondering where such a slowdown could come from. I followed the Nginx configuration at http://www.mono-project.com/FastCGI_Nginx, and fastcgi-mono-server4 listens on a socket for Nginx. Do you have any ideas on how to log some timestamps that would help me track this down? Thanks

    Read the article

  • Offshoring: does it ever work?

    - by DanSingerman
    I know there has been a fair amount of discussion on here about outsourcing/offshoring, and the general opinion seems to be that at best it is difficult, and at worst it fails. I have direct experience of offshoring myself; a previous company where I was a dev manager wanted to send some development offshore, and we ran a pilot scheme to see how well it would work. Of course it was a complete failure, although it is not completely clear to me whether this was down to the offshore devs being less talented, the process, or other factors (no doubt it was really a combination). I can see as a business how offshoring looks attractive (much lower day rate), but as far as I can see, the only way it could possibly work is if you do exceptionally detailed design up front, with incredibly detailed specifications; and by the time you have invested in producing that, you have probably spent nearly as much as if you had written the actual code locally (which I think is an instance of No Silver Bullet). So, what I want to know is: does anyone here have any experience of offshoring actually working? Especially, are there any success stories of it working in a semi-agile way? I know there are developers here from all over the world; has anyone worked on an offshore project they consider successful?

    Read the article

  • Managing PHP processes on Windows 7 (with WAMP)

    - by Andrea
    Is there a way to manage (especially list and kill) long-running PHP processes on a Windows 7 system set up with WAMP? Every once in a while, I'll accidentally throw an infinite loop into a PHP process and want to kill it. Right now, all I can think to do is to restart all my WAMP services but sometimes the PHP processes manage to survive right through the restart, i.e., I still see them outputting to logs even after WAMP's restarted. And if the process isn't logging, then I have no way at all to know when/if it's been killed. Not to mention, this will wipe out everything I'm doing with WAMP, not just a single process. I don't seem to see anything relevant in the Windows Task Manager, but maybe I'm missing something.

    Read the article

  • New SPC2 benchmark - The 7420 KILLS it!!!

    - by user12620172
    This is pretty sweet. The new SPC2 benchmark came out last week, and the 7420 not only came in 2nd of ALL speed scores, but came in #1 for price per MBPS. Check out this table. The 7420 score of 10,704 makes it really fast, but that's not the best part. The price one would have to pay in order to beat it is ridiculous. You can go see for yourself at http://www.storageperformance.org/results/benchmark_results_spc2. The only system on the whole page that beats it was over twice the price per MBPS. Very sweet for Oracle. So let's see: the 7420 is the fastest per dollar. The 7420 is the cheapest per MBPS. The 7420 has incredible built-in features, management services, analytics, and protocols. It's extremely stable and, as a cluster, has no single point of failure. It won the Storage Magazine award for best NAS system this year. So how long will it be before it's the number 1 NAS system in the market? What are the biggest hurdles still stopping the widespread adoption of the ZFSSA? From what I see, it's three things:
    1. Administrators' comfort level with older legacy systems.
    2. Politics.
    3. Past issues with Oracle Support.
    I see all of these issues crop up regularly. Number 1 just takes time and education. Number 3 takes time with our new, better, and growing support team; many of them came from Oracle, and there were growing pains when they went from a straight software model to also having to support hardware. Number 2 is tricky, but it's the job of the sales teams to break through the internal politics and help their clients see the value in Oracle hardware systems. Benchmarks like this will help.

    Read the article

  • Why are there so few Wireless N Dual Band adapter PCI cards, only USB adapters instead?

    - by daiphoenix
    There have been several Wireless N dual-band routers/APs on the market for quite some time now, and there are several Wireless N dual-band USB adapters out there. But as for PCI/PCI-X card adapters, there seems to be only one (the Linksys WMP600N). Why is that? I find it very strange. Is it because USB adapters are easier to install and can be used on multiple computers? But if so, why isn't the same true of single-band (2.4 GHz) Wireless N adapters? For those there are as many PCI card adapters as there are USB adapters. Also, can the USB adapters, despite the lack of external antennas, offer the same level of performance as a card with external antennas?

    Read the article

  • How do I put back different SCSI hard drives into their original RAID arrays across different servers?

    - by Edgar
    I potentially have a big mess on my hands: I received today a box with several hard drives that used to be connected to different servers, each of them using an unknown (at least as of right now) RAID configuration. Regrettably, they are not marked, and I'm not sure how to go about putting them back into their original servers. Currently I don't have much more information: I don't know what type of array was being used in each instance, and I don't have any specifics about the RAID controller originally used in each of the servers (currently these servers are at a remote location with no easy access). Is there a way to sort through this mess? What would be the consequences of using trial and error to go about it? This might be a very basic question, but I don't have much experience dealing with RAID arrays.

    Read the article

  • How far do I take Composition?

    - by whiterook6
    (Although I'm sure this is a common problem, I really don't know what to search for. Composition is the only thing I could come up with.) I've read over and over that multiple inheritance and subclassing are really, really bad, especially for game entities. If I have three types of motion, five types of guns, and three types of armoring, I don't want to have to make 45 different classes to get all the possible combinations; I'm going to add a motion behavior, a gun behavior, and an armor behavior to a single generic object. That makes sense. But how far do I take this? I can have as many different types of behaviors as I can imagine: DamageBehavior, MotionBehavior, TargetableBehavior, etc. If I add a new class of behaviors, then I need to update all the other classes that use them. But what happens when I have functionality that doesn't really fit into one class of behaviors? For example, my armor needs to be damageable but also updateable. And should I be able to use more than one type of behavior on an entity at a time, such as two motion behaviors? Can anyone offer any wisdom or point me in the direction of some useful articles? Thanks!
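
    One common shape for this, sketched below in Java with illustrative names: instead of one behavior slot per category, the entity holds a flat list of components and forwards each kind of event to whichever components implement the matching interface. That lets armor be both damageable and updatable, and lets an entity carry two motion behaviors at once.

        import java.util.ArrayList;
        import java.util.List;

        interface Updatable { void update(float dt); }
        interface Damageable { void takeDamage(int amount); }

        class Entity {
            private final List<Object> behaviors = new ArrayList<>();

            void add(Object behavior) { behaviors.add(behavior); }

            void update(float dt) {
                for (Object b : behaviors) {
                    if (b instanceof Updatable) ((Updatable) b).update(dt);
                }
            }

            void damage(int amount) {
                for (Object b : behaviors) {
                    if (b instanceof Damageable) ((Damageable) b).takeDamage(amount);
                }
            }
        }

        // Armor does not have to fit a single behavior class: it reacts to both
        // update and damage events, and nothing stops an Entity from holding
        // two different motion components at the same time.
        class ArmorBehavior implements Updatable, Damageable {
            private int hitPoints = 100;
            public void update(float dt) { /* e.g. slowly regenerate */ }
            public void takeDamage(int amount) { hitPoints -= amount; }
        }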

    Read the article

  • Random compositing lag

    - by user1020567
    My laptop specs: 512 MB of RAM, of which 64 MB are shared with an integrated GPU (ATI Radeon Xpress 200M), and an Intel 1.6 GHz Celeron M single-core processor. I've spent months trying to figure out why compositing and effects sometimes lag on any distro I try. Now I've come to realise that no matter what drivers I try (the default ones work for me on pretty much any Linux), the compositing lag is random. When I used Ubuntu 10.10, for example, sometimes window compositing would lag and sometimes it wouldn't. The PC is able to render those effects, so hardware is not the problem. It's completely random and unpredictable: sometimes when I turn on the computer the effects lag horribly, and sometimes everything is completely smooth. I've also checked the startup items, and there don't seem to be any unnecessary entries. I also tried building my own OS with Arch Linux, and the problem persists there, so I can only assume it's a driver issue of some sort. By default there are lots of drivers supplied with Linux distributions. Could it be that they're in the way? The ones that I need are ati/radeon (or both? What's the difference between them?) and there seem to be a lot of others... What should I do?

    Read the article

  • Moving an object using its velocity on a closed curve

    - by Futaro
    I want an object to follow a path; in the game Peggle there are some pegs that move along a closed path. How can I get the same result? I guess I can use a parametric curve, but I need to use the velocity and not the position (x, y). I use NAPE, and I have this in my game loop:
        // circumference
        angle = angle + 1 * (Math.PI / 180);
        movableBall.position.x = radius * Math.cos(angle) + h;
        movableBall.position.y = radius * Math.sin(angle) + k;
    It works, but I cannot control the velocity; each movableBall must have its own velocity. Besides, from the NAPE docs: "Setting the position of a body is equivalent to simply teleporting the body; for instance moving a kinematic body by position is not the way to go about things." I want to use:
        movableBall.velocity.x = ??
        movableBall.velocity.y = ??
    The final idea is to follow other paths too, like the Lemniscate of Bernoulli. Thanks!
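
    A small sketch of the usual fix (plain Java with illustrative names, not the NAPE API beyond the velocity idea): differentiate the parametric circle, so instead of teleporting by position you assign the derivative as the velocity. Each ball carries its own angular speed omega, which gives per-ball speed control; other closed curves such as the lemniscate work the same way once their parametric form is written down.

        public class CircularMover {
            private double angle;          // current angle in radians
            private final double radius;   // circle radius
            private final double omega;    // angular speed in rad/s, per ball

            public CircularMover(double radius, double omega) {
                this.radius = radius;
                this.omega = omega;
            }

            // Position is x = h + r*cos(angle), y = k + r*sin(angle);
            // its time derivative gives the velocity to hand to the physics body:
            // vx = -r*omega*sin(angle), vy = r*omega*cos(angle).
            public double[] step(double dt) {
                double vx = -radius * omega * Math.sin(angle);
                double vy =  radius * omega * Math.cos(angle);
                angle += omega * dt;
                return new double[] { vx, vy };
            }
        }

    Each frame, the two returned components would be assigned to movableBall.velocity.x and movableBall.velocity.y, so the kinematic body is moved by velocity rather than by position.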

    Read the article

  • Can I use a wildcard to denote subdirectories as opposed to just files in the Windows Command Prompt

    - by Dinosaurus
    I know I can use a wildcard to list the files in a single directory:
        dir *.java
    However, does anyone know if it is possible to denote a subdirectory with a wildcard as well? I would like to do something like
        dir classes/*/*.java
    where it will list all the Java files in every subdirectory beneath the classes directory. So, if there are:
        classes/cs1100/
        classes/cs1200/
        classes/cs1500/
    it will list all the Java files within these. Note that I'm not using this specifically for the "dir" command, but for another command-line tool that accepts a list of files. But if it works for dir, it should work in my other program as well.
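
    For what it's worth, the semantics being asked for (every .java file exactly one directory level below classes, as in classes/*/*.java) can be pinned down with a short sketch; this is plain Java rather than the cmd.exe answer, and the paths are the illustrative ones from the question.

        import java.io.IOException;
        import java.nio.file.DirectoryStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class OneLevelDown {
            public static void main(String[] args) throws IOException {
                Path root = Paths.get("classes");
                // First level: the subdirectories of classes (cs1100, cs1200, ...).
                try (DirectoryStream<Path> subdirs = Files.newDirectoryStream(root, Files::isDirectory)) {
                    for (Path subdir : subdirs) {
                        // Second level: the .java files inside each subdirectory.
                        try (DirectoryStream<Path> javaFiles = Files.newDirectoryStream(subdir, "*.java")) {
                            for (Path file : javaFiles) {
                                System.out.println(file); // e.g. classes/cs1100/Foo.java
                            }
                        }
                    }
                }
            }
        }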

    Read the article

  • Whole continent simulation [on hold]

    - by user2309021
    Let's suppose I am planning to create a simulation of an entire continent at some point in the past (let's say around 0 A.D.). Is it feasible to spawn a hundred million actors that interact with each other and their environment, having them reproduce, extract resources, etc.? The fact is that I actually want to create a simulation that allows me to zoom in from a view of the entire continent down to a single village, and interact with it. (Think of being able to keep zooming in on the campaign map of any Total War game, with the transition to the battle map being seamless rather than a change of "game mode".) By the way, I have never made a game in my entire life (I have programmed normal desktop applications, though), so I am really having trouble wrapping my head around how to implement such a thing. Even while thinking about how to implement a simple population simulator without a graphical interface, the O(n) cost of traversing an array and telling every person to get one year older each time the program ticks feels wasteful. Any kind of help would be greatly appreciated :) EDIT: After being put on hold, I shall specify a question. How would you implement a simulation of all basic human dynamics (reproduction, resource consumption) across an entire continent (with millions of people)?
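
    On the specific worry about the O(n) "make everyone a year older" tick: a common trick is to store a birth year per person and derive age on demand, so advancing the simulation clock costs O(1) instead of touching every actor. A tiny illustrative sketch (hypothetical names, not a full simulation):

        public class Simulation {
            private int currentYear = 0; // e.g. starting at 0 A.D.

            static class Person {
                final int birthYear;
                Person(int birthYear) { this.birthYear = birthYear; }
                int ageIn(int year) { return year - birthYear; }
            }

            // Advancing time no longer iterates over the population at all.
            void advanceOneYear() { currentYear++; }

            int ageOf(Person p) { return p.ageIn(currentYear); }
        }

    The same idea generalizes: state that changes at a predictable rate can be derived from a timestamp instead of being updated on every tick.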

    Read the article

  • String manipulation functions in SQL Server 2000 / 2005

    - by Vipin
    SQL Server provides a range of string manipulation functions. I was aware of most of them in the back of my mind, but when I needed to use one, I had to dig it out either from the SQL Server help file or from Google. So I thought I would list some of the functions that perform common operations in SQL Server. Hope it will be helpful to you all.
    Len('String_Expression') - returns the length of the input String_Expression.
    Example - Select Len('Vipin')  Output - 5
    Left('String_Expression', int_characters) - returns int_characters characters from the left of the String_Expression.
    Example - Select Left('Vipin',3), Right('Vipin',3)  Output - Vip, pin
    LTrim('String_Expression') - removes spaces from the left of the input 'String_Expression'.
    RTrim('String_Expression') - removes spaces from the right of the input 'String_Expression'.
    Note - To remove spaces from both ends of the string_expression, use LTrim and RTrim in conjunction.
    Example - Select LTrim(' Vipin '), RTrim(' Vipin '), LTrim(RTrim(' Vipin '))  Output - 'Vipin ', ' Vipin', 'Vipin' (the single quote marks ' ' are not part of the SQL output; they are included only to show where spaces remain at the ends of the strings.)
    Substring('String_Expression', int_start, int_length) - returns the part of String_Expression starting at position int_start and running for int_length characters.
    Example - Select Substring('Vipin', 2, 3)  Output - ipi
    Right('String_Expression', int_characters) - returns int_characters characters from the right of the String_Expression.

    Read the article

  • Windows 7 Permissions

    - by Scott
    I have an odd problem with a Windows 7 laptop. It's currently a single-user installation, a fresh install on an Asus laptop. I have an svn repo checked out on my second partition, and in it a directory which I have added to the svn:ignore list because it is for temp files. This specific directory shows as read-only, but I need write access to it for my project to function properly. If I right-click and change the directory to not be read-only, applying it recursively, it is immediately reverted back to a read-only directory. I have also changed Apache's service to run as my own user, to no avail. I'm stumped... Any ideas?

    Read the article

  • MySQL returning slow queries with result sets bigger than 30 rows

    - by josephs8
    Whenever I run a query that returns more than 30 rows, the time to get the data goes from less than a second to over 10 seconds. For example, a query returning 29 rows takes 0.1 seconds, while a query returning 31 rows takes 11.2 seconds. I am running MySQL on Windows Server 2008 (dual-core 2.6 GHz, 3 GB of memory). The machine doesn't run anything else; it does have an instance of MS SQL Server running, but that does not get used at all. This only happens via PHP right now; if I run the query manually on the server, it returns in less than a second. The queries are not complicated either. I have included one below:
        SELECT Name, Value FROM `bis_co`.`departments` LIMIT 31
    What would be causing this issue, and how can I correct it? Am I missing a configuration setting in MySQL or something? Thanks
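
    A hedged diagnostic sketch that may help narrow this down (it assumes MySQL Connector/J on the classpath and uses placeholder credentials): time the same query with LIMIT 29 and LIMIT 31 from a plain JDBC client on the web server. If both come back fast, the ten-second jump is happening on the PHP side rather than in MySQL itself.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class LimitTiming {
            public static void main(String[] args) throws Exception {
                try (Connection c = DriverManager.getConnection(
                        "jdbc:mysql://localhost:3306/bis_co", "user", "password")) {
                    for (int limit : new int[] { 29, 31 }) {
                        long start = System.nanoTime();
                        try (Statement s = c.createStatement();
                             ResultSet rs = s.executeQuery(
                                     "SELECT Name, Value FROM departments LIMIT " + limit)) {
                            int rows = 0;
                            while (rs.next()) rows++;
                            long ms = (System.nanoTime() - start) / 1_000_000;
                            System.out.println("LIMIT " + limit + ": " + rows + " rows in " + ms + " ms");
                        }
                    }
                }
            }
        }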

    Read the article

  • Job queueing in Toast Titanium 10?

    - by moonslug
    I have a bunch of .MP4 video files I'm burning to DVD-Video using Toast Titanium 10 on my MacBook Pro. Right now, I'm doing them one at a time. Because my computer is several years old, encoding video for a single DVD takes approximately six hours. I've discovered that it appears I can encode the video directly to a .toast format — however, I have yet to figure out if I can burn these directly to DVD. Also, I have quite a bit of video left to burn, and even that method would require me intervening manually to start a new encoding or burn job every six hours. Would it be possible to somehow queue up multiple DVD-Video encoding jobs at once, and have the computer work through them automatically? The actual writing to DVD disc doesn't take nearly as long, and if I had all my video encoded for me to begin with my job would be a lot quicker. Maybe this can be accomplished with a different piece of software?

    Read the article

  • Unable to Uninstall Exchange 2010 ("Internet Newsgroups" public folder)

    - by helplessITguy
    I am trying to uninstall Exchange 2010 before installing a new instance of Exchange 2010 SP1 on a different server. (Our production Exchange server is 2003.) We have met all of the Mailbox uninstall prerequisites except for the following:
    Error: Uninstall cannot continue. Database 'Public Folder Database 1579722947': The public folder database "Public Folder Database 1579722947" contains folder replicas. Before deleting the public folder database, remove the folders or move the replicas to another public folder database. For detailed instructions about how to remove a public folder database, see http://go.microsoft.com/fwlink/?linkid=81409&clcid=0x409.
    We have been able to delete all public folders in the 2010 storage group except for one (previously replicated) folder, "Internet Newsgroups". How can I delete this folder without impacting public folders on the production Exchange 2003 server? We have:
    - verified permissions to the public folder
    - removed replication for the folder (on the Exch 2010 server)
    - tried PowerShell scripts: RemoveReplicaFromPFRecursive, and Get-PublicFolder -Server "\" -Recurse -ResultSize:Unlimited | Remove-PublicFolder -Server -Recurse -ErrorAction:SilentlyContinue

    Read the article

  • Today in the OTN Lounge (Wednesday October 3, 2012)

    - by Bob Rhubart
    Here's a quick rundown of today's activities in the OTN Lounge:
    OTN Lounge hours today: 8:00 am - 6:00 pm
    9:00 am - 1:00 pm: RAC Attack. Learn about Oracle Real Application Clustering (RAC) in this collaborative event. You'll work with experts from the IOUG RAC SIG to get an Oracle Database 11gR2 RAC cluster running inside a virtual machine. For more information: RAC attack at Oracle Open World (Pythian Blog); RAC Attack - Oracle Cluster Database at Home/Events (WikiBooks)
    4:30 pm - 8:00 pm: Oracle Social Network Developer Challenge Judging. The Oracle Social Network Developer Challenge comes to its conclusion with the final judging on entries and the award of the single prize: $500 in Amazon gift cards. Click here for more information.
    4:30 pm - 5:30 pm: Oracle ADF / Oracle Fusion Middleware Meet-up. Join other Oracle ADF and Oracle Fusion Middleware developers and meet the product managers and engineers behind Oracle ADF, ADF Mobile, and ADF Essentials. Did we mention free beer?
    The OTN Lounge is located in the Howard St. Tent, between 3rd and 4th, directly between Moscone North and Moscone South. Access to the OTN Lounge requires an Oracle OpenWorld or JavaOne conference badge.

    Read the article

  • Choosing the Database Solution for Large Data Application

    - by GµårÐïåñ
    I have been tasked with writing an application in VB.NET that will be a combination of document and inventory management. It will store document images (TIFF, PDF, XPS, TXT, DOC, PPT, and so on) as binary data that can be retrieved for viewing or printing, possibly with OCR to make them searchable, along with metadata such as sender, recipient, type of document, date, source, etc. So the table would probably be something like: DOC_NAME, DOC_DATE, NOTES, ... DOC_BINARY (where the actual document will be put inside). My concern is finding a database solution that will not become unstable due to size restrictions, record limitations, or performance. Some of the options are MS SQL Server, SQL Server Express, SQLite, MySQL, and Access. Now, I can pretty much eliminate Access right off the bat, as it is just too limiting and not scalable. I can further eliminate SQL Server Express because of the 2 GB limit, again for scalability. So that leaves me with MS SQL Server, SQLite, and MySQL (although if anyone has other options they think would be good, please feel free to share them; by no means am I set on only these). So this brings me to what you guys think is the best option for what I have described. The goal is to have the data all in one place (a single file), which will make backup and portability easier. For small-volume usage, pretty much any solution will hold up for a while, but my goal is to think ahead and make sure it's able to withstand heavy, large-volume usage as well. Another consideration is interoperability with .NET and the stability of such code, to avoid errors and memory leaks. Your feedback would be greatly appreciated.
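
    Whichever engine is chosen, the table sketched above maps onto a parameterized insert with the document stored in a binary column. A hedged illustration in plain JDBC (the real application would be VB.NET; the connection string, table, and column names are the illustrative ones from the description):

        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class DocumentStore {
            public static void main(String[] args) throws Exception {
                try (Connection c = DriverManager.getConnection(
                        "jdbc:mysql://localhost:3306/docs", "user", "password");
                     PreparedStatement ps = c.prepareStatement(
                             "INSERT INTO documents (DOC_NAME, DOC_DATE, NOTES, DOC_BINARY) "
                             + "VALUES (?, CURRENT_DATE, ?, ?)")) {
                    ps.setString(1, "contract.pdf");
                    ps.setString(2, "scanned copy");
                    // The whole document goes into the binary column as a byte array.
                    ps.setBytes(3, Files.readAllBytes(Paths.get("contract.pdf")));
                    ps.executeUpdate();
                }
            }
        }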

    Read the article

  • PHP Image gallery that integrates well into custom CMS

    - by Thorarin
    I've been trying to find an image gallery that plays nicely with our custom CMS. I've evaluated a number of them, but none seems to have the feature list that I would like:
    - Runs in a LAMP environment
    - Free software or low license costs (the website belongs to a non-profit organisation)
    - Multi-user support
    - Multiple albums. We're posting concert pictures and would like an album per event.
    - Pluggable authentication system. I want to reuse the accounts we have for our CMS. Permissions can be done inside the gallery itself, but I want a single sign-on solution in a maintainable manner, by writing my own plugin/add-on for the software.
    - Upload support (multiple images at the same time)
    And preferably also:
    - Can be integrated into a PHP page layout without IFRAMEs
    - Automatic resizing of uploaded images to a maximum size
    - Ability for visitors to place comments
    This combination is proving hard to find, especially the authentication requirement. I don't want to mess around all over the place in the source code to make it use the existing authentication. A plugin would be ideal, but alternatively a well-thought-out software design that allows for maintainable, surgical changes would be acceptable. Any suggestions on which software I should take a closer look at?

    Read the article
