Search Results

Search found 13356 results on 535 pages for 'index buffers'.

Page 226 of 535

  • Material System

    - by Towelie
    I'm designing a material/shader system (target API is DX10+ and maybe OpenGL 3+ later; DX10 only for now). I know there have been a lot of topics about this, but I can't find what I need. I don't want to do any kind of compilation or script parsing at runtime. So there is some artist-created material, written in some analogue of Cg. It is compiled to HLSL code and then to the final shader. There are also some hard-coded constant buffers, like:

        cbuffer EveryFrameChanging
        {
            float4x4 matView;
            float time;
            float delta;
        }

    Shaders use shared constant buffers to get their parameters. For each mesh in the scene, I work out what it needs and what it can provide (normals, binormals, etc.) and find the corresponding shader permutation, or generate the missing parts. Also, at build time I calculate the render states and the permutations, plus a hash for each shader that will later be used for sorting (or even a gap-free ID from 0 to ShaderCount). A FinalShader has only one technique and one pass. After that, each mesh is assigned a shader and is ready to render. Some pseudo code:

        SetConstantBuffer(ConstantBuffer::PerFrame);
        foreach (shader in FinalShaders)
        {
            SetConstantBuffer(ConstantBuffer::PerShader, shader);
            SetRenderState(shader);
            foreach (mesh in shader.GetAllMeshes)
            {
                SetConstantBuffer(ConstantBuffer::PerMesh, mesh);
                SetBuffers(mesh);
                Draw();
            }
        }

        class FinalShader
        {
        public:
            UUID            m_ID;
            RenderState     m_RenderState;
            CBufferBindings m_BufferBindings;
        };

    But I have no idea how to create this Cg-like language, and do I really need it?

    Read the article

  • Rendering multiple squares fast?

    - by Sam
    So I'm taking my first steps with OpenGL development on Android and I'm stuck on some serious performance issues... What I'm trying to do is render a whole grid of single-colored squares to the screen, and I'm getting framerates of ~7 FPS. The squares are 9 px in size right now with a one-pixel border in between, so there are a few thousand of them. I have a class "Square", and the renderer iterates over all squares every frame and calls the draw() method of each (the iteration alone is fast enough; with no OpenGL code the whole thing runs smoothly at 60 FPS). Right now the draw() method looks like this:

        // Prepare the square coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
        // Set color for drawing the square
        GLES20.glUniform4fv(mColorHandle, 1, color, 0);
        // Draw the square
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
                GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

    So it's actually only three OpenGL calls. Everything else (loading shaders, filling buffers, getting the appropriate handles, etc.) is done in the constructor, and things like the program and the handles are also static attributes. What am I missing here, why is it rendering so slowly? I've also tried loading the buffer data into VBOs, but that was actually slower... maybe I did something wrong, though. Any help greatly appreciated! :)

    Read the article

  • Correct permissions for /var/www and wordpress

    - by dpbklyn
    Hello and thank you in advance! I am relatively new to Ubuntu, so please excuse the newbie-ness of this question... I have set up a LAMP server (Ubuntu Server 11.10) and I have access via SSH, and to the "It works" page from a web browser, from inside my network (via IP address) and from outside using DynDNS. I have a couple of projects in development with some outside developers and I want to use this server as a development server for testing and for client approvals. We have some WordPress projects that sit in subdirectories in /var/www/wordpress1, /var/www/wordpress2, etc. I cannot access these subdirectories from a browser in order to set up WordPress--or (I assume) to see the content in a browser. I get a 403 Forbidden error in my browser. I assume that this is a permissions problem. Can you please tell me the proper permission settings to: 1) allow the developers and me to read/write; 2) allow WordPress to set itself up and do its thing; 3) allow visitors to access the site(s) via the web. I should also mention that the subfolders are actually symlinks to folders on another internal hdd--I don't think this will make a difference, but I thought I should disclose it. Since I am a newbie to Ubuntu, step-by-step directions are greatly appreciated! Thank you for taking the time! dp

        total 12
        drwxr-xr-x  2 root root 4096 2012-07-12 10:55 .
        drwxr-xr-x 13 root root 4096 2012-07-11 20:02 ..
        lrwxrwxrwx  1 root root   43 2012-07-11 20:45 admin_media -> /root/django_src/django/contrib/admin/media
        -rw-r--r--  1 root root  177 2012-07-11 17:50 index.html
        lrwxrwxrwx  1 root root   14 2012-07-11 20:42 media -> /hdd/web/media
        lrwxrwxrwx  1 root root   18 2012-07-12 10:55 wordpress -> /hdd/web/wordpress

    Here is the result after running chown -R www-data:www-data /var/www:

        total 12
        drwxr-xr-x  2 www-data www-data 4096 2012-07-12 10:55 .
        drwxr-xr-x 13 root     root     4096 2012-07-11 20:02 ..
        lrwxrwxrwx  1 www-data www-data   43 2012-07-11 20:45 admin_media -> /root/django_src/django/contrib/admin/media
        -rw-r--r--  1 www-data www-data  177 2012-07-11 17:50 index.html
        lrwxrwxrwx  1 www-data www-data   14 2012-07-11 20:42 media -> /hdd/web/media
        lrwxrwxrwx  1 www-data www-data   18 2012-07-12 10:55 wordpress -> /hdd/web/wordpress

    I am still unable to access it via a browser...
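    A common starting point, as a sketch rather than a definitive recipe (the paths are the ones shown above; "dp" is assumed to be your login user, and the symlink targets under /hdd must be reachable by the web server as well):

        # give ownership of the web content to your user, with www-data as the group
        sudo chown -R dp:www-data /var/www /hdd/web/wordpress

        # directories: owner+group read/write/traverse, others read/traverse
        sudo find /hdd/web/wordpress -type d -exec chmod 775 {} \;
        # files: owner+group read/write, others read
        sudo find /hdd/web/wordpress -type f -exec chmod 664 {} \;

        # every path component leading to a symlink target must be traversable
        sudo chmod o+x /hdd /hdd/web

    In addition, the vhost (or a .htaccess) for /var/www needs Options FollowSymLinks so Apache will follow the symlinked WordPress folders; if the 403 persists after the ownership and mode changes, that directive is the next thing to check.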

    Read the article

  • Yet Another SQL Strategy for Versioned Data

    There is a popular design for a database that requires a built-in audit-trail of amendments and additions, where data is never deleted, but merely superseded by a later version. Whilst this is conceptually simple, it has always made for complicated SQL for reporting the latest version of data. Alex joins the debate on the best way of doing this with an example using an indexed view and the filtered index.

    Read the article

  • HP 625 with an ATI HD4200 crashes when watching videos on websites

    - by Honza
    I have installed 11.10 on an HP 625 (graphics card ATI/AMD Radeon HD4200). Everything looks normal, but when I watch videos on websites (YouTube, stream.cz, or similar), the whole of Ubuntu crashes. I have installed Catalyst 11.11 from AMD's site (according to this guide: http://wiki.cchtml.com/index.php/Ubuntu_Oneiric_Installation_Guide#Installing_Catalyst_Manually_.28from_AMD.2FATI.27s_site.29), but Ubuntu still crashes. Any idea how to solve this really annoying problem?

    Read the article

  • Is Pidgin bugged? I cannot set it up for Facebook [closed]

    - by Elysium
    Possible Duplicate: Pidgin, how to set up facebook
    I've tried almost all options and used this guide (https://wiki.archlinux.org/index.php/Pidgin#Facebook_XMPP) to set up Pidgin for my Facebook account, but it simply won't work. In the username field I am using my actual username, not the long copied link from Facebook, so that cannot be the issue. Is there anyone out there who has had issues of the same type? Thanks

    Read the article

  • Upgrade from 11.04 to 11.10, getting "W:Failed to fetch gzip..."

    - by Michael Durrant
    Error during update: "A problem occurred during the update. This is usually some sort of network problem, please check your network connection and retry."

        W:Failed to fetch gzip:/var/lib/apt/lists/partial/us.archive.ubuntu.com_ubuntu_dists_oneiric_universe_i18n_Translation-en
          Encountered a section with no Package: header,
        E:Some index files failed to download. They have been ignored, or old ones used instead.

    I can browse web sites normally during this time, though (my network is OK).
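    The usual workaround for this "no Package: header" error on a translation index is to clear the cached package lists so the upgrader re-downloads them; a sketch, assuming the lists are simply corrupted rather than the mirror being broken:

        # remove the cached (and possibly corrupted) index files, then refresh
        sudo rm -f /var/lib/apt/lists/partial/*
        sudo rm -f /var/lib/apt/lists/*Translation*
        sudo apt-get update

    After a clean `apt-get update`, retrying the release upgrade normally picks up the regenerated indexes.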

    Read the article

  • Openvz: What exactly does it mean when tcpsndbuf failcnt increases? Why must there be a minimum difference between limit and barrier?

    - by Antonis Christofides
    When the failcnt of tcpsndbuf increases, what does this mean? Does it mean the system had to go past the barrier, or past the limit? Or, maybe, that the system failed to provide enough buffers, either because it needed to go past the limit, or because it needed to go past the barrier but couldn't because other VMs were using too many resources? I understand the difference between barrier and limit only for disk space, where you can specify a grace period for which the system can exceed the barrier but not the limit. But in resources like tcpsndbuf, which have no such thing as a grace period, what is the meaning of barrier vs. limit? Why must the difference between barrier and limit in tcpsndbuf be at least 2.5 KB times tcpnumsock? I could understand it if, e.g., tcpsndbuf itself had to be at least 2.5 KB times tcpnumsock (either the barrier or the limit), but why should I care about the difference between the barrier and the limit?
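    For reference, the guideline the question cites can be expressed as a quick calculation; a sketch only, with a made-up container ID and sizes chosen purely for illustration:

        # rule of thumb referenced above:
        #   tcpsndbuf.limit - tcpsndbuf.barrier >= 2.5 KB * number of sockets
        # e.g. for 2048 sockets the gap should be at least 2560 * 2048 bytes
        GAP=$((2560 * 2048))            # 5,242,880 bytes
        BARRIER=$((10 * 1024 * 1024))   # 10 MiB
        LIMIT=$((BARRIER + GAP))

        # apply it to container 101 (hypothetical CTID)
        vzctl set 101 --tcpsndbuf ${BARRIER}:${LIMIT} --save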

    Read the article

  • Cannot install Cinnamon 2.xx

    - by user207587
    Fetched 841 kB in 28s (29.3 kB/s)

        W: GPG error: http://packages.mate-desktop.org raring Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 68980A0EA10B4DE8
        W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/raring/main/binary-i386/Packages 404 Not Found
        E: Some index files failed to download. They have been ignored, or old ones used instead.
        userx@bw:~$

    How do I fix that? How do I add a PUBKEY to get the needed files and complete the install of Cinnamon?
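    A sketch of the usual fix for the NO_PUBKEY warning; the key ID is the one reported in the error above, and the 404 on the pmcenery PPA is a separate issue (that PPA appears to publish nothing for raring, so its source line has to be disabled):

        # import the missing MATE repository key reported by apt
        sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 68980A0EA10B4DE8

        # disable the PPA that returns 404 for raring, then refresh the indexes
        sudo add-apt-repository --remove ppa:pmcenery/ppa
        sudo apt-get update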

    Read the article

  • EE vs Computer Science: Effect on Developers' Approaches, Styles?

    - by DarenW
    Are there any systematic differences between software developers (sw engineers, architect, whatever job title) with an electronics or other engineering background, compared to those who entered the profession through computer science? By electronics background, I mean an EE degree, or a self-taught electronics tinkerer, other types of engineers and experimental physicists. I'm wondering if coming into the software-making professions from a strong knowledge of flip flops, tristate buffers, clock edge rise times and so forth, usually leads to a distinct approach to problems, mindsets, or superior skills at certain specialties and lack of skills at others, when compared to the computer science types who are full of concepts like abstract data types, object orientation, database normalization, who speak of "closures" in programming languages - things that make little sense to the soldering iron crowd until they learn enough programming. The real world, I'm sure, offers a wild range of individual exceptions, but for the most part, can you say there are overall differences? Would these have hiring implications e.g. (to make up something) "never hire an electron wrangler to do database design"? Could knowing about any differences help job seekers find something appropriate more effectively? Or provide enlightenment or some practical advice for those who find themselves misfits in a particular job role? (Btw, I've never taken any computer science classes; my impression of exactly what they cover is fuzzy. I'm an electronics/physics/art type, myself.)

    Read the article

  • Failed to upgrade to Ubuntu 11.10

    - by Gigili
    This error prevents the system from upgrading to a newer version of Ubuntu; what is causing it?

        W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/source/Sources 404 Not Found
        W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/binary-amd64/Packages 404 Not Found
        E: Some index files failed to download. They have been ignored, or old ones used instead.

    Since I got the warning message that this release is not supported anymore, should I download and install Ubuntu 12.10 directly from Ubuntu's site instead?
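    A likely explanation, under the assumption that the 404s come from Natty having reached end of life and its repositories having been taken down: the dead source lines need to be disabled or pointed at old-releases before the upgrader can fetch its indexes. A sketch:

        # comment out the extras.ubuntu.com lines that now return 404
        sudo sed -i 's|^deb.*extras\.ubuntu\.com|# &|' /etc/apt/sources.list

        # point the main archives of an end-of-life release at old-releases
        sudo sed -i 's|[a-z.]*archive\.ubuntu\.com|old-releases.ubuntu.com|g; s|security\.ubuntu\.com|old-releases.ubuntu.com|g' /etc/apt/sources.list

        sudo apt-get update && sudo do-release-upgrade

    Note that upgrades normally go one release at a time (11.04 to 11.10, then onward), so if the goal is 12.10 a fresh install from the Ubuntu site may well be the simpler route.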

    Read the article

  • Is the W3 standard a major factor when google decides SERP position?

    - by Anonymous12345
    I have a dynamic PHP website whose index page alone has around 800 errors according to the online W3C validator. I have tried checking major websites like eBay, Stack Overflow and others as well, all with around 400 errors. So my first thought is: what good is that validator when it always displays errors? Secondly, will the errors affect my SERP ranking? That is, will fixing these errors as well as I can increase my Google search position? Thanks

    Read the article

  • Traits of a DBA - Part One – The Technical Side

    What does it take to become a database administrator, and what kinds of traits should you be looking for when hiring a DBA? Those traits can be summarized in two categories: technical and personal. In this article, Greg Larsen discusses the technical traits a DBA should have. Free eBook - Performance Tuning with DMVs: this free eBook provides you with the core techniques and scripts to monitor your query execution, index usage, session and transaction activity, disk IO, and more. Download the free eBook.

    Read the article

  • Image slider not working when website is hosted on remote server [on hold]

    - by Tushar Khatiwada
    I'm having a different problem. I made an HTML website, and it contains a Nivo Slider on the index page. The site works perfectly when viewed locally. I uploaded the site to a remote server, but the slider is not displayed and the photo gallery does not work as expected (it pops up correctly on the local PC). The URL of the site is: http://d138444.u24.elitehostingwizard.com/ The screenshot from the local PC: http://postimg.org/image/lxiqzx7br/ Thanks

    Read the article

  • Will google "forget" unlinked pages?

    - by Mystere Man
    If I remove all links to a page, but do not delete the page from the site (nor block it from being requested), will Google eventually "forget" about it when it reindexes the site? Assuming, of course, there are no other links to the page somewhere else externally. Or will Google continue to request the page, verify it in the index and keep it around as long as it returns a valid page? Is this similar for Bing et al.?

    Read the article

  • How can I make Bash (or Zsh) run a particular command before each entered command?

    - by Peeja
    I'd like to configure Bash to run a particular command before running each command line I enter at the prompt. Specifically, I'd like to tell Vim (which is running in another terminal) to write all open buffers, because in my workflow, if anything is unsaved when I leave Vim it's a mistake. Is there an option for this in Bash? If not, is there an option in Zsh? (There is a readline-based solution that somewhat fits this problem on another question, but it feels a bit hacky. I'll take it as a last resort.)
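    Both shells have a hook that fits this; a minimal sketch, where save_vim_buffers is a hypothetical helper standing in for whatever actually tells the other terminal's Vim to write its buffers:

        # Bash: the DEBUG trap fires just before each simple command runs
        # (so it also fires for each part of a pipeline -- a minor over-trigger)
        trap 'save_vim_buffers' DEBUG

        # Zsh: preexec runs once per accepted command line, before it executes
        preexec() {
            save_vim_buffers
        }

        # hypothetical helper -- assumes Vim was started as a server
        # (e.g. `vim --servername EDIT`) and was built with +clientserver
        save_vim_buffers() {
            vim --servername EDIT --remote-send '<Esc>:wall<CR>' 2>/dev/null
        }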

    Read the article

  • Error -12 hibernation image. Not enough free memory (sometimes)

    - by user99306
    I am having a problem with hibernation in Ubuntu 12.10; it worked fine in 12.04. When I try to hibernate, it sometimes appears to start hibernating, then throws up an error and returns me to the desktop. The error I get is this:

        PM: Creating hibernation image:
        PM: Need to copy 375021 pages
        PM: Normal pages needed: 117957 + 1024, available pages: 110205
        PM: Not enough free memory
        PM: Error -12 creating hibernation image

    Now I understand what the error means, but it doesn't make sense. My swap file is 5 GB and is seldom ever used, as I have 4 GB of RAM. I know it is recommended to have 1.5 times more swap than RAM etc., but space doesn't seem to be the problem, despite the error. For example, I rarely use more than about 25%-30% of my RAM, yet I still get the problem above. Moreover, on a fresh boot and login, with no programs open and using only about 12% of RAM, I can get the above error -- yet at other times I can hibernate while using 25% of my RAM. Also, if I keep trying to hibernate, it eventually succeeds after throwing up the above error four or five times. A successful hibernation looks like this:

        PM: Creating hibernation image:
        PM: Need to copy 295511 pages
        PM: Normal pages needed: 95534 + 1024, available pages: 132627

    Is there some setting that I need to tweak or something I need to do before hibernating to avoid this problem? I guess the question could be better put as: is there some way of safely flushing the buffers and the cache before hibernation? Other than attempting to hibernate several times until it is successful! Thanks in advance.
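    A sketch of the two knobs usually suggested for exactly this situation; both are standard kernel interfaces, but whether they are sufficient in this particular case is an assumption:

        # drop clean page cache, dentries and inodes so fewer pages need copying
        sync
        echo 3 | sudo tee /proc/sys/vm/drop_caches

        # optionally shrink the target hibernation image so the kernel frees
        # more memory itself before snapshotting (value is in bytes; 1 GiB here)
        echo $((1024 * 1024 * 1024)) | sudo tee /sys/power/image_size

        # then hibernate as usual
        sudo pm-hibernate

    Lowering /sys/power/image_size trades a slower resume for a smaller image, which is usually the easier fix when the "available pages" count is only slightly short.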

    Read the article

  • Why a Small Business Website Needs SEO

    Search Engine Optimization makes it easier for search engines, such as Google and Yahoo, to locate, categorize, index and rank web content. It's not uncommon for small business owners to establish a website but not establish a web presence. Put it this way: if your website content is not optimized, it's comparable to having a billboard on a desert island.

    Read the article

  • Difference between key_buffers and recommendation

    - by Typeoneerror
    I'm looking to add a bit of memory to MySQL on a Linode VPS server, on which I've got a small Facebook (canvas app) PHP app using MySQL. I'm not super familiar with MySQL optimization, so I'm hoping to find a simple answer. I think I want to increase the key_buffer size (the default is 16M) to something like 32M to start, but I'm not sure if I need to tweak anything else as well. All I've done so far is increase the query_cache_size to 32M from 16M. There's also key_buffer under [mysqld] and key_buffer under [isamchk]. What is the difference between those two? If I have a Linode 2048 MB (http://www.linode.com) VPS, what would you recommend I set the buffers to? I don't expect this site to have tons of visitors, but I'd like it to be as optimized as possible. It's definitely way heavier on database access than on PHP, and it serves very few HTTP requests.
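    For orientation, a my.cnf sketch along the lines described above; the numbers are illustrative rather than a tuned recommendation. The key buffer under [mysqld] caches MyISAM index blocks for the running server, while the one under [isamchk]/[myisamchk] is only read by the offline table-check/repair utilities, so the two do not compete at runtime:

        [mysqld]
        # index cache for MyISAM tables used by the server itself
        key_buffer_size  = 32M
        query_cache_size = 32M

        [myisamchk]
        # only used when checking/repairing tables from the command line,
        # so it can be set generously without affecting the running server
        key_buffer_size  = 128M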

    Read the article

  • Website that allows twitter like message accessed via tinyurl

    - by blunders
    Looking for a website that allows me to post Twitter-like messages that are accessed via tiny URLs, and that lets me create an account to get reports on the access of each of these messages. Basically, unlike Twitter, each message would require its tiny URL to be accessed, and there would be no central public index of messages; but the author would be able to log in and see both a centralized listing of all the messages and reports on whether the messages had been accessed.

    Read the article

  • Load on Ubuntu 8.04 LTS high

    - by Paddington
    My Ubuntu 8.04 LTS server periodically has a high load average spike (once every 2 days), resulting in Apache timing out; virtually everything, even SSH to the server, becomes impossible. When I am on the console and run top, I see that the load average increases from less than 1 to above 60 in 15 minutes. How can I isolate the cause?

        top - 09:21:51 up 37 days, 20:18,  6 users,  load average: 5.41, 5.53, 5.36
        Tasks: 160 total,   2 running, 156 sleeping,   0 stopped,   2 zombie
        Cpu(s): 65.0%us,  8.8%sy,  0.0%ni,  1.0%id, 24.6%wa,  0.3%hi,  0.3%si,  0.0%st
        Mem:   3989468k total,  3444984k used,   544484k free,   360460k buffers
        Swap: 11687248k total,   178168k used, 11509080k free,   881772k cached
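    One low-tech way to catch an intermittent spike is to leave a small watcher running that snapshots the heaviest processes whenever the 1-minute load crosses a threshold, then read the log after the next incident. A sketch only; the threshold, log path and tools are assumptions, and iostat needs the sysstat package:

        #!/bin/bash
        # log-load-spikes.sh: record what is running whenever the load gets high
        THRESHOLD=10
        LOG=/var/log/load-spikes.log

        while true; do
            load1=$(cut -d ' ' -f1 /proc/loadavg)
            # integer compare on the whole part of the 1-minute load average
            if [ "${load1%.*}" -ge "$THRESHOLD" ]; then
                {
                    echo "==== $(date) load=${load1} ===="
                    top -b -n 1 | head -n 40      # busiest processes at that moment
                    iostat -x 1 2 | tail -n 20    # disk utilisation (the 24.6%wa above hints at I/O wait)
                } >> "$LOG"
            fi
            sleep 30
        done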

    Read the article

  • What's a good host for an active vBulletin site?

    - by Kyle
    I've been switching hosts, using a VPS each time, and I'm just not sure I'm finding the right VPSes. I've used a VPS from burst.net and rubyringtech, and I just feel like it's slowly killing my site because of the slow speed. I really don't know if it's the network or the VPS itself, but I really want to fix this. When I run top on the VPS at peak times it shows this:

        top - 03:18:56 up 16:33,  1 user,  load average: 1.33, 1.40, 1.33
        Tasks:  30 total,   1 running,  29 sleeping,   0 stopped,   0 zombie
        Cpu(s): 27.2%us, 13.6%sy,  0.0%ni, 59.2%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
        Mem:   1048576k total,   679712k used,   368864k free,        0k buffers
        Swap:        0k total,        0k used,        0k free,        0k cached

    And pages take at least a good 2-3 minutes to load. I have only about 50-60 members on the forum. I had a shared hosting account and the forum was lightning fast... Is a VPS a bad idea? :\ What should I do to fix this? I'm running lighttpd with XCache, and the latest MySQL and PHP versions. The server is an Intel i7 2600 with a 1 Gb uplink (I think the 1 Gb uplink is a lie, because I've tested the network and the highest download speed I've seen was 20mb/s from a code.google page). All in all, I've seen people talking about Linode. Should I try them? I honestly don't need a dedicated server yet; it's only 50-70 members online. What should I do? I really want a VPS because I enjoy root access. Does anyone have any suggestions?

    Read the article

  • Linux/Apache performance very slow even on local network

    - by klausch
    I have an Ubuntu server machine running Apache and MySQL. System and version info is as follows:

        Linux kernel 3.0.0-12
        Apache/2.2.20
        MySQL Ver 14.14 Distrib 5.1.58

    I am running a few websites on this server, some HTML only, some PHP/MySQL. The problem is that response time is very slow, on the static as well as the dynamic sites. Sometimes it takes more than 10 seconds before a response is given; this makes the sites very slow and almost unusable. The problem occurs even when requesting from the local network. I have added the involved subdomains to my /etc/hosts file, and above all the problem is not solved by using IP numbers instead of URLs, so there is no DNS lookup issue. I have modified the log format to show response times, and sometimes a file takes 12 seconds to be served -- see the jquery~.js file in the example screenshot. I have no explanation for this extremely long response time, but it is not even the only issue here; some other files take a long time to be served too, yet do not show a long response time in the log file. So probably different issues are involved here. I cannot find a solution so far, any suggestions??? Thanks in advance, Klaas

    link to screenshot picture from access logfile

    Some extra configuration info: apache2.conf (comments removed):

        LockFile ${APACHE_LOCK_DIR}/accept.lock
        PidFile ${APACHE_PID_FILE}
        Timeout 300
        KeepAlive On
        MaxKeepAliveRequests 100
        KeepAliveTimeout 5
        <IfModule mpm_prefork_module>
            StartServers          5
            MinSpareServers       5
            MaxSpareServers      10
            MaxClients          150
            MaxRequestsPerChild   0
        </IfModule>
        <IfModule mpm_worker_module>
            StartServers          2
            MinSpareThreads      25
            MaxSpareThreads      75
            ThreadLimit          64
            ThreadsPerChild      25
            MaxClients          150
            MaxRequestsPerChild   0
        </IfModule>
        <IfModule mpm_event_module>
            StartServers          2
            MinSpareThreads      25
            MaxSpareThreads      75
            ThreadLimit          64
            ThreadsPerChild      25
            MaxClients          150
            MaxRequestsPerChild   0
        </IfModule>
        User ${APACHE_RUN_USER}
        Group ${APACHE_RUN_GROUP}
        AccessFileName .htaccess
        <Files ~ "^\.ht">
            Order allow,deny
            Deny from all
            Satisfy all
        </Files>
        DefaultType text/plain
        HostnameLookups Off
        ErrorLog ${APACHE_LOG_DIR}/error.log
        LogLevel warn
        Include mods-enabled/*.load
        Include mods-enabled/*.conf
        Include httpd.conf
        Include ports.conf
        LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
        LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\" %T/%D" combined
        LogFormat "%h %l %u %t \"%r\" %>s %O" common
        LogFormat "%{Referer}i -> %U" referer
        LogFormat "%{User-agent}i" agent
        Include conf.d/
        Include sites-enabled/

    And the virtual host file for one of the slow sites; in fact it is pretty straightforward...

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerSignature EMail
            ServerName toenjoy.drsklaus.nl
            DocumentRoot /var/www/toenjoy.drsklaus.nl
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /var/www/toenjoy.drsklaus.nl/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride AuthConfig
                AuthType Basic
                AuthName "To Enjoy"
                AuthUserFile /etc/.htpasswd
                Require user petraaa
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog /var/log/apache2/error.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            CustomLog /var/log/apache2/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>
        </VirtualHost>

    And the output of free -m:

        klaas@ubuntu-server:/etc/apache2$ free -m
                     total       used       free     shared    buffers     cached
        Mem:          1997       1401        595          0        144       1017
        -/+ buffers/cache:        238       1758
        Swap:         2035          0       2035

    and I have no indication that swapping occurs at the moments the site is slow. I have run top and it does not appear to be a CPU issue. I have the impression that the spawning of an Apache thread could be the bottleneck, but that is just a suggestion. Maybe this gives some extra information!

    EDIT: The problem seemed to be gone for some time but is occurring again! And not only with Apache: connecting over SSH also takes a tremendous time, sometimes up to 15 seconds before the passphrase is asked for. scp also works very slowly. The behaviour is really unpredictable and makes the server very hard to use. Any ideas...?
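    Given that SSH logins also stall for about 15 seconds before the passphrase prompt, a common culprit is a reverse-DNS (or GSSAPI) lookup timing out on the server side. A quick way to test that theory, as a sketch; the sshd_config path is the stock Ubuntu one, and the directives must be added by hand if they are not already present in the file:

        # on the server: stop sshd from resolving client addresses
        #   /etc/ssh/sshd_config should end up containing:
        #       UseDNS no
        #       GSSAPIAuthentication no
        sudo sed -i 's/^#\?UseDNS.*/UseDNS no/' /etc/ssh/sshd_config
        sudo service ssh restart

        # check whether the configured resolvers actually answer quickly
        cat /etc/resolv.conf
        time nslookup google.com

        # Apache: HostnameLookups is already Off above, but anything that does
        # its own lookups (access rules with hostnames, PHP code resolving
        # remote hosts) can stall in the same way when the resolver is slow

    If the SSH delay disappears with UseDNS off, fixing or removing the broken nameserver in /etc/resolv.conf will most likely cure the slow HTTP responses as well.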

    Read the article

  • Subdomains on WampServer

    - by MohamedKadri
    I'm working on WampServer for development. I've set up the domain tuniguide.local and it works fine with this configuration:

        DocumentRoot "D:\www\tuniguide"
        ServerName tuniguide.local

    But when I wanted to add a subdomain fr.tuniguide.local, I get a 404 Not Found with this configuration:

        DocumentRoot "D:\www\tuniguide\fr"
        ServerName fr.tuniguide.local

    It gives me this message:

        The requested URL /www/tuniguide/index.php was not found on this server.

    Is there something that I missed? Thanks.
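    A sketch of a complete name-based setup under the same assumptions as the snippets above (paths taken from the question, Apache 2.2 syntax as shipped with WampServer at the time). The error message suggests the request for the subdomain is falling through to the wrong vhost, so both the Windows hosts file and the vhost declarations have to line up:

        # C:\Windows\System32\drivers\etc\hosts
        127.0.0.1    tuniguide.local
        127.0.0.1    fr.tuniguide.local

        # httpd-vhosts.conf
        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName tuniguide.local
            DocumentRoot "D:/www/tuniguide"
            <Directory "D:/www/tuniguide">
                Order Allow,Deny
                Allow from all
            </Directory>
        </VirtualHost>

        <VirtualHost *:80>
            ServerName fr.tuniguide.local
            DocumentRoot "D:/www/tuniguide/fr"
            <Directory "D:/www/tuniguide/fr">
                Order Allow,Deny
                Allow from all
            </Directory>
        </VirtualHost>

    After editing, the vhosts file must be included from httpd.conf and Apache restarted; the first <VirtualHost> block also acts as the default for any hostname that does not match.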

    Read the article
