Search Results

Search found 21436 results on 858 pages for 'draw order'.


  • Windows 7 x64. Some 32 bit applications refuse to install.

    - by user250712
    I have been having problems lately when trying to install older games onto my PC. It only happens with 32-bit applications. A few games that will not install are Drakan: Order of the Flame, TA: Kingdoms (Total Annihilation itself installed fine), and Baldur's Gate. In Baldur's Gate, when I use autorun.exe and choose install, the autorun closes and the computer loads for a second (as it should), then nothing pops up. Ten minutes later, still nothing, so I try again; still nothing. Next I use Setup.exe. Still nothing. I run it in every compatibility mode, and as Administrator in every mode; still nothing. Then I open Task Manager, and there are about 80 setup.exe processes running, all of them doing nothing and taking up next to no resources.

    Read the article

  • "Device Not Ready" when attempting to connecting an old (DOS) hard drive to windows 7

    - by Christopher Vigliotti
    I have an old Western Digital Caviar 2540 hard drive that I am attempting to connect to a Windows 7 machine using a Bytecc BT-300 USB 2.0 to IDE/SATA adapter. I'm connecting things in the right order (power, then SATA, then USB) and the jumper settings on the drive are currently set to slave, but I am still having an issue. I am receiving a message that the drive is not initialized. When I go into Computer Management I can see the drive (as "Disk 5, Unknown, Not Initialized"), but when I attempt to initialize it or right-click and select "Offline", it tells me that "the device is not ready". Is there something I can do to get around this, connect the drive and copy the data that I need off of it? Is there a third-party tool available that I can use?

    Read the article

  • How to properly edit hosts, hostname and resolv.conf?

    - by Firewall
    I've been searching the internet for a real newbie tutorial on the subject but could not find any direct info on how to edit these files the proper way. I've got a Debian internet server that I use to host some personal domains; it also runs Squid and rTorrent. The server is up and running with no problems, but I am confused about a few things. Let's say that I named my server foo, my domain is example.com and my public IP is 95.211.133.200. Now: should /etc/hostname contain foo.example.com, or just foo (the server name)? Should /etc/hosts contain: 127.0.0.1 localhost.localdomain localhost and 95.211.133.200 foo.example.com foo? Should /etc/resolv.conf contain (along with the nameservers) both "domain example.com" and "search example.com", or just the first one? Are there any other files that I should edit in order to make things right? Last thing: the command domainname returns (none); I believe it should return example.com. What should I do to correct that?
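
    A minimal sketch of what the three files might look like, assuming the host really is named foo in the domain example.com with the address from the question (adjust the names and the resolver address to your setup):

        # /etc/hostname (the short host name only)
        foo

        # /etc/hosts
        127.0.0.1       localhost.localdomain localhost
        95.211.133.200  foo.example.com foo

        # /etc/resolv.conf
        # "domain" and "search" are alternatives; if both appear, the last one listed wins
        search example.com
        nameserver <your resolver's IP>

    Note that the bare domainname command reports the NIS domain (hence "(none)"); dnsdomainname or hostname -d is what derives the DNS domain from the /etc/hosts entry above.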

    Read the article

  • Quiz Master at Beyond Relational

    - by Vincent Maverick Durano
    Last month a friend of mine invited me to join BeyondRelational.com and asked me to nominate myself as a .NET Quiz Master. In order to qualify, I had to submit an interesting question related to .NET, and their .NET team would review the information and select 31 quiz masters for the .NET quiz category. This seemed interesting to me, so I went ahead and submitted one entry. Luckily I was selected as one of the 31 Quiz Masters in the .NET category. I hope to be able to keep up the good work there for years to come. Big thanks to Jacob Sebastian and his team! And oh... I didn't get a chance to blog about this last week, but just to let you guys know, the .NET General Quiz started last January 1st, 2011. The quiz will be a series of 31 questions, managed by 31 .NET quiz masters. Each quiz master will ask one question and will moderate the discussion and answers, and finally will identify the winner of each quiz. Each correct answer will get a score ranging from 1 to 10, where 10 is the highest. The scores of all 31 questions will be added up to identify the final winner. So what are you waiting for? Sign up and register now and get a chance to win some exciting prizes! Technorati Tags: Community

    Read the article

  • Could a singleton type replace static methods and classes?

    - by MKO
    In C#, static methods have long served a purpose: they allow us to call methods without instantiating classes. Only in later years have we become more aware of the problems with static methods and classes: they can't use interfaces, they can't use inheritance, and they are hard to test because you can't make mocks and stubs. Is there a better way? Obviously we need to be able to access library methods without instantiating classes all the time, otherwise our code would become pretty cluttered. One possible solution is to use a new keyword for an old concept: the singleton. Singletons are global instances of a class; since they are instances, we can use them as we would normal classes. In order to make their use nice and practical we'd need some syntactic sugar, however. Say that the Math class were a singleton instead of an actual class. The actual class containing all the default methods for the Math singleton is DefaultMath, which implements the interface IMath. The singleton would be declared as singleton Math : IMath { public Math { this = new DefaultMath(); } } If we wanted to substitute our own class for all math operations, we could make a new class MyMath that inherits DefaultMath, or we could just inherit from the interface IMath and create a whole new class. To make our class the active Math class, you'd do a simple assignment Math = new MyMath(); and voilà! The next time we call Math.Floor it will call your method. Note that for a normal singleton we'd have to write something like Math.Instance.Floor, but the compiler eliminates the need for the Instance property. Another idea would be to define a singleton as lazy so it gets instantiated only when it's first called, like lazy singleton Math : IMath. What do you think, would it have been a better solution than static methods and classes? Are there any problems with this approach?
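
    For comparison, here is a rough sketch of how a similar effect can be approximated in today's C# (6.0 or later): a static facade that delegates to a swappable IMath instance. The IMath, DefaultMath and MyMath names come from the question above; the facade name and everything else is illustrative only.

        using SysMath = System.Math;

        public interface IMath
        {
            double Floor(double value);
        }

        // Default implementation that simply delegates to the BCL.
        public class DefaultMath : IMath
        {
            public double Floor(double value) => SysMath.Floor(value);
        }

        // Static facade: callers write MathFacade.Floor(x) without instantiating
        // anything, but the implementation behind it can be swapped at runtime.
        public static class MathFacade
        {
            // The "singleton" instance; reassign it to change behaviour globally.
            public static IMath Instance { get; set; } = new DefaultMath();

            public static double Floor(double value) => Instance.Floor(value);
        }

    Swapping in a custom implementation then looks like MathFacade.Instance = new MyMath(); which is essentially the Math = new MyMath(); assignment from the question, minus the syntactic sugar.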

    Read the article

  • How to tune Windows 2008r2 and IIS to maximize single file download speeds?

    - by uSlackr
    We recently put up an IIS site (on WinSvr 2008r2) that is used almost exclusively for downloading files over the internet. The data exists as a large collection of .zip files ranging from 1MB - 35GB in size. We want to allow a lot of downloads during a day (more than 500GB) but have implemented an outbound ASA throttle at 60mbps in order to preserve bandwidth for other uses. The total link speed is 100mbps. Here's the interesting part: While we can serve up multiple downloads to hit the 60mbps cap, we cannot get any single download to exceed 2.5M bytes/sec (20 Mbits/s). Is there any TCP or IIS tuning we can do to push up individual download speeds? Or something else to look at?
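
    One knob worth checking before deeper IIS tuning (a sketch, not a guaranteed fix): single-stream throughput is roughly bounded by TCP window size divided by round-trip time, so the receive window on the downloading machine is the usual suspect rather than IIS itself. The Windows TCP global settings can be inspected and adjusted like this:

        rem Show the current global TCP settings
        netsh interface tcp show global

        rem Enable receive-window auto-tuning if it is disabled or restricted
        netsh interface tcp set global autotuninglevel=normal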

    Read the article

  • How to configure DNS BIND to work locally on one computer?

    - by user619656
    I want to make some changes to the BIND source code. In order to test those changes, I want to be able to post queries to my local BIND server and have it use only the local zone files. I know how to make the zone files and, more or less, the named.conf file, but what should I put in /etc/resolv.conf? Currently resolv.conf contains the line nameserver 192.168.0.1, which I guess is my router's IP address, so queries go through the router to my ISP. I want those queries to go to the local BIND server and to look for answers in the zone files I provided. Is there a way to do this using the resolv.conf file, or should I do something else?
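
    A minimal sketch, assuming BIND listens on the same machine and serves a zone called example.test (the zone name and file path are placeholders, not from the question):

        # /etc/resolv.conf -- send every query to the local BIND instance
        nameserver 127.0.0.1

        # excerpt from named.conf -- the local zone to answer from
        zone "example.test" {
            type master;
            file "/etc/bind/db.example.test";
        };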

    Read the article

  • Installing a personal security certificate for Windows Server 2008 Terminal Services user

    - by Rick
    We use StoneEdge Order Manager, which runs under Microsoft Access, on several Windows computers as well as through Terminal Services on Windows Server 2008. Terminal Services users are unable to process credit cards using the First Data Global Gateway on the server. We have followed the necessary setup instructions provided under the user account, which involves adding a certificate in the Internet Options control panel. The Windows XP desktops require this to be done, or a generic 'unable to connect' message is shown when attempting to charge a card. On the server, this message is shown regardless of whether the certificate has been installed. Is there anything else that needs to be done that is specific to Windows Server that is not mentioned in the workstation instructions? Setup Instructions
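
    One thing worth trying (a sketch only; the certificate file name and the choice of store are assumptions, since the StoneEdge instructions are not reproduced here): on a terminal server the gateway certificate may need to be imported for each user, or once into the machine store, which can be scripted with certutil instead of the per-user Internet Options dialog.

        rem Import into the current user's trusted root store (run as that user)
        certutil -user -addstore Root firstdata-gateway.cer

        rem Or import into the machine-wide store so every session sees it
        certutil -addstore Root firstdata-gateway.cer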

    Read the article

  • UID uniqueness of IMAP mails

    - by SecStone
    In our internal webmail system, we'd like to attach notes and contacts to certain mails. In order to do this, we have to keep track of every mail on our IMAP server. Unfortunately, the IMAP standard doesn't guarantee that a mail's UID is unique across the server: UIDs are only unique within a single mailbox (folder), and even there they can be reissued when the mailbox's UIDVALIDITY changes. Is there any tool/IMAP server which generates UIDs that are truly unique? Or is there any other way we can identify each mail? (The Message-ID header field is not unique, as some mails do not contain such a field.) Additional resources: Unique ID in IMAP protocol - Limilabs.com
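
    One illustrative workaround, based on RFC 3501 semantics rather than any particular server: combine the mailbox name, its UIDVALIDITY value and the message UID into a composite key. That triple identifies a message for as long as UIDVALIDITY does not change, and a changed UIDVALIDITY tells you the cached UIDs for that mailbox must be discarded.

        C: a1 SELECT "INBOX"
        S: * OK [UIDVALIDITY 3857529045] UIDs valid
        ...
        key = mailbox + ":" + uidvalidity + ":" + uid    e.g. INBOX:3857529045:4827

    The UIDVALIDITY value shown is the example from RFC 3501; real servers return their own.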

    Read the article

  • Helping to Reduce Page Compression Failures Rate

    - by Vasil Dimov
    When InnoDB compresses a page it needs the result to fit into its predetermined compressed page size (specified with KEY_BLOCK_SIZE). When the result does not fit we call that a compression failure. In this case InnoDB needs to split up the page and try to compress again. That said, compression failures are bad for performance and should be minimized.

    Whether the result of the compression will fit largely depends on the data being compressed and some tables and/or indexes may contain more compressible data than others. And so it would be nice if the compression failure rate, along with other compression stats, could be monitored on a per table or even on a per index basis, wouldn't it?

    This is where the new INFORMATION_SCHEMA table in MySQL 5.6 kicks in. INFORMATION_SCHEMA.INNODB_CMP_PER_INDEX provides exactly this helpful information. It contains the following fields:

        +-----------------+--------------+------+
        | Field           | Type         | Null |
        +-----------------+--------------+------+
        | database_name   | varchar(192) | NO   |
        | table_name      | varchar(192) | NO   |
        | index_name      | varchar(192) | NO   |
        | compress_ops    | int(11)      | NO   |
        | compress_ops_ok | int(11)      | NO   |
        | compress_time   | int(11)      | NO   |
        | uncompress_ops  | int(11)      | NO   |
        | uncompress_time | int(11)      | NO   |
        +-----------------+--------------+------+

    similarly to INFORMATION_SCHEMA.INNODB_CMP, but this time the data is grouped by "database_name,table_name,index_name" instead of by "page_size". So a query like

        SELECT
          database_name,
          table_name,
          index_name,
          compress_ops - compress_ops_ok AS failures
        FROM information_schema.innodb_cmp_per_index
        ORDER BY failures DESC;

    would reveal the most problematic tables and indexes that have the highest compression failure rate. From there on, the way to improving performance would be to try to increase the compressed page size or change the structure of the table/indexes or the data being stored and see if it will have a positive impact on performance.
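
    Worth noting as an assumption about the setup rather than something stated above: in MySQL 5.6 this table is only populated while the innodb_cmp_per_index_enabled server variable is ON (it defaults to OFF because of its overhead), so collecting the per-index numbers starts with:

        SET GLOBAL innodb_cmp_per_index_enabled = ON;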

    Read the article

  • Syncing Large Directories/Filesystems using USB Drive

    - by Alan Lue
    Does anyone have a solution for syncing large directories/filesystems using just a USB flash drive (and specifically without using a network connection)? The objective is simply to sync a user directory between two computers. The contents of the user directory could amount to a large quantity of data—say, a quantity larger than could be stored on any single USB drive—but the aggregate size of changes that must be propagated by a single sync could easily fit on a USB drive. As an example, suppose a user directory is already synchronized between a desktop and a laptop computer. Here's a use case: Some changes are made in the user directory on the desktop. We mount a USB drive onto the desktop and copy whatever changes need to be applied to the laptop user directory in order to synchronize the desktop and laptop user directories. We now mount the USB drive onto the laptop and apply the changes. The desktop and laptop user directories are now synchronized. Any ideas? Alan
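
    One possible approach, sketched under a couple of assumptions (rsync is available on both machines, the user directory is ~/userdir, and the desktop keeps a mirror of the last-synced state in ~/userdir-last-synced; all of those paths are placeholders): rsync's batch mode writes only the delta to the USB drive and replays it on the other side.

        # On the desktop: diff against the last-synced mirror, write just the
        # delta to the USB drive, and bring the mirror up to date at the same time.
        rsync -a --write-batch=/media/usb/userdir-batch ~/userdir/ ~/userdir-last-synced/

        # On the laptop: replay the recorded delta onto the real user directory.
        rsync -a --read-batch=/media/usb/userdir-batch ~/userdir/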

    Read the article

  • Windows 7 Tray Application

    - by Cpt. Jack
    When I launch an application in Windows 7, it shows up in the tray but nowhere on my main screen. When I hover over it and select the application, nothing shows up. In order to make the application visible, I need to hover over it, right-click and select Maximize. Any time I maximize the window and then try to make it a smaller window, it also disappears. This is an awfully painful process just to launch the application. Please help me figure out how to change this and make sure the application always opens in a maximized window.

    Read the article

  • How are certain analytics metrics (time on site, etc.) usually distributed?

    - by a barking spider
    I'm not sure if I've come to the right place to ask this question, but I'm gathering some information for a research project. We're trying to design an experiment that'll heavily involve web analytics, and I'm trying to figure out some sensible values of mean +/- standard deviation for the following visitor-level metrics (i.e., visitor 1 spends 2 minutes on site, visitor 2 spends 1 minute: mean 1.5 +/- 0.71...): time spent on site, and page views. If time allowed, we would put up the sites and gather the information ourselves, but we have a grant deadline coming up. I realize that even though the distributions of these quantities are probably going to be heavily skewed towards zero, we'll need some reasonable figures, or estimates of those figures, in order to do sample size calculations, etc. Anyway, I'm not sure where else I'd turn, and I certainly have had a difficult time finding these values in the prior literature. If someone could direct me to a paper with the right information, or if you have these figures on hand (perhaps taken directly from your logs!), that would be amazing, and I'd love to hear from you. Thanks in advance, and even though I'm not allowed to reveal too much, rest assured that this info'll be applied towards a good cause :)

    Read the article

  • How can I see if apache is overloaded and dropping or not accepting connections?

    - by cat pants
    Basically I just want to see if apache is handling a current level of high traffic or if I need to tune it to handle more connections. (I have found plenty of information on the actual tuning, so no help needed there) I know it has been dropping or not accepting connections earlier today, but not seeing anything in the error logs. Is the expected behavior to throw a 503 in the error log if apache cannot accept more connections? If so, what error logging level do I need in order to see these? What is the correct terminology: dropping connections or not accepting connections? MPM is prefork, OS is Linux, apache version is 2.2.15.
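
    A sketch of one way to watch this directly, assuming mod_status is loaded (Apache 2.2 access-control syntax to match the version in the question): the scoreboard shows how many prefork workers are busy, and the error log gets a "server reached MaxClients setting" message when the limit is hit, which the default LogLevel of warn is already verbose enough to record.

        ExtendedStatus On
        <Location /server-status>
            SetHandler server-status
            Order deny,allow
            Deny from all
            Allow from 127.0.0.1
        </Location>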

    Read the article

  • Help me with this logic (newbie) [migrated]

    - by Surendra
    I need to generate a half-pyramid number series, from an entered starting number and number of lines, in an HTML page using JavaScript, and show the result in the page. I have done the JavaScript and the surrounding stuff; what I don't get is the logic for it. Take a look at this and you may get an idea of what I'm talking about. Here is my function in JavaScript that will be triggered on a button click:

        function doFunction() {
            var enteredNumber = document.getElementById("start");
            var lines = document.getElementById("lines");
            var result;
            for (i = 0; i <= lines.value; i++) {
                for (j = enteredNumber.value; j <= i; j++) {
                    document.write(j + "&nbsp;" + "&nbsp;");
                }
                document.write("<br />");
            }
        }

    Help me with the logic to print the following:

        1
        1 2
        1 2 3
        1 2 3 4
        1 2 3 4 5

    There is a condition: I will specify $start and $lines. If $start = 5 and $lines = 3 then the output should be like:

        5
        5 6
        5 6 7

    I have used a for loop, but that doesn't work if I give my own start number that is higher than the number of lines. I actually need it done with JavaScript. I have done the necessary, but I'm confused about the logic to generate such a series with the user-given values. I had actually used two for loops to generate the regular number series (1, then 1 2, then 1 2 3, and so on).
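
    A sketch of one way the loops could be arranged so the inner loop counts the row length instead of comparing against the start value (the element ids "start" and "lines" are taken from the question, and document.write is kept for consistency with it):

        function doFunction() {
            var start = parseInt(document.getElementById("start").value, 10);
            var lines = parseInt(document.getElementById("lines").value, 10);
            for (var i = 0; i < lines; i++) {
                // Row i contains i + 1 numbers, counting up from the start value.
                for (var j = 0; j <= i; j++) {
                    document.write((start + j) + "&nbsp;&nbsp;");
                }
                document.write("<br />");
            }
        }

    With start = 5 and lines = 3 this prints 5, then 5 6, then 5 6 7, matching the example above.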

    Read the article

  • Unable to ping between subnets and out to internet

    - by battlemidget
    My setup is: modem - Linksys router - laptop with two interfaces (wlan0/eth0) - desktop machine. The router is 192.168.1.1, the gateway to the internet. The laptop's wlan0 is 192.168.1.4 with a gateway of 192.168.1.1. The laptop's eth0 is 192.168.2.254, which acts as a second gateway. The desktop is 192.168.2.100. On the laptop I've set ip_forward to 1 and have inserted two iptables rules:

        -A FORWARD -i eth0 -o wlan0 -j ACCEPT
        -A FORWARD -i wlan0 -o eth0 -j ACCEPT

    The laptop can ping outside the network (i.e. yahoo.com); it cannot ping 192.168.2.100. The desktop can ping 192.168.2.254 but nothing outside the network or on the 192.168.1.0 subnet. On the laptop, ip route show lists:

        192.168.2.0/24 dev eth0 proto kernel scope link src 192.168.2.254
        192.168.1.0/24 dev wlan0 proto kernel scope link src 192.168.1.4
        127.0.0.0/8 dev lo scope link
        default via 192.168.1.1 dev wlan0

    What am I missing to make my desktop go through the laptop in order to access the router, which provides access to the internet? Thanks
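
    A sketch of the two usual ways to close this gap (the symptoms suggest that traffic for 192.168.2.0/24 has no return path): either NAT the desktop's traffic behind the laptop's wlan0 address, or give the Linksys a static route to the 192.168.2.0/24 subnet via the laptop. Pick one; both assume the desktop's default gateway is set to 192.168.2.254.

        # Option 1: on the laptop, masquerade traffic leaving via wlan0
        iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE

        # Option 2: on the Linksys (static routing page), roughly:
        #   destination 192.168.2.0  netmask 255.255.255.0  gateway 192.168.1.4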

    Read the article

  • Can't copy paste on the first try

    - by Sunny88
    When I try to copy and paste something from firefox to say notepad or word, it doesn't work on the first try. That is I go to firefox, select text, right click, select copy, then switch to notepad, right click, select paste, but it pastes not the thing which I copied just now, but whatever was in the clipboard before I copied. If after this I go back to firefox and copy that text again, and then go back to notepad, then it will paste correctly. So in order to copy paste something it takes me 2 tries. This doesn't always happen this way, but only sometimes. So sometimes I can paste on first try, but sometimes it takes me two tries. I am using firefox 7.0.1 and windows 7. Also it is not only with firefox, sometimes the same thing happens when I copy paste from other programs. What could be the reason, and how can I fix this?

    Read the article

  • Changing location in Google Chrome when searching

    - by Alex
    I've recently moved to the Czech Republic from Scotland and I can't find a way to permanently stop Google from defaulting back to google.cz all the time. I've checked that all my Google accounts and cookie-based settings (e.g. Advanced Search Options) are set to English, but it's still clearly doing an IP address lookup and disregarding everything else. The default search engine for Google Chrome (which switches to google.cz automatically) is: {google:baseURL}search?{google:RLZ}{google:acceptedSuggestion}{google:originalQueryForSuggestion}sourceid=chrome&ie={inputEncoding}&q=%s I've tried hardcoding it to: http://www.google.com/search?{google:RLZ}{google:acceptedSuggestion}{google:originalQueryForSuggestion}sourceid=chrome&ie={inputEncoding}&q=%s This kind of works, but won't work for inline searching, i.e. I always have to press Enter in order to get any results, which is a bit annoying as I've gotten so used to AJAX-style searching. I can't have been the only one to hit this issue? Any help is appreciated.

    Read the article

  • some issues with removing www and redirecting index.html

    - by MariaKeys
    Hello fellas, I am having trouble doing what I want to do with the following setup. I would like to remove all WWW and also forward index.html to the root dir. I would like this to apply to all domains, so I am doing it inside an httpd.conf Directory directive. I have tried many variations with no success. The latest version is below (the domains live inside /var/www/html, in separate directories).

        http://www.example.com/index.html > http://example.com
        http://www.example.com/someother/index.html > http://example.com/someother/

    Thanks, Maria

        <Directory "/var/www/html/*/">
            RewriteEngine on
            RewriteBase /
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
            #RewriteCond %{REQUEST_URI} /^index\.html/
            RewriteRule ^(.*)index\.html$ / [R=301,L]
            Options ExecCGI Includes FollowSymLinks
            AllowOverride AuthConfig
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
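
    A sketch of a variant that keeps the directory part in the redirect target and only acts on requests the client actually made (matching on THE_REQUEST is an illustrative choice to avoid looping on internal DirectoryIndex subrequests; it is not something from the original config):

        RewriteEngine on
        RewriteBase /

        # Strip the www. prefix for any host
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

        # Redirect /index.html and /foo/index.html to / and /foo/ respectively
        RewriteCond %{THE_REQUEST} \s/(.*)index\.html[\s?] [NC]
        RewriteRule ^ /%1 [R=301,L]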

    Read the article

  • Is there any diff tool for XML files?

    - by qedi
    Are there any good (Linux) tools for diffing two XML files? Ideally, I would like to be able to configure it to be strict about some things and lenient about others, like whitespace or attribute order. I'll often care only that the files are functionally the same, but diff by itself would be annoying to use, especially if the XML file doesn't have a lot of line breaks. For example, the following two should really look the same to me:

        <tag att1="one" att2="two"> content </tag>

        <tag att2="two" att1="one"> content </tag>
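
    One low-tech approach worth sketching, assuming libxml2's xmllint is installed: canonicalize both files first (XML canonicalization puts attributes into a fixed order) and then run an ordinary diff on the results. Whitespace inside text nodes is preserved by canonicalization, so this only solves the attribute-order half of the problem.

        xmllint --c14n one.xml > one.canon
        xmllint --c14n two.xml > two.canon
        diff -u one.canon two.canon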

    Read the article

  • Windows updates behind a physical firewall with only IP-based rules, and generic outbound connections turned off

    - by user125245
    I have some boxes that I do not want to allow any inbound or outbound traffic to the internet, except for Windows updates. However, the firewall in place (a Cisco ASA) apparently only supports IP-based rules. As best I can tell, access to Microsoft updates via anything other than the half dozen URL masks that Microsoft lists as needed does not appear possible. I have kicked around building a full WSUS server that I would then manually copy the update files to, so that no direct Microsoft access is needed, but this sounds very top-heavy for the very few boxes involved. I have also kicked around manual updates all around, but am not certain how to be conveniently and confidently sure that the correct updates are being applied in the correct order. Any ideas from any direction would be appreciated. I want this as simple and cost-effective as possible, but have very little flexibility on the absolutely required internet access policy.

    Read the article

  • How can I delete Time Machine files using the command line

    - by Tim
    I want to delete some files/directories from my Time Machine Partition using rm, but am unable to do so. I'm pretty sure the problem is related to some sort of access control extended attributes on files in the backup, but do not know how to override/disable them in order to get rm to work. An example of the error I'm getting is:

        % sudo rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir: Directory not empty
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir/somefile: Operation not permitted

    There are a number of reasons I do not want to use either the Time Machine GUI or Finder for this. If possible, I'd like to be able to maintain the extended protection for all other files (I'd like not to disable them globally, unless I can re-enable once I've done my work).
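
    On OS X 10.7 (Lion) and later, a sketch of the supported route, assuming the backup volume is mounted under /Volumes/TimeMachine (that mount point is a placeholder): tmutil removes a backup item through Time Machine's own machinery, so the protections on everything else stay untouched.

        sudo tmutil delete /Volumes/TimeMachine/Backups.backupdb/MacBook/Latest/MacBook/somedir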

    Read the article

  • Rethinking Oracle Optimizer Statistics for P6 Part 2

    - by Brian Diehl
    In the previous post (Part 1), I tried to draw some key insights about the relationship between P6 and Oracle Optimizer Statistics. The first is that average cardinality has the greatest impact on query optimization and that the particular queries generated by P6 are more likely to use this average during calculations. The second is that these are statistics that are unlikely to change greatly over the life of the application. Ultimately, our goal is to get the best query optimization possible. Or is it?

    Stability

    No application administrator wants to get the call at 9am that their application users cannot get their work done because everything is running slow. This is a possibility with a regularly scheduled nightly collection of statistics. It may not just be slow performance, but a complete loss of service because one or more queries are optimized poorly. Ideally, this should not be the case. The database optimizer should make better decisions with more up-to-date data. Better statistics may give incremental performance benefit. However, this benefit must be balanced against the potential cost of system down time. It is stability that we ultimately desire and not absolute optimal performance. We do want the benefit from more accurate statistics and better query plans, but not at the risk of an unusable system. As a result, I've developed the following methodology around managing database statistics for the P6 database.

    1. No Automatic Re-Gathering - The daily, weekly, or other interval of statistic gathering is unlikely to be beneficial. Quite the opposite. It is more likely to cause problems.

    2. Smart Re-Gathering - The time to collect statistics is when things have changed significantly. For a new installation of P6, this is happening more often because the data is growing from a few rows to thousands and more. But for a mature system, the data is not changing significantly from week-to-week. There are times to collect statistics:
        - New releases of the application
        - Changes in the underlying hardware or software versions (ex. new Oracle RDBMS version)
        - When additional user groups are added. The new groups may use the software in significantly different ways.
        - After significant changes in the data. This may be monthly, quarterly or yearly.

    3. Always Test - If you take away one thing from this post, it would be to always have a plan to test after changing statistics. In reality, statistics can be collected as often as you desire provided there are tests in place to verify that performance is the same or better. These might be automated tests or simply a manual script of application functions.

    4. Have a Way Out - Never change the statistics without a way to return to the previous set. Think of the statistics as one part of the overall application code that also includes the source code--both application and RDBMS. It would be foolish to change to the new code without a way to get back to the previous version.

    In the final post, I will talk about the actual script I created for P6 PMDB and possible future direction for managing query performance.
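
    For the "have a way out" step, a minimal sketch using Oracle's DBMS_STATS (the schema and stat-table names here are placeholders, not from the post): export the current statistics to a stats table before re-gathering, so they can be restored if the new plans misbehave.

        -- One-time: create a table to hold saved statistics
        EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'ADMUSER', stattab => 'P6_STATS_BACKUP');

        -- Before re-gathering: save the current schema statistics
        EXEC DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'ADMUSER', stattab => 'P6_STATS_BACKUP');

        -- If needed: put the old statistics back
        EXEC DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'ADMUSER', stattab => 'P6_STATS_BACKUP');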

    Read the article

  • Virtualise Excel in a browser

    - by Macros
    Is it possible to give users access to a virtualised instance of Excel? I don't want to give them access to a full OS (although one will clearly be running in the background, all they can access is Excel; they don't even see any other screens). Secondly, if it is possible, is it possible to do within a browser? Edit: I am building a system which is designed to test candidates' skills in Excel, and for this reason it needs to use the full desktop version and not a web app. I don't want to have to ensure Excel is installed on the client machine, as there will be issues around differing versions, and around security, since the workbook(s) used in the test use VBA extensively to customise and mark the exercises. Ideally my web app would be able to open a session to the server which then just puts the user into an instance of Excel without their ever seeing a desktop. I would also need to be able to pass in command-line parameters in order to define which workbook to open, and also pass in a unique token to identify the user.
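
    The launching piece on the server might look roughly like the sketch below. The RemoteApp/RDS session plumbing, which is the hard part, is not shown; the Office path is a placeholder, excel.exe simply treats its first argument as a workbook to open, and passing the token through an environment variable that the workbook's VBA reads with Environ("TEST_TOKEN") is an illustrative choice, not an established API.

        using System.Diagnostics;

        static class ExcelLauncher
        {
            public static void Launch(string workbookPath, string token)
            {
                var psi = new ProcessStartInfo
                {
                    // Placeholder path; depends on the installed Office version.
                    FileName = @"C:\Program Files\Microsoft Office\Office14\EXCEL.EXE",
                    Arguments = "\"" + workbookPath + "\"",
                    UseShellExecute = false   // required in order to set environment variables
                };
                // Hypothetical handshake with the workbook's VBA.
                psi.EnvironmentVariables["TEST_TOKEN"] = token;
                Process.Start(psi);
            }
        }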

    Read the article

  • Hello!

    - by barryoreilly
    After many months of deliberating I have finally gotten around to starting this blog! The reason for doing this is the large number of half-finished articles lying around on my hard disk, unpublished and unloved. These articles have been of huge benefit to me, and have been written in an attempt to consolidate my own thinking, in order to help me structure my thoughts and ideas as I have tried to digest new ideas and understand abstract theories. It is my hope that by tidying up these articles and publishing them here, I can continue this learning process by getting feedback on the ideas from within the developer community. I have worked with .NET for 8 years now, and have worked with ASP.NET, SQL Server, Windows programming as well as general network administration. Since 2004 my focus has been on integration, web services, and more often than not BizTalk Server. The last two years have seen me focus on SOA and WCF, and the Managed Services Engine, so this is probably where the main focus of the blog will be to start with, but there are so many fun things to play with these days that I have no idea where it will end up..... Barry

    Read the article
