Search Results

Search found 9380 results on 376 pages for 'report definition'.

Page 197/376

  • Exchange Server 2010 Outlook Web Access - Exchange Control Panel web interface

    - by Aceth
    From what I can gather, the mailbox part of the web interface works fine. When any of the users go to Options (top right) and try to use some of the features, such as Organise Mail > Delivery Reports to find messages, it comes up with the message "An item with the same key has already been added". I've looked in the Event Viewer and I think it's this error:

        Watson report about to be sent for process id: 7016, with parameters: E12IIS, c-RTL-AMD64, 14.00.0639.021, ECP, ECP.Powershell, https://x.x.x.x/ecp/PersonalSettings/Accounts.svc/GetList, UnexpectedCondition:ArgumentException, c09, 14.00.0639.021. ErrorReportingEnabled: False

    and

        Request for URL 'https://x.x.x.x/ecp/PersonalSettings/Accounts.svc/GetList' failed with the following error:
        System.ArgumentException: An item with the same key has already been added.
        at System.ServiceModel.AsyncResult.End[TAsyncResult](IAsyncResult result)
        at System.ServiceModel.Activation.HostedHttpRequestAsyncResult.End(IAsyncResult result)
        at System.ServiceModel.Activation.HostedHttpRequestAsyncResult.ExecuteSynchronous(HttpApplication context, Boolean flowContext)
        at Microsoft.Exchange.Management.ControlPanel.WebServiceHandler.ProcessRequest(HttpContext context)
        at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
        at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

    I've tried googling, but no luck with anything relevant :(
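
    That exception text is .NET's generic complaint when code calls Dictionary<K,V>.Add twice with the same key, which suggests the GetList call is building a lookup from account data that unexpectedly contains a duplicate entry. A minimal Python analogue of the failure mode (the account data here is invented purely for illustration):

        # Mimics .NET's Dictionary<K,V>.Add, which throws ArgumentException
        # ("An item with the same key has already been added") on duplicate keys.
        def build_index(accounts):
            index = {}
            for acct in accounts:
                key = acct["address"].lower()
                if key in index:
                    raise ValueError(f"An item with the same key has already been added: {key}")
                index[key] = acct
            return index

        # Hypothetical data in which the same address appears twice:
        accounts = [
            {"address": "user@example.com", "type": "primary"},
            {"address": "user@example.com", "type": "pop3"},  # duplicate key
        ]
        build_index(accounts)  # raises, just as the GetList endpoint does

    If that reading is right, checking the affected mailbox for duplicate account or proxy-address entries is a reasonable place to start.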

    Read the article

  • Perfmon reporting higher IOPS than possible?

    - by BlueToast
    We created a monitoring report for IOPS on performance counters, using Disk Reads/sec and Disk Writes/sec, on four servers (physical boxes, no virtualization) that have 4x 15k 146GB SAS drives in RAID 10 per server, set to check and record data every 1 second, and logged for 24 hours before stopping the reports. These are the results we got:

        Server1: Maximum disk reads/sec: 4249.437,  Maximum disk writes/sec: 4178.946
        Server2: Maximum disk reads/sec: 2550.140,  Maximum disk writes/sec: 5177.821
        Server3: Maximum disk reads/sec: 1903.300,  Maximum disk writes/sec: 5299.036
        Server4: Maximum disk reads/sec: 8453.572,  Maximum disk writes/sec: 11584.653

    The average disk reads and writes per second were generally low; e.g., one particular server averaged 33 writes/sec, but when monitoring in real time it would often spike up to several hundred, and sometimes into the thousands. Could someone explain to me why these numbers are significantly higher than theoretical calculations assuming each drive can do 180 IOPS? Additional details (RAID card): HP Smart Array P410i, total cache size 1GB, write cache disabled, array accelerator cache ratio 25% read / 75% write.
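
    A back-of-the-envelope check makes the gap concrete. This is a hedged sketch: the 180 IOPS/drive figure comes from the question, and the usual explanation for peaks far above the spindle limit is that perfmon counts logical transfers, many of which are absorbed by controller/OS caching rather than hitting the disks:

        # Theoretical spindle-bound IOPS for 4x 15k drives in RAID 10,
        # assuming ~180 IOPS per drive (figure from the question).
        DRIVES = 4
        IOPS_PER_DRIVE = 180

        max_read_iops = DRIVES * IOPS_PER_DRIVE        # all 4 disks can serve reads
        max_write_iops = DRIVES * IOPS_PER_DRIVE // 2  # each write hits both mirrors

        print(f"theoretical reads/sec:  {max_read_iops}")    # 720
        print(f"theoretical writes/sec: {max_write_iops}")   # 360

        # Observed peaks on Server4 exceed the spindle limit many times over:
        print(f"reads:  {8453.572 / max_read_iops:.1f}x the spindle limit")
        print(f"writes: {11584.653 / max_write_iops:.1f}x the spindle limit")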

    Read the article

  • Thunderbird: Synchronize tags

    - by fuenfundachtzig
    When I check my mail (via IMAP) on my laptop, I'd like to see the same tags that I set when I checked my mail on my office computer. So the question is: is it possible to synchronize user-specified mail tags between Thunderbird installations running on different computers? I've read that tags can be stored via IMAP on the server, so maybe it's just that my mail provider's server does not support this IMAP feature? (Whichever it is...) Has anybody any experience with this? Related, but not my primary concern: will there ever be a more flexible tagging system in Thunderbird which allows for easier definition of new tags? (I.e., so that I don't have to define new tags in the preferences menu, but can just type them in when reading a mail?)
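
    Whether tags can live on the server comes down to the IMAP PERMANENTFLAGS response: a server that advertises \* there lets clients store arbitrary keywords, which is what Thunderbird maps tags onto. A quick check with Python's standard imaplib (hostname and credentials are placeholders):

        import imaplib

        M = imaplib.IMAP4_SSL("imap.example.com")  # placeholder host
        M.login("user", "password")                # placeholder credentials
        M.select("INBOX", readonly=True)

        # After SELECT, the server reports which flags clients may set permanently.
        typ, data = M.response("PERMANENTFLAGS")
        print(typ, data)
        M.logout()

    If the response contains \*, the server accepts custom keywords and tags should be able to travel with the messages; if not, Thunderbird has nowhere server-side to put them, which would explain the behaviour described.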

    Read the article

  • Doxygen for C++ template class member specialization

    - by Ziv
    When I write class templates and need to fully-specialize members of those classes, Doxygen doesn't recognize the specialization: it documents only the generic definition, or (if there are only specializations) the last definition. Here's a simple example:

    === MyClass.hpp ===

        #ifndef MYCLASS_HPP
        #define MYCLASS_HPP

        #include <cstdio>

        template<class T>
        class MyClass {
        public:
            static void foo();
            static const int INT_CONST;
            static const T TTYPE_CONST;
        };

        /* generic definitions */
        template<class T>
        void MyClass<T>::foo(){ printf("Generic foo\n"); }

        template<class T>
        const int MyClass<T>::INT_CONST = 5;

        /* specialization declarations */
        template<> void MyClass<double>::foo();
        template<> const int MyClass<double>::INT_CONST;
        template<> const double MyClass<double>::TTYPE_CONST;
        template<> const char MyClass<char>::TTYPE_CONST;

        #endif

    === MyClass.cpp ===

        #include "MyClass.hpp"

        /* specialization definitions */
        template<> void MyClass<double>::foo(){ printf("Specialized double foo\n"); }
        template<> const int MyClass<double>::INT_CONST = 10;
        template<> const double MyClass<double>::TTYPE_CONST = 3.141;
        template<> const char MyClass<char>::TTYPE_CONST = 'a';

    So in this case, foo() will be documented as printing "Generic foo", INT_CONST will be documented as set to 5 with no mention of the specializations, and TTYPE_CONST will be documented as set to 'a', with no mention of 3.141 and no indication that 'a' is a specialized case. I need to be able to document the specializations, either within the documentation for MyClass<T> or on new pages for MyClass<double> and MyClass<char>. How do I do this? Can Doxygen even handle this? Am I possibly doing something wrong in the declarations/code structure that's keeping Doxygen from understanding what I want?

    I should note two related cases:

    A) For templated functions, specialization works fine, e.g.:

        /* functions that are global/in a namespace */
        template<class T> void foo(){ printf("Generic foo\n"); }
        template<> void foo<double>(){ printf("Specialized double foo\n"); }

    This will document both foo() and foo<double>().

    B) If I redeclare the entire template, i.e. template<> class MyClass<double>{...};, then MyClass<double> will get its own documentation page as a separate class. But this means actually declaring an entirely new class: there is no relation between MyClass<T> and MyClass<double> if MyClass<double> itself is declared. So I'd have to redeclare the class and all its members, and repeat all the definitions of class members, specialized for MyClass<double>, all to make it appear as though they're using the same template. Very awkward; feels like a kludge solution. Suggestions? Thanks much :) --Ziv

    Read the article

  • Troubleshooting a crash with Windows 7

    - by AngryHacker
    I have a folder with several thousand videos (all .MPG extensions). When I open the folder with these videos, it shows up fine, but as I start scrolling down, it crashes Windows Explorer. In the Event Viewer, I see this:

        Faulting application name: Explorer.EXE, version: 6.1.7600.16450, time stamp: 0x4aebab8d
        Faulting module name: ntdll.dll, version: 6.1.7600.16559, time stamp: 0x4ba9b802
        Exception code: 0xc0000374
        Fault offset: 0x00000000000c6df2
        Faulting process id: 0x954
        Faulting application start time: 0x01cbb1b71edf3b51
        Faulting application path: C:\Windows\Explorer.EXE
        Faulting module path: C:\Windows\SYSTEM32\ntdll.dll
        Report Id: ee987372-1dc4-11e0-8e06-406186ea9135

    I suspect that one of the videos has bad metadata. I removed the Length column and it was still crashing. I then removed the Date column and the problem disappeared. How do I go about troubleshooting this problem, or at least identifying the file that's causing the issue?
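
    Since Explorer only dies once its metadata pass reaches the bad file, one low-tech way to isolate it is a binary search over the folder: move half the videos out, see whether the crash persists, and repeat on whichever half still crashes. A small sketch of the move-half step (paths are placeholders):

        import os
        import shutil

        SRC = r"C:\videos"        # folder that crashes Explorer (placeholder)
        HOLD = r"C:\videos_hold"  # staging folder for the moved half (placeholder)

        def move_half(src=SRC, hold=HOLD):
            """Move the second half of the .mpg files out of src.

            After each call, open src in Explorer: if it still crashes, the bad
            file is in what remains; otherwise it is in hold. Repeat on the
            half that reproduces the crash until one file is left.
            """
            os.makedirs(hold, exist_ok=True)
            files = sorted(f for f in os.listdir(src) if f.lower().endswith(".mpg"))
            for name in files[len(files) // 2:]:
                shutil.move(os.path.join(src, name), os.path.join(hold, name))
            print(f"moved {len(files) - len(files) // 2} files; retest the folder")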

    Read the article

  • Terminal Server 2008 - Slow File open dialog in Office 2003

    - by Chris
    I have yet another small issue that annoys me every day in our Terminal Server environment. It seems that when logged into the Terminal Server, users report that the initial File | Open or File | Save As from within an application such as Word or Excel (2003 edition) is very slow to display the actual dialog box. The dialog appears quickly, but it is whited out (sometimes displaying "not responding" in the title bar) and unresponsive; it then sits like this for about 20-30 secs before popping into life and displaying all the folders etc. The second time you go to save or open a file, it loads almost instantly. Any suggestions, or similar problems out there?

    Read the article

  • Do most "normal" people use Metro or the desktop in Windows 8?

    - by ihateapps
    Just curious, since I don't fit the definition of a "normal" computer user, nor have my friends or relatives upgraded to Windows 8 yet. Are most people in the "wild" (i.e. the "guy in Starbucks" or "your friend who works as an auto tech", not you, since you are on SU) using Metro apps regularly, to the point where the desktop is pointless, or are they sticking with the desktop? I've gotten mixed statistics from Google. I'm asking for your personal anecdotal opinion based on what you notice in the wild.

    Read the article

  • Gnome Terminal intercepts ctrl-F1

    - by frank
    Gnome Terminal does not pass the Ctrl-F1 keypress on to applications. It's an official bug: https://bugs.launchpad.net/ubuntu/+source/gnome-terminal/+bug/932940 The bug is marked Feb. 2012 but has lived on since 2009. The bug report is not even complete, since Shift-Ctrl-F1 is also affected. However, I noticed that those two keys are the default keybindings for switch-to-workspace-1 and move-to-workspace-1, so I disabled them. Zero, zippo, zilch: Gnome Terminal would still swallow the keys. Next, I assigned totally different keys to those two workspace functions. The new keybindings did work, but Gnome Terminal would still swallow Ctrl-F1 and Shift-Ctrl-F1. Where are the default workspace keybindings stored? [Not in an xml file.]
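
    On GNOME 3-era systems the window-manager keybindings live in dconf under org.gnome.desktop.wm.keybindings (an assumption about this setup; GNOME 2 installs kept them in GConf instead). One way to dump the schema and see what still claims the F1 combinations, wrapping the gsettings CLI from Python:

        import subprocess

        SCHEMA = "org.gnome.desktop.wm.keybindings"  # assumption: dconf-based GNOME

        out = subprocess.run(
            ["gsettings", "list-recursively", SCHEMA],
            capture_output=True, text=True, check=True,
        ).stdout

        # Print every binding that mentions F1, to see who still owns
        # <Primary>F1 / <Shift><Primary>F1.
        for line in out.splitlines():
            if "F1" in line:
                print(line)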

    Read the article

  • Web Folder size/quota reporting tool?

    - by nctrnl
    I am currently using a Visual Basic script to determine how big the web folders are and what quota is assigned to each folder. The quota is in no way a physical limit, just a value I insert to decide whether a user is using too much space or not. The script does the job quite neatly and sends an HTML file by mail on a regular basis. The problem is that it's such a hassle to insert new quotas, since I have to fiddle around with the code. A central "control panel" with an overview and the ability to insert new quotas would be more suitable. Is there any software that can do the following:

    - Scan specified folders/subfolders
    - Report the file size and present it in some sort of interface (could be a PHP/MySQL solution)
    - Specify a quota and see the difference value

    It is really important that the quota handling is made simple, so that a non-technician can handle it.
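
    For comparison, the core of such a scanner is small, and the quotas can be kept as data instead of code so that adding one does not mean editing logic. A sketch in Python (paths and quota values are placeholders; in practice the table could live in a config file or a MySQL table edited by a web front end):

        import os

        # Quotas in MB, kept as data (placeholder paths and values).
        QUOTAS_MB = {
            r"D:\web\site1": 500,
            r"D:\web\site2": 1000,
        }

        def folder_size_mb(path):
            """Total size of all files under path, in MB."""
            total = 0
            for root, _dirs, files in os.walk(path):
                for name in files:
                    try:
                        total += os.path.getsize(os.path.join(root, name))
                    except OSError:
                        pass  # skip files that vanish or cannot be read
            return total / (1024 * 1024)

        for path, quota in QUOTAS_MB.items():
            used = folder_size_mb(path)
            state = "OVER QUOTA" if used > quota else "ok"
            print(f"{path}: {used:.1f} / {quota} MB ({state}, {quota - used:+.1f} MB margin)")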

    Read the article

  • File corruption when copying a different file on RAID 1

    - by Stephan
    I have a RAID 1 configuration of two 1TB drives on a Fedora 12 box. Most of what is stored there are video files that are numerically labeled. The problem I'm having is that one of the video files got corrupted. I copied a replacement from a backup, replaced the bad file, and now it works fine. However, after doing this the next numbered file goes from 350MB to 200KB, and all but about 0.5 seconds of video disappears. If I then replace that file, it happens to the next one down the line. For example: I replace corrupt file 1.avi, and 2.avi shrinks to 200KB; I replace the now-corrupted 2.avi, and it works, but 3.avi gets screwed up. I have run SMART tests on the drives and they report fine. Does anyone have any tests I can run to try to figure out what is going on? EDIT: It is a two-disk software RAID 1 with an ext4 filesystem.
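
    One way to pin down whether the corruption is spreading on disk or only appearing at read time is to checksum everything, replace the next bad file, and re-checksum: any file whose hash changes without having been touched implicates the array or filesystem. A stdlib sketch (the paths are placeholders):

        import hashlib
        import json
        import os

        VIDEO_DIR = "/srv/videos"             # placeholder
        BASELINE = "/root/video_hashes.json"  # placeholder

        def sha256(path, bufsize=1 << 20):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                while chunk := f.read(bufsize):
                    h.update(chunk)
            return h.hexdigest()

        def snapshot():
            """Hash every file and save the baseline."""
            hashes = {}
            for root, _dirs, files in os.walk(VIDEO_DIR):
                for name in files:
                    p = os.path.join(root, name)
                    hashes[p] = sha256(p)
            with open(BASELINE, "w") as f:
                json.dump(hashes, f)

        def verify():
            """Report files whose contents changed since the baseline."""
            with open(BASELINE) as f:
                old = json.load(f)
            for p, digest in old.items():
                if os.path.exists(p) and sha256(p) != digest:
                    print("CHANGED:", p)

    Run snapshot() once, replace the corrupt file, then verify() afterwards: if untouched neighbours show up as CHANGED, the damage is happening below the application layer.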

    Read the article

  • CentOS - Configuring Puppet to play nice with SELinux

    - by Mike Purcell
    I am running into an issue every time I attempt to start the puppetmasterd service, for which I receive the following error message:

        root@service1 ~ # -> /etc/init.d/puppetmaster start
        Starting puppetmaster: Could not prepare for execution: Got 1 failure(s) while initializing: change from absent to directory failed: Could not set 'directory on ensure: Permission denied - /etc/puppet/ssl [FAILED]

    Apparently there was a known issue with this scenario, as outlined in this bug report; however, the bug report states the issue was resolved in selinux-policy-3.9.16-29.fc15, while the latest CentOS default upstream version is 3.7.19-155.el6_3.4. So I am trying to figure out the best solution: I can either create a local security policy to allow puppetmasterd the access it needs, or keep researching and install a newer version of selinux-policy from outside the default upstream channel. Anyone have any recommendations? Please don't recommend disabling SELinux...

    ----- Update -----

    Here is the puppet.conf:

        [main]
        # The Puppet log directory.
        # The default value is '$vardir/log'.
        logdir = /var/log/puppet

        # Where Puppet PID files are kept.
        # The default value is '$vardir/run'.
        rundir = /var/run/puppet

        # Where SSL certificates are kept.
        # The default value is '$confdir/ssl'.
        ssldir = $vardir/ssl

        [master]
        certname=puppetmaster.ownij.lan
        dns_alt_names=puppetmaster.ownij.lan

        [agent]
        # The file in which puppetd stores a list of the classes
        # associated with the retrieved configuration. Can be loaded in
        # the separate ``puppet`` executable using the ``--loadclasses``
        # option.
        # The default value is '$confdir/classes.txt'.
        classfile = $vardir/classes.txt

        # Where puppetd caches the local configuration. An
        # extension indicating the cache format is added automatically.
        # The default value is '$confdir/localconfig'.
        localconfig = $vardir/localconfig

        server=puppetmaster.ownij.lan

    And here are the denials per the audit log:

        type=AVC msg=audit(1349751364.985:666): avc: denied { search } for pid=15093 comm="puppetmasterd" name="/" dev=dm-2 ino=2 scontext=unconfined_u:system_r:puppetmaster_t:s0 tcontext=system_u:object_r:home_root_t:s0 tclass=dir
        type=SYSCALL msg=audit(1349751364.985:666): arch=c000003e syscall=4 success=no exit=-13 a0=1391420 a1=7fffef09ed10 a2=7fffef09ed10 a3=120c500 items=0 ppid=15092 pid=15093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts1 ses=13 comm="puppetmasterd" exe="/usr/bin/ruby" subj=unconfined_u:system_r:puppetmaster_t:s0 key=(null)
        type=AVC msg=audit(1349751365.302:667): avc: denied { search } for pid=15093 comm="puppetmasterd" name="/" dev=dm-2 ino=2 scontext=unconfined_u:system_r:puppetmaster_t:s0 tcontext=system_u:object_r:home_root_t:s0 tclass=dir
        type=SYSCALL msg=audit(1349751365.302:667): arch=c000003e syscall=4 success=no exit=-13 a0=1d18530 a1=7fffef0d04d0 a2=7fffef0d04d0 a3=8 items=0 ppid=15092 pid=15093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts1 ses=13 comm="puppetmasterd" exe="/usr/bin/ruby" subj=unconfined_u:system_r:puppetmaster_t:s0 key=(null)
        type=AVC msg=audit(1349751365.465:668): avc: denied { search } for pid=15093 comm="puppetmasterd" name="/" dev=dm-2 ino=2 scontext=unconfined_u:system_r:puppetmaster_t:s0 tcontext=system_u:object_r:home_root_t:s0 tclass=dir
        type=SYSCALL msg=audit(1349751365.465:668): arch=c000003e syscall=4 success=no exit=-13 a0=1af3930 a1=7fffef0c5c70 a2=7fffef0c5c70 a3=8 items=0 ppid=15092 pid=15093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts1 ses=13 comm="puppetmasterd" exe="/usr/bin/ruby" subj=unconfined_u:system_r:puppetmaster_t:s0 key=(null)
        type=AVC msg=audit(1349751365.467:669): avc: denied { search } for pid=15093 comm="puppetmasterd" name="/" dev=dm-2 ino=2 scontext=unconfined_u:system_r:puppetmaster_t:s0 tcontext=system_u:object_r:home_root_t:s0 tclass=dir
        type=SYSCALL msg=audit(1349751365.467:669): arch=c000003e syscall=4 success=no exit=-13 a0=1b17aa0 a1=7fffef0c5c70 a2=7fffef0c5c70 a3=8 items=0 ppid=15092 pid=15093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts1 ses=13 comm="puppetmasterd" exe="/usr/bin/ruby" subj=unconfined_u:system_r:puppetmaster_t:s0 key=(null)
        type=AVC msg=audit(1349751366.401:670): avc: denied { write } for pid=15093 comm="puppetmasterd" name="puppet" dev=dm-0 ino=132035 scontext=unconfined_u:system_r:puppetmaster_t:s0 tcontext=system_u:object_r:puppet_etc_t:s0 tclass=dir
        type=SYSCALL msg=audit(1349751366.401:670): arch=c000003e syscall=83 success=no exit=-13 a0=2d7a400 a1=1f9 a2=2d7a40f a3=7fffef0a6df0 items=0 ppid=15092 pid=15093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts1 ses=13 comm="puppetmasterd" exe="/usr/bin/ruby" subj=unconfined_u:system_r:puppetmaster_t:s0 key=(null)

    And the audit log if I pass it through audit2allow:

        root@service1 ~ # -> fgrep puppetmasterd /var/log/audit/audit.log | audit2allow -m puppetmasterd

        module puppetmasterd 1.0;

        require {
            type home_root_t;
            type puppetmaster_t;
            type puppet_etc_t;
            type puppet_var_run_t;
            type httpd_sys_content_t;
            class lnk_file { relabelfrom relabelto };
            class file { relabelfrom read getattr open };
            class dir { write read search getattr setattr };
        }

        #============= puppetmaster_t ==============
        allow puppetmaster_t home_root_t:dir { search getattr };
        allow puppetmaster_t httpd_sys_content_t:dir read;
        allow puppetmaster_t httpd_sys_content_t:file { read getattr open };
        #!!!! The source type 'puppetmaster_t' can write to a 'dir' of the following types:
        # puppet_log_t, puppet_var_lib_t, puppet_var_run_t, puppetmaster_tmp_t
        allow puppetmaster_t puppet_etc_t:dir { write setattr };
        allow puppetmaster_t puppet_etc_t:lnk_file { relabelfrom relabelto };
        allow puppetmaster_t puppet_var_run_t:file relabelfrom;

    Read the article

  • SQL Server 2000 reporting bad values to ASP.NET application

    - by Ben
    I have an instance of SQL Server 2000 (8.0.2039) with a rather simple table. We recently had users complain about an application I wrote returning bad values for some of the dates in the database. When I query the table directly via Server Management Studio, it returns the correct values; however, the identical queries from my application report the wrong values, but only for a couple of dates. I have been over the code, and it is solid: if the error were in the code, all of the dates reported would be wrong. I have also run the code against an identical test database, and everything is reported properly. I believe the problem may lie in the SQL instance itself, which is why I am posting on Server Fault. My question is: has anyone heard of a database reporting bad (incorrect) date values when queried via a web application? It should be noted that this particular server was once manually rebuilt after having a cluster clean run on it.

    Read the article

  • How do I set zone priority in Microsoft DNS?

    - by Justin
    I have a standard small network setup (20 users) on Active Directory. All Windows machines use the AD server as their primary DNS server and Google Public DNS as their secondary. I want to set up a DNS entry that exists in real (public) DNS, but define it on our DC so that local requests for this public domain are routed to a local development machine on the network. I set up the zone in DNS, which results in the clients resolving the public FQDN to our internal IP. However, sometimes it still resolves to the "real" value (I check by pinging it). Is there some way to give the zone definition on my DC DNS higher priority? Or will a client that has a public secondary DNS server always, at least some of the time, get a competing answer for this zone?
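
    The split answers can be made visible by querying each configured resolver directly. A sketch using the third-party dnspython package (an assumption, as are the name and addresses):

        import dns.resolver  # pip install dnspython (assumed available)

        NAME = "dev.example.com"  # placeholder for the shadowed public FQDN
        RESOLVERS = {
            "AD/DC":  "192.168.1.10",  # placeholder: the domain controller
            "Google": "8.8.8.8",
        }

        for label, server in RESOLVERS.items():
            r = dns.resolver.Resolver(configure=False)
            r.nameservers = [server]
            answers = r.resolve(NAME, "A")
            print(label, [a.address for a in answers])

    If the Google query returns the public IP, the clients are not honouring any "priority": Windows tries the primary first but fails over to (and can stick with) the secondary whenever the primary is slow or unreachable. The usual fix is to point clients only at the DC and have the DC forward everything else to public DNS.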

    Read the article

  • Auspex LFS backups

    - by user1250465
    I have some backup tapes which existed on an Auspex file server. The backups were written to tape with the SunOS version of the cpio command. Now that I need to restore them (of course there are no more Auspex servers in existence), the backups won't restore because the headers are not standard. I have dumped the tape images to disk. pax, cpio, and tar cannot read the images; I've tried all of the cpio format options. The errors I get are "name too long", "byte swapped in header", or just junk output. I can open up the images and read the contents of the files, but cannot restore them. I have found that SunOS had a special header in cpio v2.5 images. I have found the source for cpio; now I need the definition of the SunOS header inside cpio.
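
    For reference, the classic (pre-SVR4) binary cpio header is 26 bytes of thirteen 16-bit words with magic 070707 octal, and "byte swapped in header" usually means the tape was written on hardware of the other endianness. A sketch that probes a dumped image for the magic in both byte orders; whatever the SunOS/Auspex variant adds would show up as extra bytes around this layout:

        import struct

        MAGIC = 0o070707  # 0x71C7: old binary cpio magic

        # mtime and filesize are each split across two 16-bit words.
        FIELDS = ("magic", "dev", "ino", "mode", "uid", "gid", "nlink", "rdev",
                  "mtime_hi", "mtime_lo", "namesize", "filesize_hi", "filesize_lo")

        def read_header(f, fmt):
            raw = f.read(26)
            return dict(zip(FIELDS, struct.unpack(fmt, raw))) if len(raw) == 26 else None

        with open("tape.img", "rb") as f:  # placeholder: your dumped tape image
            for label, fmt in (("big-endian", ">13H"), ("little-endian", "<13H")):
                f.seek(0)
                hdr = read_header(f, fmt)
                if hdr and hdr["magic"] == MAGIC:
                    name = f.read(hdr["namesize"]).rstrip(b"\0")
                    size = (hdr["filesize_hi"] << 16) | hdr["filesize_lo"]
                    print(f"{label}: first entry {name!r}, filesize={size}")

    If neither order matches at offset 0, the SunOS header likely prepends or extends this structure; scanning the image for 0x71C7/0xC771 and diffing the surrounding bytes against the layout above is one way to reverse-engineer it.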

    Read the article

  • Which file format to use for simple video editing

    - by Craig HB
    I've just bought a Sanyo VPC-CG10 camcorder, which saves clips as MPEG-4 (specifically, high-definition 720p MPEG-4 AVC/H.264). I want a simple, cheap way to edit the clips and, as I'm using Windows Vista, I thought Windows Movie Maker would be a good choice. But you can't import MP4 files into Windows Movie Maker. I've tried various codecs, but I'm not sure what is the best format to convert to so that I can edit the files in Windows Movie Maker: AVI, MPG, WMV... I'm not sure which format is best. Should I try to keep the movie clips in MPEG-4 and edit them in that format (and then which editor do I use?), or is there a better format that I should convert to before editing (and what format is that)?

    Read the article

  • nagios: trouble using check_smtps command

    - by ethrbunny
    I'm trying to use this command to check port 587 on my Postfix server. Using nmap -P0 mail.server.com I see this:

        Starting Nmap 5.51 ( http://nmap.org ) at 2013-11-04 05:01 PST
        Nmap scan report for mail.server.com (xx.xx.xx.xx)
        Host is up (0.0016s latency).
        rDNS record for xx.xx.xx.xx: another.server.com
        Not shown: 990 closed ports
        PORT     STATE SERVICE
        22/tcp   open  ssh
        25/tcp   open  smtp
        110/tcp  open  pop3
        111/tcp  open  rpcbind
        143/tcp  open  imap
        465/tcp  open  smtps
        587/tcp  open  submission
        993/tcp  open  imaps
        995/tcp  open  pop3s
        5666/tcp open  nrpe

    So I know the relevant ports for SMTPS (465 or 587) are open. When I use openssl s_client -connect mail.server.com:587 -starttls smtp I get a connection with all the various SSL info (same for port 465). But when I try libexec/check_ssmtp -H mail.server.com -p587 I get:

        CRITICAL - Cannot make SSL connection.
        140200102082408:error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol:s23_clnt.c:699:

    What am I doing wrong?
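
    The error itself points at the cause: port 587 (submission) speaks plain SMTP and only upgrades to TLS after a STARTTLS command, whereas the check is opening a raw SSL connection as if it were port 465 (smtps), so the plain-text SMTP banner looks like an "unknown protocol" to the SSL handshake. Python's standard smtplib shows the difference (hostname is a placeholder):

        import smtplib

        HOST = "mail.server.com"  # placeholder

        # Port 465: TLS from the very first byte ("implicit TLS" / smtps).
        with smtplib.SMTP_SSL(HOST, 465, timeout=10) as s:
            print("465:", s.noop())

        # Port 587: plain SMTP greeting first, then an explicit upgrade.
        # Opening this port with raw SSL (as check_ssmtp does) fails with
        # exactly the handshake error quoted above.
        with smtplib.SMTP(HOST, 587, timeout=10) as s:
            s.starttls()
            print("587:", s.noop())

    So keep the SSL-from-the-start check on 465, and monitor 587 with a check that issues STARTTLS (recent monitoring-plugins builds of check_smtp have an option for this).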

    Read the article

  • CPU Affinity on ARM processors

    - by dsljanus
    I am using some Raspberry Pi boards for a data acquisition system. They are nice boards, with plenty of community support around them, but they are really slow. I am thinking of gradually replacing them with ODROID multicore boards with Samsung Exynos processors. I have some experience using taskset to set CPU affinity on my servers, because I am always running Node.js applications that are by definition single-threaded. Now, is it possible to do this on an ARM board? I do not see why not in theory, but I have doubts over how well it is going to work. Does anyone have experience with this kind of hack? Also, I would appreciate any comments about ARM CPUs and how they differ from x86.
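
    CPU affinity is a Linux-kernel feature rather than an x86 one, so taskset should behave the same on an SMP-capable ARM board: it is the same sched_setaffinity() syscall underneath. Python's standard library exposes it too, which makes for a quick sanity check on the board itself (PID 0 means the calling process):

        import os

        # Which CPUs may this process run on? On a quad-core Exynos this
        # should print {0, 1, 2, 3}.
        print("allowed CPUs:", os.sched_getaffinity(0))

        # Pin this process to core 0, as `taskset -c 0` would.
        os.sched_setaffinity(0, {0})
        print("now pinned to:", os.sched_getaffinity(0))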

    Read the article

  • Can someone explain the physical architecture of RAID 10 in complete layman's terms?

    - by Hank
    I am a newbie in the world of storage, and I am having a hard time digesting the physical architecture of some of the RAID levels. I am particularly interested in RAID 10 and 50. I asked the question specifically about RAID 10 because I feel that if I understand that, I'll understand the other. So, I get the definition of RAID 10: "minimum 4 disks, a striped array whose segments are mirrored". If I've got 4 disks, and disks 1 and 2 are a mirrored pair, and disks 3 and 4 are a mirrored pair, where does the data get striped? Thanks.
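
    For what it's worth, the layout is easy to see in a toy model: the two mirrored pairs behave like two "virtual disks", and the stripe alternates data blocks between them, so striping happens across the pairs rather than across individual drives. A sketch for the 4-disk case in the question:

        # Toy RAID 10 layout: disks 1+2 form mirrored pair A, disks 3+4 pair B.
        PAIRS = {"A": ["disk1", "disk2"], "B": ["disk3", "disk4"]}

        def place(block_no):
            pair = "A" if block_no % 2 == 0 else "B"  # stripe across the pairs
            return pair, PAIRS[pair]

        for b in range(4):
            pair, disks = place(b)
            print(f"block {b} -> pair {pair}, written to {disks[0]} AND {disks[1]}")

        # block 0 -> pair A, written to disk1 AND disk2
        # block 1 -> pair B, written to disk3 AND disk4
        # block 2 -> pair A, ...  (and so on, alternating)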

    Read the article

  • Send mail from a distribution group's email address

    - by Campo
    A user has send permission on a distribution group on a Windows Server 2003 domain. I am the admin. When either of us sends email using the distribution group's email address, we get a non-delivery report:

        Your message did not reach some or all of the intended recipients.
        Subject: TEST
        Sent: 4/19/2010 4:46 PM
        The following recipient(s) cannot be reached:
        [email protected] on 4/19/2010 4:46 PM
        You do not have permission to send to this recipient. For assistance, contact your system administrator.
        MSEXCH:MSExchangeIS:/DC=local/DC=DOMAIN:SERVERNAME

    Thanks, JC

    Read the article

  • Possibly hacked delicious account suddenly containing links to adult sites etc. - is there an alternative to delicious?

    - by Homunculus Reticulli
    I was searching for some academic publications I saved earlier on delicious. To my astonishment, there are a lot of tags and links in my account that point to adult material from a Brazilian site (Portuguese or Spanish), and lots of links to European sports sites (in German or Dutch). Most of these links are public, and it seems my account has been hacked. I searched the delicious site to see if I could report the issue, and of course there are no contact details. I've had just about enough of delicious (first they changed their login system after being taken over, resulting in me losing over a year's worth of bookmarks; now this). Is there a replacement online bookmarking service someone can recommend?

    Read the article

  • Website loading until initial script finishes

    - by wardy277
    Hi, I have a highly used server (running Plesk). I have some long scripts that take a while to process (huge MySQL database). I have found that in one browser, while a long script runs, I cannot view any other parts of the site until the script finishes; it seems that all the requests go off, but they don't get served until the initial script finishes. I thought this might be a server-wide issue, but it is not: from another computer I can view the site fine, and even on the same computer with a different browser I can navigate fine while the script still loads. I think it must limit the number of requests per session. Is this correct? Is there any way to configure this to allow 2-3 other requests per session? It is really bad that when I am on the phone to a client, having just run a long report, I cannot use the site or follow what they are saying until the page has loaded. Chris

    Read the article

  • How to interpret IOZone results?

    - by homer5439
    Here are the results of running IOZone on an ext3 filesystem on an LVM volume residing on a SAN LUN (it was run with 5 parallel processes).

        Throughput report: Y-axis is type of test, X-axis is number of processes
        Record size = 4 Kbytes; output is in Kbytes/sec

        Initial write     81628.55
        Rewrite           83354.72
        Read             115595.02
        Re-read          119306.09
        Reverse read      47684.20
        Stride read       10011.09
        Random read       16751.27
        Mixed workload     5659.77
        Random write       1661.85
        Pwrite            36030.83

    Now this is all nice and dandy, but my question is: how do I know whether the values are as good as they could be, or whether there is something to tweak (and if so, what)? The actual use I will have for that logical volume is to act as a virtual disk for a VM.
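
    One way to make those figures comparable to an array's spec sheet is to convert them to MB/s and, since the record size is 4 KB, to IOPS. A quick conversion of the numbers above (the comments are rough rules of thumb, not a verdict on this particular SAN):

        RECORD_KB = 4  # IOZone record size used in the run

        results_kb_s = {
            "Initial write": 81628.55, "Rewrite": 83354.72,
            "Read": 115595.02, "Re-read": 119306.09,
            "Reverse read": 47684.20, "Stride read": 10011.09,
            "Random read": 16751.27, "Mixed workload": 5659.77,
            "Random write": 1661.85, "Pwrite": 36030.83,
        }

        for test, kb_s in results_kb_s.items():
            print(f"{test:15s} {kb_s / 1024:8.1f} MB/s {kb_s / RECORD_KB:9.0f} IOPS")

        # Sequential throughput (~80-115 MB/s) is in line with a single
        # GbE/FC path, while random write (~415 IOPS) is where SAN LUNs
        # typically bottleneck; those are the numbers to compare against
        # the array's specification and the VM's expected workload.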

    Read the article

  • Sound card problem, no audio device detected

    - by Paul
    I bought a new sound card because my built-in sound card did not function. When I open YouTube, Media Player or anything that can create a sound, my computer hangs, and sometimes when I start my computer it hangs when the Windows XP startup sound plays. Update: My computer has no audio; it says NO AUDIO DEVICE. I have already installed the Realtek AC'97 and Realtek High Definition Audio drivers, and I also copied stream.dll to the Windows and System32 folders and restarted my computer, but it still says NO AUDIO DEVICE. Please help me. Thanks.

    Read the article

  • Internet Explorer Automation: how to suppress Open/Save dialog?

    - by Vladimir Dyuzhev
    When controlling an IE instance via MSHTML, how do you suppress the Open/Save dialogs for non-HTML content?

    I need to get data from another system and import it into ours. Due to budget constraints no development (e.g. a web service) can be done on the other side for some time, so my only option for now is web scraping. The remote site is ASP.NET-based, so simple HTML requests won't work -- too much JS. I wrote a simple C# application that uses MSHTML and SHDocVw to control an IE instance. So far so good: I can perform the login, navigate to the desired page, populate the required fields and submit. Then I face a couple of problems:

    First, the report opens in another window. I suspect I can attach to that window too by enumerating IE windows in the system.

    Second, and more troublesome, the report itself is a CSV file and triggers the Open/Save dialog. I'd like to avoid that and make IE save the file to a given location, OR I'm fine with programmatically clicking the dialog buttons too (how?).

    I'm actually a totally non-Windows guy (unix/J2EE), and hope someone with better knowledge can give me a hint how to do those tasks. Thanks!

    UPDATE: I've found a promising document on MSDN: http://msdn.microsoft.com/en-ca/library/aa770041.aspx "Control the kinds of content that are downloaded and what the WebBrowser Control does with them once they are downloaded. For example, you can prevent videos from playing, script from running, or new windows from opening when users click on links, or prevent Microsoft ActiveX controls from downloading or executing." Slowly reading through...

    UPDATE 2: MADE IT WORK, SORT OF...

    Finally I made it work, but in an ugly way. Essentially, I register a "before navigate" handler; in the handler, if the URL matches my target file, I cancel the navigation but remember the URL, and use the WebClient class to access and download that temporary URL directly. I cannot copy the whole code here (it contains a lot of garbage), but here are the essential parts:

    Installing the handlers:

        _IE2.FileDownload += new DWebBrowserEvents2_FileDownloadEventHandler(IE2_FileDownload);
        _IE.BeforeNavigate2 += new DWebBrowserEvents2_BeforeNavigate2EventHandler(IE_OnBeforeNavigate2);

    Recording the URL and then cancelling the download (thus preventing the Save dialog from appearing):

        public string downloadUrl;

        void IE_OnBeforeNavigate2(Object ob1, ref Object URL, ref Object Flags, ref Object Name,
                                  ref Object da, ref Object Head, ref bool Cancel)
        {
            Console.WriteLine("Before Navigate2 " + URL);
            if (URL.ToString().EndsWith(".csv"))
            {
                Console.WriteLine("CSV file");
                downloadUrl = URL.ToString();
            }
            Cancel = false;
        }

        void IE2_FileDownload(bool activeDocument, ref bool cancel)
        {
            Console.WriteLine("FileDownload, downloading " + downloadUrl + " instead");
            cancel = true;
        }

        void IE_OnNewWindow2(ref Object o, ref bool cancel)
        {
            Console.WriteLine("OnNewWindow2");
            _IE2 = new SHDocVw.InternetExplorer();
            _IE2.BeforeNavigate2 += new DWebBrowserEvents2_BeforeNavigate2EventHandler(IE_OnBeforeNavigate2);
            _IE2.Visible = true;
            o = _IE2;
            _IE2.FileDownload += new DWebBrowserEvents2_FileDownloadEventHandler(IE2_FileDownload);
            _IE2.Silent = true;
            cancel = false;
            return;
        }

    And in the calling code, using the found URL for a direct download:

        ...
        driver.ClickButton(".*_btnRunReport");
        driver.WaitForComplete();
        Thread.Sleep(10000);
        WebClient Client = new WebClient();
        Client.DownloadFile(driver.downloadUrl, "C:\\affinity.dump");

    (driver is a simple wrapper over the IE instance = _IE)

    Hope that helps someone.

    Read the article
