Search Results

Search found 23808 results on 953 pages for 'c source'.

Page 592/953

  • Webmaster Tools is throwing 404 errors for a link that is not on the page

    - by plantify
    Webmaster Tools is showing thousands of 404 errors, where pages on the site are reported as linking to an incorrect URL. For example: URL not found www.plantify.co.uk/shop/=, linked from http://www.plantify.co.uk/shop/gift-voucher and http://www.plantify.co.uk/shop/special-plant-offers. I have obviously checked the source and cannot find any references to this link on any page. The only consistent pattern is that the error is only reported for pages with two path segments: www.plantify.co.uk/shop does not report any errors, whilst pages like www.plantify.co.uk/shop/xxx (where xxx can be several different pages, such as gift-voucher) all do. I cannot seem to duplicate this error. I have run a link checker (we use Screaming Frog) and it does not report it, and I have fetched these pages as a bot without seeing it either. I am at a total loss: I cannot even duplicate the issue, but it is most definitely real, as Webmaster Tools is reporting new errors every day. Is this perhaps Googlebot doing its own thing?

    Read the article

  • How do 2D physics engines solve the problem of resolving collisions along tiled walls/floors in non-grid-based worlds?

    - by ssb
    I've been working on implementing my SAT algorithm which has been coming along well, but I've found that I'm at a wall when it comes to its actual use. There are plenty of questions regarding this issue on this site, but most of them either have no clear, good answer or have a solution based on checking grid positions. To restate the problem that I and many others are having, if you have a tiled surface, like a wall or a floor, consisting of several smaller component rectangles, and you traverse along them with another rectangle with force being applied into that structure, there are cases where the object gets caught on a false collision on an edge that faces the inside of the shape. I have spent a lot of time thinking about how I could possibly solve this without having to resort to a grid-based system, and I realized that physics engines do this properly. What I want to know is how they do this. What do physics engines do beyond basic SAT that allows this kind of proper collision resolution in complex environments? I've been looking through the source code to Box2D trying to find out how they do it but it's not quite as easy as looking at a Collision() method. I think I'm not good enough at physics to know what they're doing mathematically and not good enough at programming to know what they're doing programmatically. This is what I aim to fix.
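
    Box2D, which you mention, addresses exactly this with edge and chain shapes that carry "ghost vertex" information about neighbouring segments, so contacts whose normals point along an internal, shared edge can be rejected during narrow-phase collision. Below is a minimal, hypothetical sketch of that filtering idea for axis-aligned tiles - it is not Box2D's code, and the names (overlap, resolve_aabb, solid_neighbours) are made up for illustration. The key point is that a tile face which butts up against another solid tile is never offered as a resolution direction, so the mover cannot get snagged on it.

        # Minimal sketch: SAT-style resolution of a moving AABB against one tile,
        # skipping any face of the tile that is shared with a solid neighbour.
        # Screen coordinates are assumed (y grows downwards); boxes are dicts
        # with 'x', 'y', 'w', 'h'.

        def overlap(a, b):
            """Return (dx, dy) penetration depths of AABB a into AABB b, or None."""
            dx = min(a['x'] + a['w'], b['x'] + b['w']) - max(a['x'], b['x'])
            dy = min(a['y'] + a['h'], b['y'] + b['h']) - max(a['y'], b['y'])
            if dx <= 0 or dy <= 0:
                return None
            return dx, dy

        def resolve_aabb(mover, tile, solid_neighbours):
            """Push mover out of tile along the smallest penetration axis whose
            exit face is actually exposed. solid_neighbours is e.g.
            {'left': False, 'right': True, 'up': False, 'down': True}."""
            pen = overlap(mover, tile)
            if pen is None:
                return
            dx, dy = pen

            # Which face the mover would naturally exit through on each axis.
            mover_cx = mover['x'] + mover['w'] / 2.0
            mover_cy = mover['y'] + mover['h'] / 2.0
            tile_cx = tile['x'] + tile['w'] / 2.0
            tile_cy = tile['y'] + tile['h'] / 2.0
            x_dir, x_face = (-1, 'left') if mover_cx < tile_cx else (+1, 'right')
            y_dir, y_face = (-1, 'up') if mover_cy < tile_cy else (+1, 'down')

            # Only faces that are not buried inside the tiled surface are candidates.
            options = []
            if not solid_neighbours.get(x_face, False):
                options.append(('x', x_dir, dx))
            if not solid_neighbours.get(y_face, False):
                options.append(('y', y_dir, dy))
            if not options:
                return
            axis, direction, depth = min(options, key=lambda o: o[2])
            mover[axis] += direction * depth

    Real engines apply the same adjacency test to contact normals rather than whole faces, but the principle is the same: use knowledge of the neighbouring geometry to discard collisions with edges that only exist inside the composite shape.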

    Read the article

  • Choosing the Database Solution for Large Data Application

    - by GµårÐïåñ
    I have been tasked to write an application, a combination of document and inventory management in VB.NET, which will be used to store document images in TIFF, PDF, XPS, TXT, DOC, PPT and so on as binary data that can be retrieved for viewing, printing, and possibly OCR to make them searchable, along with metadata such as sender, recipient, type of document, date, source, etc. So the table would probably be something like: DOC_NAME, DOC_DATE, NOTES, ... DOC_BINARY (where the actual document will be stored). My concern is finding a database solution that will not become unstable due to size restrictions, record limitations, or performance. Some of the options are MS SQL, SQL Express, SQLite, MySQL, and Access. Now I can pretty much eliminate Access right off the bat, as it is just too limiting and not scalable. I can further eliminate SQL Express because of the 2 GB limit, and again scalability. So that leaves me with MS SQL, SQLite, and MySQL (although if anyone has other options they think would be good as well, please feel free to share them; by no means am I set on these only). So this brings me to what you guys think is the best option for what I have described. The goal is that the data is all in one place (a single file), which will make backup and portability easier. For small-volume usage pretty much any solution will hold for a while, but my goal is to think ahead and make sure it's able to withstand heavy, large-volume usage as well. Another consideration is interoperability with .NET and the stability of such code, to avoid errors and memory leaks. Your feedback would be greatly appreciated.
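
    On the single-file point: that requirement is essentially SQLite's design goal - the whole database, BLOBs included, lives in one portable file. Purely as an illustration of the schema idea (a Python sketch rather than VB.NET, with hypothetical file and column names based on the table sketched above), storing a document plus its metadata could look like the following; from .NET the same database is reachable through a provider such as System.Data.SQLite. Do weigh SQLite's documented limits (single-writer concurrency, a default cap of roughly 1 GB per BLOB) against the expected document sizes and load before settling on it.

        # A minimal, hypothetical sketch of the single-file idea using SQLite.
        import sqlite3

        conn = sqlite3.connect('documents.db')   # the entire database lives in this one file
        conn.execute("""
            CREATE TABLE IF NOT EXISTS documents (
                id         INTEGER PRIMARY KEY,
                doc_name   TEXT NOT NULL,
                doc_date   TEXT,
                sender     TEXT,
                recipient  TEXT,
                doc_type   TEXT,
                notes      TEXT,
                doc_binary BLOB           -- the raw TIFF/PDF/XPS/... bytes
            )""")

        with open('invoice-0042.tif', 'rb') as f:   # placeholder file name
            payload = f.read()

        conn.execute(
            "INSERT INTO documents (doc_name, doc_date, doc_type, doc_binary) "
            "VALUES (?, ?, ?, ?)",
            ('invoice-0042.tif', '2013-01-15', 'invoice', sqlite3.Binary(payload)))
        conn.commit()
        conn.close()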

    Read the article

  • Rsync fails for files that start with underscore when destination is zfs

    - by Eric
    Hi everyone. I'm using rsync 3.1.0pre1 on Mac OS X 10.8.5, and am trying to rsync one folder to another. The destination is a ZFS volume mounted via SMB. The problem I'm having is that files that start with an underscore (e.g. '_filename.jpg') are not being successfully synced to the destination. I get the following error message:
        rsync: mkstemp "/path/to/destination/._filename.jpg.NUgYJw" failed: Permission denied (13)
    In this case, '_filename.jpg' does not make it to the destination. I understand that rsync creates hidden, temporary files at the destination which are prefixed with '.' and have a random extension appended on the end. But the original filename starts with '_', not '.', and I haven't asked rsync to copy extended attributes / resource forks over (unless it always does). The rsync command I'm using is:
        rsync -avE --exclude='.DS_Store' --exclude '.Trash' --exclude 'Thumbs.db' --exclude '._*' --delete /source/ /destination/
    Has anyone found a way around this problem? Thank you!

    Read the article

  • MD5 and SHA1 checksum uses for downloading

    - by Zac
    I notice that when downloading a lot of open source tools (Eclipse, etc.) there are links for MD5 and SHA1 checksums, and I didn't know what these were or what their purpose was. I know these are hashing algorithms, and I do understand hashing, so my only guess is that they are used for hashing some component of the download targets, to compare them with "official" hash strings stored server-side. Perhaps that way it can be determined whether or not the targets have been modified from their correct version (for security and other purposes). Am I close or completely wrong, and if wrong, what are they?! Thanks!
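
    That guess is essentially right: the project publishes the digest of the exact bytes it released, you hash the file you actually downloaded, and if the two strings differ the file was corrupted in transit or tampered with (with the caveat that MD5, and increasingly SHA-1, are no longer considered strong against deliberate tampering, only against accidental corruption). On Linux the md5sum and sha1sum tools do this in one line; a rough Python equivalent is sketched below - the file name and expected digest are placeholders.

        # Sketch: hash a downloaded file and compare it against the published checksum.
        import hashlib

        def file_digest(path, algorithm='sha1', chunk_size=1024 * 1024):
            h = hashlib.new(algorithm)
            with open(path, 'rb') as f:
                for chunk in iter(lambda: f.read(chunk_size), b''):
                    h.update(chunk)
            return h.hexdigest()

        expected = 'da39a3ee5e6b4b0d3255bfef95601890afd80709'   # value copied from the download page
        actual = file_digest('eclipse-jee-linux-gtk.tar.gz')     # the file you downloaded
        print('OK' if actual == expected else 'MISMATCH - re-download the file')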

    Read the article

  • Issues with ASUS S301L ultrabooks

    - by Wuerze
    I would just like to tell you what is still a problem in Ubuntu 14.04 and should be solved for a nice experience of this distribution. Unfortunately I do not have any hints to give you, because I am just a user and do not know much about programming. Anyway, I hope it helps.
    1) Intel open source graphics issues (proprietary)
    - when attaching an HDMI cable the screen switches to the external monitor and it's fine, but there is no switching back unless I restart with the HDMI cable detached
    - a black screen appears for about 2 seconds, as if I had changed screens in the settings; the frequency increases with the intensity of driver use (i.e. videos or video games)
    2) Hotkeys (commonly recognised)
    - there is no possibility to adjust brightness with the hotkeys (Fn+F5 and Fn+F6)
    3) Touchscreen (commonly recognised)
    - the touchscreen behaves like a mouse
    - configuring gestures for the multi-touch screen seems to always end in disabling touchpad gestures
    4) Panel symbols
    - the battery symbol has low accuracy, although it determines the remaining energy accurately; only 5 steps of energy are shown by the symbol
    Thank you all for participating in the solution of these problems! I will stay tuned and edit this list if something has been solved.

    Read the article

  • Did you know documentation is built into usp_ssiscatalog?

    - by jamiet
    I am still working apace on updates to my open source project SSISReportingPack; specifically, I am working on improvements to usp_ssiscatalog, which is a stored procedure that eases the querying and exploration of the data in the SSIS Catalog. In this blog post I want to share a titbit of information about usp_ssiscatalog: all the actions that you can take when you execute usp_ssiscatalog are documented within the stored procedure itself. For example, if you simply execute EXEC usp_ssiscatalog @action='exec' in SSMS and then switch over to the Messages tab, you will see some information about the action. OK, that's kinda cool. But what if you only want to see the documentation and don't actually want any action to take place? Well, you can do that too using the @show_docs_only parameter like so:
        EXEC dbo.usp_ssiscatalog @a='exec',@show_docs_only=1;
    That will only show the documentation. Wanna read all of the documentation? That's simply:
        EXEC dbo.usp_ssiscatalog @a='exec',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='execs',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='configure',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_created',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_running',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_canceled',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_failed',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_pending',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_ended_unexpectedly',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_succeeded',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_stopping',@show_docs_only=1;
        EXEC dbo.usp_ssiscatalog @a='exec_completed',@show_docs_only=1;
    I hope that comes in useful for you sometime. Have fun exploring the documentation on usp_ssiscatalog. If you think the documentation can be improved please do let me know. @jamiet

    Read the article

  • Reverse Proxy to filter out js files from multiple hosts in nginx

    - by stwissel
    I have a website http://someplace.acme.com that I want my users to access via http://myplace.mycorp.com - pretty standard reverse proxy setup. The special requirement: any js file - identified by the .js extension and/or (if that is possible) the mime-type text/javascript - needs to be served from a different location, a local tool that inspects the js for potential threats. So I have:
        location / {
            proxy_pass http://someplace.acme.com;
            proxy_next_upstream error timeout invalid_header http_500 http_502 http_503 http_504;
            proxy_redirect off;
            proxy_buffering off;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }

        location ~* \.(js)$ {
            proxy_pass http://127.0.0.1:8188/filter?source=$1;
            proxy_redirect off;
            proxy_buffering off;
        }
    The JS is still served from the remote host, and I have no idea how to check for the mime type. What am I missing?

    Read the article

  • Windows Filtering Platform not turning off until admin logon. Win2008R2sp1

    - by rjt
    Just installed Windows Server 2008 R2 SP1 to see if it would fix this problem, but it didn't. Until an administrator logs onto the domain controller, there are many events saying that WFP blocked a connection from Server60 to Server60 or from Server60 to Server70. Both Server60 and Server70 are the domain controllers. Once the admin logs on, the WFP events stop. The firewall is off by default GPO. Yes, I know that WFP kicks in during the boot-up sequence until the firewall takes over - or, in my case, does not take over (since Vista) - but I clearly should not have to auto-logon to a domain controller and auto-lock it or something. Example event:
        Level = Information
        Source = Microsoft Windows Security Auditing
        EventID = 5152 "Filtering Platform Packet Drop" (and its evil twin, 5157 "Filtering Platform Connection")
        "The Windows Filtering Platform has blocked a connection."
        Direction          %%14593
        SourceAddress      192.168.10.60
        SourcePort         49677
        DestAddress        192.168.10.60
        DestPort           389
        Protocol           6
        FilterRTID         65667
        LayerName          %%14611
        LayerRTID          48
        RemoteUserID       S-1-0-0
        RemoteMachineID    S-1-0-0
    Tags: windows-server-2008-r2, WFP, BFE, WindowsFilteringPlatform, BaseFilteringEngine

    Read the article

  • Custom Transport Agent: How do I collect NDRs and all other undeliverables in Exchange 2010 from the Postmaster?

    - by makerofthings7
    I'm trying to collect all NDRs in a single mailbox for all invalid recipients, and anything that fails for any reason. I have a custom transport agent that I've written myself, which appears here:
        [PS] C:\Windows\system32>Get-TransportAgent

        Identity                         Enabled   Priority
        --------                         -------   --------
        Transport Rule Agent             True      1
        Text Messaging Routing Agent     True      2
        Text Messaging Delivery Agent    True      3
        Routing Rule Agent               True      4
    ****
    Sometimes when I run get-messagetrackinglog I get failures like the one below:
        RunspaceId              : 4ecc61fb-13b9-4506-b680-577222c9bf21
        Timestamp               : 10/14/2013 12:42:42 PM
        ClientIp                :
        ClientHostname          : Exchange1
        ServerIp                :
        ServerHostname          :
        SourceContext           : Routing Rule Agent
        ConnectorId             :
        Source                  : AGENT
        EventId                 : FAIL
        InternalMessageId       : 4416
        MessageId               : <[email protected]>
        Recipients              : {[email protected]}
        RecipientStatus         : {}
        TotalBytes              : 4542
        RecipientCount          : 1
        RelatedRecipientAddress :
        Reference               :
        MessageSubject          : review CGRC due diligence.
        Sender                  : [email protected]
        ReturnPath              : [email protected]
        MessageInfo             :
        MessageLatency          :
        MessageLatencyType      : None
        EventData               :
    How can I collect the NDRs in a single mailbox for review? I have already set the following, but it has had no effect:
        [PS] C:\>set-TransportConfig -JournalingReportNdrTo [email protected] -ExternalPostmasterAddress [email protected]

    Read the article

  • Recommend web file-sharing software, please

    - by Baczek
    I'm looking for a web platform to put company files on. My requirements are:
    - should be accessible via a browser
    - should be open source
    - must be installable (Dropbox is a no-go)
    - must have an option to put an access time limit on a file
    - must perform garbage collection automatically after a file expires
    - must be able to mark files as public or private
    - an option to protect a file via a pin code for users without accounts in the system would be nice to have
    The problem is I don't even know what to search for - all my googling results in either complete groupware solutions or P2P file sharing software. If such a thing doesn't exist, please don't hesitate to say so, so I can crawl to a corner and cry myself to sleep. TIA

    Read the article

  • Access Control Service: Programmatically Accessing Identity Provider Information and Redirect URLs

    - by Your DisplayName here!
    In my last post I showed you that different redirect URLs trigger different response behaviors in ACS. Where did I actually get these URLs from? The answer is simple – I asked ACS ;) ACS publishes a JSON encoded feed that contains information about all registered identity providers, their display names, logos and URLs. With that information you can easily write a discovery client which, at the very heart, does this:
        public void GetAsync(string protocol)
        {
            var url = string.Format(
                "https://{0}.{1}/v2/metadata/IdentityProviders.js?protocol={2}&realm={3}&version=1.0",
                AcsNamespace,
                "accesscontrol.windows.net",
                protocol,
                Realm);

            _client.DownloadStringAsync(new Uri(url));
        }
    The protocol can be one of these two values: wsfederation or javascriptnotify. Based on that value, the returned JSON will contain the URLs for either the redirect or notify method. Now with the help of some JSON serializer you can turn that information into CLR objects and display them in some sort of selection dialog. The next post will have a demo and source code.
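
    As an aside, because the feed is plain JSON over HTTPS, the same discovery works from any HTTP client, not just the WebClient used above. Here is a rough Python sketch for poking at the feed interactively - the namespace and realm values are placeholders, and the exact shape of each returned entry should be checked against the live response rather than assumed.

        # Sketch: fetch and dump the ACS identity provider feed described above.
        import json
        import urllib.request

        def get_identity_providers(acs_namespace, realm, protocol='wsfederation'):
            url = ('https://{0}.accesscontrol.windows.net/v2/metadata/IdentityProviders.js'
                   '?protocol={1}&realm={2}&version=1.0').format(acs_namespace, protocol, realm)
            with urllib.request.urlopen(url) as response:
                return json.loads(response.read().decode('utf-8'))

        for provider in get_identity_providers('mynamespace', 'http://localhost/myapp/'):
            print(provider)   # each entry carries the provider's display name, logo and login URL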

    Read the article

  • Mail sent from local Postfix marked as "possible phishing" in Outlook

    - by leo grrr
    Hi folks, Sorry for the newbie question--this is not my area of expertise by a long shot. I work at a small development shop and we finally got around to doing code reviews. (Yay!) I set up an instance of Review Board -- an open-source code review tool -- on one of our local servers but it doesn't seem to like talking to our hosted Exchange server to send notification emails. I decided to just install Postfix on that same box and send mail from localhost, which is working much more reliably, but Outlook disables all links in the email announcements and marks it as possible phishing. What is making these emails look suspicious and what can I change? Would the best thing be to figure out how to relay to Exchange from Postfix? Thanks!

    Read the article

  • .VBS scripts have stopped running... No idea why!

    - by Django Reinhardt
    We have two .vbs scripts, run by our Task Scheduler, that have suddenly stopped working for no reason we can fathom. We haven't significantly altered our system configuration in the last 24 hours, and the scripts have run without a hitch for months. According to the Task Scheduler the scripts just keep running and never stop, which is never the case. I stopped all running versions through the Scheduler and manually attempted to run one of the .vbs scripts. I got the following error message:
        Line: 15
        Error: The system cannot locate the resource specified.
        Code: 800C0005
        Source: msxml3.dll
    Line 15 (or 16 to be more accurate - line 15 itself is blank, but so is line 1) is:
        xml.Send
    What could have suddenly caused this? Looking in system32\ and sysWOW64\ shows that msxml3.dll exists. Anybody got any ideas? Thanks a lot!

    Read the article

  • Proper library for enums

    - by Bobson
    I'm trying to refactor some code such that the display is separate from the implementation, and I'm not sure where to put the existing enums. My project is currently structured as follows:
    - Utilities
    - RemoteData (Depends on: Utilities)
    - LocalData (Depends on: RemoteData, Utilities)
    - RemoteWeb (Depends on: RemoteData, Utilities)
    - LocalWeb (Depends on: RemoteData, LocalData, Utilities)
    I'm now trying to add "ViewLibrary (Depends on: Utilities)" to this list, and then add it as a new dependency to both RemoteWeb and LocalWeb. It will contain a set of interfaces which the other two projects will implement, use to populate the view, and then consume the result. There's an enum which is currently used in all the projects except Utilities. It thus lives in the RemoteData project, because everything else depends on it. But this new ViewLibrary won't depend on either data project. So how will it know about this enum? Some options I see:
    1. Create a new project just for shared enum values.
    2. Add it to Utilities, even though it is related to data.
    3. Define it a second time in ViewLibrary, and require both RemoteWeb and LocalWeb to convert the one type into the other when they access the shared views.
    4. Add a dependency on RemoteData to the ViewLibrary, even though it's supposed to be independent of the data source.
    Are there any better options? Is this structure flawed to begin with?

    Read the article

  • Recommend a Rackspace Cloud Server API Language Binding?

    - by Alex R
    Rackspace publishes only a hard-to-use HTTP and JSON/XML based "API" (they call it an API but it's really a non-standard Web Service without a WSDL). There are dozens of open-source language bindings to choose from. I have tried three of them so far and they're all horrible (incomplete, buggy, and/or undocumented). Can anybody recommend a language binding which is reasonably complete, well documented, and bug-free? I can use Perl, Python, PHP, or Java. My ultimate objective is to create a script/program that will provision a server, launch a process inside it, wait for the process to finish, copy the results to the local server, and destroy the remote server. What's the best choice for that? Thanks

    Read the article

  • Complex nagios command

    - by gonvaled
    I have defined the following command for one of my service checks:
        define command{
            command_name    mycommand
            command_line    $USER1$/check_by_ssh -p $ARG1$ -l nagios -i /etc/nagios2/keys/key1 -H $HOSTADDRESS$ -v -C 'source $USER10$ ; command.py -a get --alert-name $ARG2$ -q'
        }
    The problem is that Nagios seems to be parsing the command at the semicolon and producing garbage which cannot be executed. I have also tried putting a backslash (\;), to no avail. If I run the command directly in the shell, it works, which means that this is not a problem with check_by_ssh, but a problem in the parsing of the Nagios configuration file. How can I debug this? Is there a way to get a listing of all the commands that Nagios has parsed when reading the configuration files?

    Read the article

  • I don't program in my spare time. Does that make me a bad developer?

    - by not-my-real-name
    A lot of blogs and advice on the web seem to suggest that in order to become a great developer, doing just your day job is not enough. For example, you should contribute to open source projects in your spare time, write smartphone apps, etc. In fact a lot of this advice seems to suggest that if you don't love programming enough to do it all day long then you're probably in the wrong career. That doesn't ring true with me. I enjoy my work, but when I come home from the office I'm not in the mood to jump straight back onto the computer and start coding away until bedtime. I only have a certain number of hours free time each day, and I'd rather spend them on other hobbies, seeing friends or going outside than in front of the computer. I do get a kick out of programming, and do hack around outside of work occasionally. I'm committed to my personal development and spend time reading tech blogs and books as a way to keep learning and becoming better. But that doesn't extend so far as to my wanting to use all my spare time for coding. Does this mean I'm not a 'true' software developer at heart? Is it possible to become a good software developer without doing extra outside your job? I'd be very interested to hear what you think.

    Read the article

  • Pull Request Conversations, Inline Diff Enhancements

    [Do you tweet? Follow us on Twitter @matthawley and @adacole_msft]
    We deployed a new version of the CodePlex website today.
    Pull Request Conversations
    Previously, the only way for project members and users who submitted pull requests to converse was via e-mail. This complicated the review process and made conversations isolated and difficult to track. For this release, we’ve added functionality that enables you to have those same conversations within the pull request page. When you view a pull request, you’ll now see “Comments” and “Changes” tabs, with current comments displayed.
    Inline Diff Enhancements
    We tweaked the inline diff experience to make it easier to traverse diff blocks. When you open up the inline diff experience, you’ll now see up and down arrows. To move between the diff blocks, you can use those arrows or utilize the available keyboard shortcuts. Lastly, we have also brought the inline diff experience to the source control changes page for project and fork changesets. You can see both enhancements live by viewing the associated pull request or changeset changes on WikiPlex.
    The CodePlex team values your feedback. We are frequently monitoring Twitter, our Discussions, and Issue Tracker. If you have not visited the Issue Tracker recently, please take a few minutes to suggest or vote on a feature you would like to see implemented.

    Read the article

  • Is there any kind of established architecture for browser based MMO games?

    - by black_puppydog
    I am beginning the development of a browser-based game in which players take certain actions at any point in time. Big parts of gameplay will happen in real life and just have to be entered into the system. I believe a good comparison might be a platform for managing fantasy football, although I have virtually no experience playing that, so please correct me if I am mistaken here. The point is that some events happen in the program (i.e. on the server, out of reach of the players), like pulling new results from some data source, the starting of a new round by a game master, and such. Other events happen in real life (two players closing a deal on the transfer of some team member or whatnot - again: I have never played fantasy football) and have to be entered into the system. The first part is pretty easy, since the game masters will be "staff" and thus can be trusted to a certain degree not to mess with the system. But the second part bothers me quite a lot, especially since the actions may involve multiple steps and interactions with different players, like registering a deal with the system that then has to be approved by the other party, or denied and passed on to a game master to decide. I would of course like to separate the game logic as far as possible from the presentation and basic form validation, but am unsure how to do this in a clean fashion. Of course I could (and will) put some effort into making my own architectural decisions and prototype different ideas. But I am bound to make some stupid mistakes at some point, so I would like to avoid some of that by getting a little "book smart" beforehand. So the question is: is there any kind of architectural work that I can read up on? Papers, blogs, maybe design documents or even source code? Writing this down, this seems more like a business application with business rules, workflows and such... Any good entry points for that?

    Read the article

  • Update gets stuck unpacking a bad package, won't continue without it

    - by Shazzner
    Removing the package from the cache and disabling Recommended Updates in Software Sources gives me an error saying I need to install this package. I've tried to update several times, but it keeps hanging on unpacking the ubuntu-sso-client package, which forces me to hard-reset to unlock the package manager. I've tried:
        sudo dpkg --configure -a
    No errors.
        sudo apt-get upgrade --fix-broken
    Wants me to reinstall said package, resulting in it hanging.
    Removing the package:
        sudo rm -f /var/cache/apt/archives/ubuntu-sso-client_1.0.8-0ubuntu1_all.deb
    results in the same effect: it re-downloads and then hangs.
    I can de-select Recommended Updates, but then I get error messages when I try to update again:
        E: The package ubuntu-sso-client needs to be reinstalled, but I can't find an archive for it.
    which won't let me continue. Finally, re-enabling the source, I try to remove ubuntu-sso-client:
        sudo apt-get remove ubuntu-sso-client
    It removes a bunch of other packages but complains about this one:
        dpkg: error processing ubuntu-sso-client (--remove): Package is in a very bad inconsistent state - you should reinstall it before attempting a removal.
    Reinstalling ubuntu-sso-client hangs :( I'm at my wits' end, any ideas? It would be nice to install all the other updates, but this one is preventing it.

    Read the article

  • What is this video format, and how do I convert it?

    - by OrangeRind
    Description
    I have a big (7.4 GB) .mkv file (1080p) which I want to convert to H.264 (using x264).
    Problem
    MediaCoder and GSpot are unable to detect the codec; they don't display anything, just that the file is a Matroska container video with a MIME type of video/x-matroska. No bitrate, profile, etc. But the source tells me that it is VC-1 encoded.
    Question
    So how do I encode this file - as in, using which encoding software, since MediaCoder has failed?

    Read the article

  • Headphones not working in Ubuntu 12.04 LTS

    - by mursalat
    So after a million warnings about running an unsupported Ubuntu OS, last night I finally upgraded to 12.04 - it went somewhat smoothly (sadly). After my installation I got all excited about the exciting new look when I log in and all the new shebang. I installed Chrome and went to YouTube to check out some of my music and test Flash in the process. The sound worked awesomely; however, when I plugged in my headphones, I could hear only a buzz-like sound when the bass drops or there is a loud noise - maybe I didn't hear it from the headphones but from some other source. Sadly my headphones are not working, and I am a noob at fixing this stuff on Ubuntu. I've done loads of programming, but when it comes to Linux drivers and settings I seem to get frustrated. So I would really appreciate any help, people! IN SUMMARY: My headphones are not working; my laptop's internal speakers are working awesomely. To be as helpful as I can, I have pasted the output from lspci -v to http://pastebin.com/VQNzDkZs. I have also checked the volume levels from alsamixer and none are on mute. If you need any more information, please just ask, and I will be checking this post every 3 to 4 hours! Cheers!

    Read the article

  • Rsync over ssh with root access on both sides

    - by Tim Abell
    Hi, I have one older Ubuntu server and one newer Debian server, and I am migrating data from the old one to the new one. I want to use rsync to transfer the data across, to make the final migration easier and quicker than the equivalent tar/scp/untar process. As an example, I want to sync the home folders one at a time to the new server. This requires root access at both ends, as not all files on the source side are world-readable and the destination has to be written with the correct permissions into /home. I can't figure out how to give rsync root access on both sides. I've seen a few related questions, but none quite match what I'm trying to do. I have sudo set up and working on both servers.

    Read the article

  • How can I measure TCP timeout limit on NAT firewall for setting keepalive interval?

    - by jmanning2k
    A new (NAT) firewall appliance was recently installed at $WORK. Since then, I'm getting many network timeouts and interruptions, especially for operations which would require the server to think for a bit without a response (svn update, rsync, etc.). Inbound SSH sessions over VPN also timeout frequently. That clearly suggests I need to adjust the TCP (and ssh) keepalive time on the servers in question in order to reduce these errors. But what is the appropriate value I should use? Assuming I have machines on both sides of the firewall between which I can make a connection, is there a way to measure what the time limit on TCP connections might be for this firewall? In theory, I would send a packet with gradually increasing intervals until the connection is lost. Any tools that might help (free or open source would be best, but I'm open to other suggestions)? The appliance is not under my control, so I can't just get the value, though I am attempting to ask what it currently is and if I can get it increased.
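
    The "in theory" approach translates directly into a script: hold a connection through the firewall completely idle for a chosen interval, then attempt a round trip, and step the interval upward until the round trip fails; the last interval that survives approximates the idle timeout, and the keepalive interval should sit comfortably below it. A minimal, hypothetical Python sketch follows - the host and port are placeholders, and it assumes something on the far side that echoes a reply (a few lines of Python, or a socat/ncat echo service).

        # Sketch: probe how long an idle TCP connection survives through the firewall.
        import socket
        import time

        HOST, PORT = 'host-on-the-other-side', 9000   # placeholder far-side listener

        def survives_idle(idle_seconds):
            with socket.create_connection((HOST, PORT), timeout=10) as sock:
                time.sleep(idle_seconds)            # no traffic at all during this window
                try:
                    sock.settimeout(10)
                    sock.sendall(b'ping\n')
                    return bool(sock.recv(1))       # far side must echo something back
                except (socket.timeout, OSError):
                    return False

        for idle in (60, 300, 600, 1200, 1800, 3600):
            alive = survives_idle(idle)
            print('idle %5ds: %s' % (idle, 'alive' if alive else 'dropped'))
            if not alive:
                break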

    Read the article
