Search Results

Search found 25422 results on 1017 pages for 'url stream handler'.

Page 9/1017 | < Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >

  • Recovery from URL structure change?

    - by Dejan Pelzel
    In July this year, we changed the URL structure of the website from: Post: domain.com/blog/post/986/dance/heart-beats-dance-video-by-chinatsu/ Category: domain.com/blog/index/cosplay/ to: Post: domain.com/dance/heart-beats-dance-video-by-chinatsu-986/ Category: domain.com/cosplay/ Everything was (supposedly) properly redirected with 301 redirects, and at first it seemed that the traffic returned after a couple of days, but it has now been close to 2 months and things keep getting worse, although Google is slowly indexing the changes. What is worrying me even more is that the Pages crawled per day figure in Webmaster Tools started dropping drastically a few days ago and has just reached a new low in months (from over 2,000 to 700). Should I be worried, or will things sort themselves out eventually?
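
    For reference, a mapping like the one described is commonly expressed with mod_rewrite. The sketch below assumes Apache, that the old paths really follow the /blog/post/<id>/<category>/<slug>/ pattern shown, and that the ID simply moves to the end of the slug; it is illustrative, not the poster's actual configuration:

      RewriteEngine On
      # Old post URLs: /blog/post/<id>/<category>/<slug>/  ->  /<category>/<slug>-<id>/
      RewriteRule ^blog/post/([0-9]+)/([^/]+)/([^/]+)/?$ /$2/$3-$1/ [R=301,L]
      # Old category URLs: /blog/index/<category>/  ->  /<category>/
      RewriteRule ^blog/index/([^/]+)/?$ /$1/ [R=301,L]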

    Read the article

  • URL structure for content that is updated daily

    - by Brendon
    A small, simple site I am working on displays a single page with the day's best offers on it. The user is able to move back and forth between previous days. Which of the following URL structures works best? Structure 1 /index.html -- today's best offers /2013-06-29.html -- yesterday's best offers, etc. Structure 2 /index.html -- 302 redirects to /2013-06-30.html (or whatever today is) /2013-06-30.html -- today's best offers /2013-06-29.html -- yesterday's best offers, etc. I quite like structure 2 from the user's point of view (they can share content easily), but I am a bit concerned about updating the redirect from /index.html every single day -- would this perhaps have unintended SEO consequences?
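
    For what it's worth, the redirect in structure 2 is only a couple of lines under Apache. The sketch below is illustrative, with a made-up date; whatever job publishes the new daily page would also have to rewrite the rule (or the rule's target):

      RewriteEngine On
      # Send / and /index.html to today's dated page with a temporary redirect.
      RewriteRule ^(index\.html)?$ /2013-06-30.html [R=302,L]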

    Read the article

  • Getting a domain sub-directory url for a new server

    - by Xianlin
    I have a web application server running Tomcat and I need to publish my APIs to internet users. However, I don't have a domain name for this server, so I can only hand out its IP address (e.g. 145.XXX.XXX.XXX) to point out where my API XML files are located. I have another web server with a domain name "http://www.webserver.com" registered on the internet, and I want to make use of that domain name to serve the location of my web application server's API XML files. How can I do that? Using "www.webserver.com/api" or "api.webserver.com"? Which is better? I also wonder, if I want to publish an "rtsp://145.XXX.XXX.XXX" link for video streaming purposes, can I replace it with "rtsp://www.webserver.com/api", and how would I do that? I always thought a URL containing a sub-directory name cannot point to another IP address; it can only point to another folder on the web server itself.
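
    Assuming the www.webserver.com machine runs Apache and can reach the Tomcat box, the sub-directory variant is normally done with a reverse proxy rather than a redirect; that is what lets a path on one host serve content from another IP. A sketch (mod_proxy and mod_proxy_http enabled; port 8080 is only a guess at the Tomcat connector):

      # On www.webserver.com: forward /api/* to the Tomcat server
      ProxyPass        /api  http://145.XXX.XXX.XXX:8080/api
      ProxyPassReverse /api  http://145.XXX.XXX.XXX:8080/api

    Note that this covers HTTP only; an rtsp:// stream is a different protocol and would not simply be re-published under an http:// path this way.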

    Read the article

  • Moving large website to new CMS - URL changes

    - by herrherr
    Hi, I was wondering if you have any tips on the following situation. I'm going to move a large website to a new Content Management System. Here are some details on the site: online news magazine with roughly 3,000 articles; domain age: 10 years; online in the current form since May 2010; indexed pages: ~10,000; share of search engine traffic: under 10%. Unfortunately a custom-tailored CMS was used for the site. The performance, reliability and SEO capabilities have been really bad, so we are moving to a new and proven open source CMS. All the articles will be kept as they are, but the URL structure as well as the structure of the HTML templates will change. What I want to do now is create 301 redirects for all articles from the old to the new schema, i.e.: Old: www.example.com/en/html/news/detail/title-of-the-article/ New: www.example.com/category/title-of-article.html Is this a proven way to handle something like this? If not, can you recommend an approach that has worked for you? Thanks :)
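
    301s are indeed the standard approach. Because the category in the new URL does not appear anywhere in the old URL, a per-article lookup table is usually needed rather than a single pattern rule. A hedged Apache sketch (RewriteMap has to live in the server or vhost config, and the file path here is made up):

      RewriteEngine On
      RewriteMap oldtonew txt:/etc/apache2/old-to-new-urls.txt
      # old-to-new-urls.txt holds one "old-path new-path" pair per line, e.g.
      #   /en/html/news/detail/title-of-the-article/  /category/title-of-article.html
      RewriteCond ${oldtonew:%{REQUEST_URI}|NOT_FOUND} !=NOT_FOUND
      RewriteRule .* ${oldtonew:%{REQUEST_URI}} [R=301,L]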

    Read the article

  • Accented characters representation in the URL

    - by Dan
    We support various languages on our website, including Spanish, French and Swedish. For now, the links in the site are NOT encoded before being sent to the browser: they contain the real accented characters (where such exist, e.g. href="www.(dot)example(dot)com/héllo.html") rather than their percent-encoded representation. This works and looks good in all browsers, including Chrome, FF and IE. However, we care a great deal about SEO. We got a tip that encoding the links before sending them to the browser (so instead of linking to http://www.example.com/héllo, we would link to http://www.example.com/h%E9llo) will improve the way search engines 'understand' the links and the keywords in the URL. This involves some work on our side, so we wanted to know whether there's any truth to that tip, but couldn't find anything addressing this issue.
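
    One detail to pin down before doing the work: %E9 is the Latin-1 encoding of é, whereas the usual convention today is to percent-encode the UTF-8 bytes. A quick, framework-neutral way to see what a standard encoder produces (plain Java, purely illustrative):

      import java.net.URLEncoder;

      public class EncodeDemo {
          public static void main(String[] args) throws Exception {
              // Percent-encodes the UTF-8 bytes of the accented character:
              // prints "h%C3%A9llo" rather than the Latin-1 form "h%E9llo".
              System.out.println(URLEncoder.encode("héllo", "UTF-8"));
          }
      }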

    Read the article

  • Working with different URL structures

    - by Dane411
    As I'm quite new to this field, I have some doubts that I couldn't resolve via Google, e.g.: If I'm not wrong, index.html makes it possible to avoid adding the filename to the URL, so www.example.com/ is equal to www.example.com/index.html. And that works for subdirectories too, right? e.g. www.example.com/music/ Is there any other way to achieve this without using an index.html file? (I've read something about converting dynamic URLs to static ones: ./?var1=value1&varN=valueN - ./value1/valueN) How can I convert www.example.com/music/ to music.example.com/, and why would that be used? Thanks in advance!

    Read the article

  • URL rewriting via forward proxy

    - by Biggroover
    I have an app that runs inside my firewall and talks out to multiple endpoints via HTTP/HTTPS on a non-standard port, e.g. http://endpoint1.domain.com:7171, http://endpoint2.domain.com:7171 What I want to do is route these requests through a forward proxy that rewrites the URL to something like http://allendpoints.domain.com/endpoint1 (port 80 or 443), then on the other end have a reverse proxy that unwinds what the forward proxy did in order to reach the specific endpoints. The result would be that I can route existing app requests to specific endpoints across the internet without having to change my app software. My questions are: Is this even possible? Is it a good idea, and are there better ways to do this? Can this be done with IIS and Apache as the proxies?
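
    It is possible in principle with Apache on both ends; the reverse-proxy half is the straightforward part and might look roughly like the sketch below (hostnames and ports taken from the question), while the forward side would additionally have to rewrite outgoing request URLs onto the /endpointN paths:

      # On allendpoints.domain.com (reverse proxy side):
      ProxyPass        /endpoint1  http://endpoint1.domain.com:7171/
      ProxyPassReverse /endpoint1  http://endpoint1.domain.com:7171/
      ProxyPass        /endpoint2  http://endpoint2.domain.com:7171/
      ProxyPassReverse /endpoint2  http://endpoint2.domain.com:7171/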

    Read the article

  • URL parameter names being changed by user agents

    - by Mike Deck
    In reviewing one of our site's web logs I'm seeing instances where we return a 404 because we expect an id parameter to be sent, but instead we see a di parameter. The resource in question is an image, but which image file actually gets served depends on the id parameter. The expected URL is something like http://images.mysite.com/photo.gif?id=123&width=200&height=300 What I'm seeing in the logs is requests for http://images.mysite.com/photo.gif?di=123&width=200&height=300 The id parameter is the only one where we are seeing this. It seems unlikely that this is due to a server-side or JavaScript bug, since it only affects a small percentage of our traffic. We are seeing it across a wide variety of user agents (both mobile and desktop) and IPs. Has anyone else seen this? Is there a browser plugin or other software you're aware of that could be causing it, and if so, is there a good way to work around the issue?
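
    If the immediate goal is just to stop the 404s while the cause is tracked down, one workaround is to accept di as an alias for id at the web server. A sketch for Apache, assuming the stray parameter always appears first in the query string as in the logged requests:

      RewriteEngine On
      # Rewrite ?di=... back to ?id=..., keeping the rest of the query string.
      RewriteCond %{QUERY_STRING} ^di=(.*)$
      RewriteRule ^photo\.gif$ /photo.gif?id=%1 [L]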

    Read the article

  • URL Frame redirection CakePHP

    - by themanbehindoftheprojectmayhem
    I need to point my domain at a CakePHP installation on another host. Location of my CakePHP installation: myhosting.com/newsite/ Domain: www.mydomain.com I'm currently using URL frame forwarding to direct www.mydomain.com to myhosting.com/newsite/. Problem: when I load www.mydomain.com, all links in the site point to the hosting location, for example myhosting.com/newsite/product/1, when they should point to www.mydomain.com/product/1 Is there any simple way to fix this? It's probably very simple to solve, but I can't crack it. Help much appreciated.

    Read the article

  • Ant get task throws "get doesn't support nested resources element" error

    - by David Corley
    The following ant xml should work according to documentation, but does not. Can anyone tell me if I'm doing something wrong. The get task should support the nested "resources" element in Ant 1.7.1 which is the version I'm using: -- <target name="setup"> <tstamp/> <!-- set up work areas --> <!--<taskdef name="ccmutil" classname="com.allfinanz.framework.tools.CCMUtil" classpath="\\Abate\Data\Build_Lib\Ivy\com.allfinanz\ccmutil\1.0\ccmutil-1.0.jar"/>--> <!-- 1st one is special, also sets ${project_wa} --> <!--<ccmutil file="${ant.file}" projects="framework, xpbuw, xpb, bil"/>--> <property name="framework_wa" value="../../../framework"/> <property name="xpbuw_wa" value="../../../xpbuw"/> <property name="xpb_wa" value="../../../xpb"/> <property name="bil_wa" value="../.."/> <!-- Create properties to hold the build values --> <property name="out" value="${user.dir}"/> <!-- This may be overridden from the command line --> <property name="locale" value="us"/> <!-- set contextRoot up as a property - this mean that it can be overwritten from the command line e.g.: ant -DcontextRoot=xpertBridge. --> <property name="contextRoot" value="xpertBridge"/> <property name="build_dir" value="${out}/${release}/build"/> <property name="distrib_dir" value="${out}/${release}/distrib"/> <property name="build.number" value="-1"/> <!-- Download dependencies from repo.fms.allfinanz.com--> <get dest="${lib}"> <resources> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=central&amp;g=soap&amp;a=soap&amp;v=2.3.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=JBOSS&amp;g=apache-fileupload&amp;a=commons-fileupload&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=regexp&amp;a=regexp&amp;v=1.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=javax.mail&amp;a=mail&amp;v=1.2&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.ibm.ws.webservices&amp;a=webservices.thinclient&amp;v=6.1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=avalon-framework&amp;a=avalon-framework&amp;v=4.2.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=jimi&amp;a=jimi&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=batik&amp;a=batik-all&amp;v=1.6&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=bsf&amp;a=bsf&amp;v=2.3.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=rhino&amp;a=js&amp;v=1.5R3&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=central&amp;g=commons-io&amp;a=commons-io&amp;v=1.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=central&amp;g=commons-logging&amp;a=commons-logging&amp;v=1.0.4&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=xmlgraphics&amp;a=commons&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=barcode4j&amp;a=barcode4j&amp;v=trunkBIL&amp;e=jar"/> <url 
url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.ibm&amp;a=fmcojagt&amp;v=6.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.allfinanz&amp;a=ejbserversupport&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.sun&amp;a=jce&amp;v=1.0&amp;e=zip"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=ssce&amp;a=ssce&amp;v=5.8&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.ibm&amp;a=mq&amp;v=5.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.ibm&amp;a=mqjms&amp;v=5.1&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=NetServerRemote&amp;a=NetServerRemote&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=NetServerRMI&amp;a=NetServerRMI&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=jwsdp&amp;a=saaj-api&amp;v=1.5&amp;e=jar&amp;c=api"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=jwsdp&amp;a=saaj-impl&amp;v=1.5&amp;e=jar&amp;c=impl"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=org.apache.xmlgraphics&amp;a=fop&amp;v=0.92b&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=xerces&amp;a=dom3-xml-apis&amp;v=1.0&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=org.apache&amp;a=derbynet&amp;v=10.0.2&amp;e=jar"/> <url url="http://repo.fms.allfinanz.com/service/local/artifact/maven/redirect?r=thirdparty&amp;g=com.sun&amp;a=jsse&amp;v=1.0&amp;e=jar"/> </resources> </get> </target>

    Read the article

  • Site not working with www URL

    - by jarus
    Hello, I'm having a problem with my site. When I type http://mysite.com it works fine, but when I type http://www.mysite.com it displays "page cannot be found". I could not find the cause of the problem. I also tried a .htaccess redirection: RewriteEngine On RewriteCond %{HTTP_HOST} ^www.mysite.com [nc] RewriteRule (.*) mysite.com/$1 [R=301,L] but it is not working. Any help will be appreciated.
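
    Two separate things are worth checking. "Page cannot be found" for the www host usually means there is no DNS record (A or CNAME) for www.mysite.com at all, and no .htaccess rule can run before a request actually reaches the server. Separately, the redirect target needs a scheme or it is treated as a local path. A corrected sketch of the rule (illustrative only; the DNS record still has to exist):

      RewriteEngine On
      # Redirect www.mysite.com/... to mysite.com/... once DNS for www resolves
      RewriteCond %{HTTP_HOST} ^www\.mysite\.com$ [NC]
      RewriteRule ^(.*)$ http://mysite.com/$1 [R=301,L]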

    Read the article

  • URL rewrite problem (many directories)

    - by marharépa
    Hello! I'd like to make an .htaccess file which gives a good structure to my websites. My .htaccess is currently: RewriteEngine On RewriteCond %{REQUEST_FILENAME} !\.(jpg|jpeg|gif|png|css|js|pl|txt)$ RewriteCond %{REQUEST_URI} !^/admin/* RewriteRule ^(.*)$ index.php?q=$1 [QSA] (based on Sombat's comment) And with it I want to achieve the following: for every request except (jpg|jpeg|gif|png|css|js|pl|txt) files, if the URL is domain.xx/admin, serve the domain.xx/admin directory and don't rewrite at all (i.e. let me use domain.xx/admin/index.php?asd=1&asdd=2 directly); otherwise rewrite everything, as in rule one, to index.php. Thanks for the help.
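
    A slightly tightened sketch of the same idea: anchor the admin exclusion and also leave real files and directories alone, so /admin/index.php?asd=1&asdd=2 is served directly while everything else is funnelled into index.php:

      RewriteEngine On
      # Leave the admin area and real files/directories completely alone
      RewriteCond %{REQUEST_URI} ^/admin(/|$)
      RewriteRule ^ - [L]
      RewriteCond %{REQUEST_FILENAME} -f [OR]
      RewriteCond %{REQUEST_FILENAME} -d
      RewriteRule ^ - [L]
      # Everything else (except the listed static extensions) goes to index.php
      RewriteCond %{REQUEST_URI} !\.(jpg|jpeg|gif|png|css|js|pl|txt)$
      RewriteRule ^(.*)$ index.php?q=$1 [QSA,L]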

    Read the article

  • Implementing an async "read all currently available data from stream" operation

    - by Jon
    I recently provided an answer to this question: C# - Realtime console output redirection. As often happens, explaining stuff (here "stuff" was how I tackled a similar problem) leads you to greater understanding and/or, as is the case here, "oops" moments. I realized that my solution, as implemented, has a bug. The bug has little practical importance, but it has an extremely large importance to me as a developer: I can't rest easy knowing that my code has the potential to blow up. Squashing the bug is the purpose of this question. I apologize for the long intro, so let's get dirty. I wanted to build a class that allows me to receive input from a console's standard output Stream. Console output streams are of type FileStream; the implementation can cast to that, if needed. There is also an associated StreamReader already present to leverage. There is only one thing I need to implement in this class to achieve my desired functionality: an async "read all the data available this moment" operation. Reading to the end of the stream is not viable because the stream will not end unless the process closes the console output handle, and it will not do that because it is interactive and expecting input before continuing. I will be using that hypothetical async operation to implement event-based notification, which will be more convenient for my callers. The public interface of the class is this: public class ConsoleAutomator { public event EventHandler<ConsoleOutputReadEventArgs> StandardOutputRead; public void StartSendingEvents(); public void StopSendingEvents(); } StartSendingEvents and StopSendingEvents do what they advertise; for the purposes of this discussion, we can assume that events are always being sent without loss of generality. The class uses these two fields internally: protected readonly StringBuilder inputAccumulator = new StringBuilder(); protected readonly byte[] buffer = new byte[256]; The functionality of the class is implemented in the methods below. To get the ball rolling: public void StartSendingEvents(); { this.stopAutomation = false; this.BeginReadAsync(); } To read data out of the Stream without blocking, and also without requiring a carriage return char, BeginRead is called: protected void BeginReadAsync() { if (!this.stopAutomation) { this.StandardOutput.BaseStream.BeginRead( this.buffer, 0, this.buffer.Length, this.ReadHappened, null); } } The challenging part: BeginRead requires using a buffer. This means that when reading from the stream, it is possible that the bytes available to read ("incoming chunk") are larger than the buffer. Remember that the goal here is to read all of the chunk and call event subscribers exactly once for each chunk. To this end, if the buffer is full after EndRead, we don't send its contents to subscribers immediately but instead append them to a StringBuilder. The contents of the StringBuilder are only sent back whenever there is no more to read from the stream. 
private void ReadHappened(IAsyncResult asyncResult) { var bytesRead = this.StandardOutput.BaseStream.EndRead(asyncResult); if (bytesRead == 0) { this.OnAutomationStopped(); return; } var input = this.StandardOutput.CurrentEncoding.GetString( this.buffer, 0, bytesRead); this.inputAccumulator.Append(input); if (bytesRead < this.buffer.Length) { this.OnInputRead(); // only send back if we 're sure we got it all } this.BeginReadAsync(); // continue "looping" with BeginRead } After any read which is not enough to fill the buffer (in which case we know that there was no more data to be read during the last read operation), all accumulated data is sent to the subscribers: private void OnInputRead() { var handler = this.StandardOutputRead; if (handler == null) { return; } handler(this, new ConsoleOutputReadEventArgs(this.inputAccumulator.ToString())); this.inputAccumulator.Clear(); } (I know that as long as there are no subscribers the data gets accumulated forever. This is a deliberate decision). The good This scheme works almost perfectly: Async functionality without spawning any threads Very convenient to the calling code (just subscribe to an event) Never more than one event for each time data is available to be read Is almost agnostic to the buffer size The bad That last almost is a very big one. Consider what happens when there is an incoming chunk with length exactly equal to the size of the buffer. The chunk will be read and buffered, but the event will not be triggered. This will be followed up by a BeginRead that expects to find more data belonging to the current chunk in order to send it back all in one piece, but... there will be no more data in the stream. In fact, as long as data is put into the stream in chunks with length exactly equal to the buffer size, the data will be buffered and the event will never be triggered. This scenario may be highly unlikely to occur in practice, especially since we can pick any number for the buffer size, but the problem is there. Solution? Unfortunately, after checking the available methods on FileStream and StreamReader, I can't find anything which lets me peek into the stream while also allowing async methods to be used on it. One "solution" would be to have a thread wait on a ManualResetEvent after the "buffer filled" condition is detected. If the event is not signaled (by the async callback) in a small amount of time, then more data from the stream will not be forthcoming and the data accumulated so far should be sent to subscribers. However, this introduces the need for another thread, requires thread synchronization, and is plain inelegant. Specifying a timeout for BeginRead would also suffice (call back into my code every now and then so I can check if there's data to be sent back; most of the time there will not be anything to do, so I expect the performance hit to be negligible). But it looks like timeouts are not supported in FileStream. Since I imagine that async calls with timeouts are an option in bare Win32, another approach might be to PInvoke the hell out of the problem. But this is also undesirable as it will introduce complexity and simply be a pain to code. Is there an elegant way to get around the problem? Thanks for being patient enough to read all of this. Update: I definitely did not communicate the scenario well in my initial writeup. I have since revised the writeup quite a bit, but to be extra sure: The question is about how to implement an async "read all the data available this moment" operation. 
My apologies to the people who took the time to read and answer without me making my intent clear enough.
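
    For illustration only, here is roughly what the timer-based variant of the workaround mentioned above (flush the accumulator if no follow-up read arrives shortly after a full-buffer read) could look like. The gate, the timer field and the 50 ms window are invented for this sketch, and it is not claimed to be the elegant solution being asked for:

      // Fragment of the class above; only ReadHappened and two new fields change.
      private readonly object flushGate = new object();
      private System.Threading.Timer flushTimer;

      private void ReadHappened(IAsyncResult asyncResult)
      {
          var bytesRead = this.StandardOutput.BaseStream.EndRead(asyncResult);
          if (bytesRead == 0) { this.OnAutomationStopped(); return; }

          lock (this.flushGate)
          {
              // A read completed, so cancel any pending "give up waiting" flush.
              if (this.flushTimer != null) { this.flushTimer.Dispose(); this.flushTimer = null; }

              var input = this.StandardOutput.CurrentEncoding.GetString(this.buffer, 0, bytesRead);
              this.inputAccumulator.Append(input);

              if (bytesRead < this.buffer.Length)
              {
                  this.OnInputRead();  // chunk is definitely complete
              }
              else
              {
                  // Buffer exactly full: give a follow-up read a short window to
                  // arrive before flushing what we have anyway.
                  this.flushTimer = new System.Threading.Timer(
                      _ => { lock (this.flushGate) { if (this.inputAccumulator.Length > 0) this.OnInputRead(); } },
                      null, 50, System.Threading.Timeout.Infinite);
              }
          }

          this.BeginReadAsync();
      }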

    Read the article

  • Implementing a robust async stream reader

    - by Jon
    I recently provided an answer to this question: C# - Realtime console output redirection. As often happens, explaining stuff (here "stuff" was how I tackled a similar problem) leads you to greater understanding and/or, as is the case here, "oops" moments. I realized that my solution, as implemented, has a bug. The bug has little practical importance, but it has an extremely large importance to me as a developer: I can't rest easy knowing that my code has the potential to blow up. Squashing the bug is the purpose of this question. I apologize for the long intro, so let's get dirty. I wanted to build a class that allows me to receive input from a Stream in an event-based manner. The stream, in my scenario, is guaranteed to be a FileStream and there is also an associated StreamReader already present to leverage. The public interface of the class is this: public class MyStreamManager { public event EventHandler<ConsoleOutputReadEventArgs> StandardOutputRead; public void StartSendingEvents(); public void StopSendingEvents(); } Obviously this specific scenario has to do with a console's standard output, but that is a detail and does not play an important role. StartSendingEvents and StopSendingEvents do what they advertise; for the purposes of this discussion, we can assume that events are always being sent without loss of generality. The class uses these two fields internally: protected readonly StringBuilder inputAccumulator = new StringBuilder(); protected readonly byte[] buffer = new byte[256]; The functionality of the class is implemented in the methods below. To get the ball rolling: public void StartSendingEvents(); { this.stopAutomation = false; this.BeginReadAsync(); } To read data out of the Stream without blocking, and also without requiring a carriage return char, BeginRead is called: protected void BeginReadAsync() { if (!this.stopAutomation) { this.StandardOutput.BaseStream.BeginRead( this.buffer, 0, this.buffer.Length, this.ReadHappened, null); } } The challenging part: BeginRead requires using a buffer. This means that when reading from the stream, it is possible that the bytes available to read ("incoming chunk") are larger than the buffer. Since we are only handing off data from the stream to a consumer, and that consumer may well have inside knowledge about the size and/or format of these chunks, I want to call event subscribers exactly once for each chunk. Otherwise the abstraction breaks down and the subscribers have to buffer the incoming data and reconstruct the chunks themselves using said knowledge. This is much less convenient to the calling code, and detracts from the usefulness of my class. To this end, if the buffer is full after EndRead, we don't send its contents to subscribers immediately but instead append them to a StringBuilder. The contents of the StringBuilder are only sent back whenever there is no more to read from the stream (thus preserving the chunks). 
private void ReadHappened(IAsyncResult asyncResult) { var bytesRead = this.StandardOutput.BaseStream.EndRead(asyncResult); if (bytesRead == 0) { this.OnAutomationStopped(); return; } var input = this.StandardOutput.CurrentEncoding.GetString( this.buffer, 0, bytesRead); this.inputAccumulator.Append(input); if (bytesRead < this.buffer.Length) { this.OnInputRead(); // only send back if we 're sure we got it all } this.BeginReadAsync(); // continue "looping" with BeginRead } After any read which is not enough to fill the buffer, all accumulated data is sent to the subscribers: private void OnInputRead() { var handler = this.StandardOutputRead; if (handler == null) { return; } handler(this, new ConsoleOutputReadEventArgs(this.inputAccumulator.ToString())); this.inputAccumulator.Clear(); } (I know that as long as there are no subscribers the data gets accumulated forever. This is a deliberate decision). The good This scheme works almost perfectly: Async functionality without spawning any threads Very convenient to the calling code (just subscribe to an event) Maintains the "chunkiness" of the data; this allows the calling code to use inside knowledge of the data without doing any extra work Is almost agnostic to the buffer size (it will work correctly with any size buffer irrespective of the data being read) The bad That last almost is a very big one. Consider what happens when there is an incoming chunk with length exactly equal to the size of the buffer. The chunk will be read and buffered, but the event will not be triggered. This will be followed up by a BeginRead that expects to find more data belonging to the current chunk in order to send it back all in one piece, but... there will be no more data in the stream. In fact, as long as data is put into the stream in chunks with length exactly equal to the buffer size, the data will be buffered and the event will never be triggered. This scenario may be highly unlikely to occur in practice, especially since we can pick any number for the buffer size, but the problem is there. Solution? Unfortunately, after checking the available methods on FileStream and StreamReader, I can't find anything which lets me peek into the stream while also allowing async methods to be used on it. One "solution" would be to have a thread wait on a ManualResetEvent after the "buffer filled" condition is detected. If the event is not signaled (by the async callback) in a small amount of time, then more data from the stream will not be forthcoming and the data accumulated so far should be sent to subscribers. However, this introduces the need for another thread, requires thread synchronization, and is plain inelegant. Specifying a timeout for BeginRead would also suffice (call back into my code every now and then so I can check if there's data to be sent back; most of the time there will not be anything to do, so I expect the performance hit to be negligible). But it looks like timeouts are not supported in FileStream. Since I imagine that async calls with timeouts are an option in bare Win32, another approach might be to PInvoke the hell out of the problem. But this is also undesirable as it will introduce complexity and simply be a pain to code. Is there an elegant way to get around the problem? Thanks for being patient enough to read all of this.

    Read the article

  • URL is generating a /#!/splash-page

    - by user32642
    My site for some reason is generating a shebang - /#!/splash-page on the URL. For example when I type www.modernvintage1005.com, the browser returns www.modernvintage1005.com/#!/splash-page and every subsequent page is /#!/about, /#!/contact, and so forth. There's absolutely nothing on the Google about this. There is a lot of rewrite help to eliminate .index.php from the home page, but that's it. How do I rewrite it to just say domain.com and domain.com/about.html, etc.? Here is my .htaccess file if you need to see it. # Rewrite Rule <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / RewriteRule ^index\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] </IfModule> # compress text, html, javascript, css, xml: <IfModule mod_deflate.c> AddOutputFilterByType DEFLATE text/plain AddOutputFilterByType DEFLATE text/html AddOutputFilterByType DEFLATE text/xml AddOutputFilterByType DEFLATE text/css AddOutputFilterByType DEFLATE application/xml AddOutputFilterByType DEFLATE application/xhtml+xml AddOutputFilterByType DEFLATE application/rss+xml AddOutputFilterByType DEFLATE application/javascript AddOutputFilterByType DEFLATE application/x-javascript AddType x-font/otf .otf AddType x-font/ttf .ttf AddType x-font/eot .eot AddType x-font/woff .woff AddType image/x-icon .ico AddType image/png .png </IfModule> ## EXPIRES CACHING ## <IfModule mod_expires.c> ExpiresActive On ExpiresByType image/jpg "access 1 year" ExpiresByType image/jpeg "access 1 year" ExpiresByType image/gif "access 1 year" ExpiresByType image/png "access 1 year" ExpiresByType text/css "access 1 month" ExpiresByType application/pdf "access 1 month" ExpiresByType text/x-javascript "access 1 month" ExpiresByType application/x-shockwave-flash "access 1 month" ExpiresByType image/x-icon "access 1 year" ExpiresDefault "access 2 days" </IfModule> ## EXPIRES CACHING ##

    Read the article

  • Should I use subdomains or subfolders for my user groups?

    - by bilygates
    Hello, I run a photography website where each user has its own subdomain (i.e. user.site.com). I'm thinking of adding user groups but I'm unable to decide if I should also associate a separate subdomain or simply a subfolder for each group: Subfolders (www.site.com/groups/my-group) Pros: Easier to maintain from a technical p.o.v. Cons: Harder to memorize. The URLs can get really long (www.site.com/groups/my-group/albums/my-album/) Subdomains (my-group.site.com) Pros: Easier to memorize. Shorter URLs. One might have the impression that such a URL is somewhat more "independent" from the main site. Cons: Group and user names belong to the same namespace, so we need to check for collisions when creating a new user/group. One cannot determine the content of the page by only reading the URL: Is x.site.com a user page or a group page? What's your opinion on the matter? I should note that DeviantArt.com uses the 2nd option (that's where I got the idea). Thank you in advance!

    Read the article

  • Stream classes ... design, pattern for creating views over streams

    - by ToxicAvenger
    A question regarding the design of stream classes: I need a pattern for creating independent views over a single stream instance (in my case for reading). A view would be a consecutive part of the stream. The problem I have with the stream classes is that the state (the read/write position) is coupled with the underlying data/storage. So if I need to partition a stream into different segments (whether the segments overlap or not doesn't matter), I cannot easily create views over the stream, where each view would store a start and end position, because reading from a view - which would translate to reading from the underlying stream adjusted by the view's start/end positions - would change the state of the underlying stream instance. What I could do is, on a read from a view instance, adjust the Position of the stream and read the chunks I need, but I cannot do that concurrently. Why is it designed this way, and what kind of pattern could I implement to create independent views over a single stream instance that allow reading/writing independently (and concurrently)?
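
    As an illustration of one such pattern, here is a sketch under the assumption of a seekable, read-only source (this is not any framework's API): each view keeps its own logical position, and every read re-seeks the shared stream under a common lock, so views stay independent even though their individual reads are serialized:

      using System;
      using System.IO;

      public sealed class StreamView
      {
          private readonly Stream source;   // shared, must support seeking
          private readonly object gate;     // one gate shared by all views of the stream
          private readonly long start;
          private readonly long end;
          private long position;            // position relative to the start of the view

          public StreamView(Stream source, object gate, long start, long end)
          {
              this.source = source;
              this.gate = gate;
              this.start = start;
              this.end = end;
          }

          public int Read(byte[] buffer, int offset, int count)
          {
              lock (this.gate)
              {
                  long remaining = this.end - (this.start + this.position);
                  if (remaining <= 0) return 0;
                  // Re-seek before every read so other views cannot disturb us.
                  this.source.Position = this.start + this.position;
                  int read = this.source.Read(buffer, offset, (int)Math.Min(count, remaining));
                  this.position += read;
                  return read;
              }
          }
      }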

    Read the article

  • ffmpeg, vlc - Unable to find input stream

    - by zozo
    Good day to all... I have some "little" problems with ffserver and ffmpeg. What I need to do is broadcast live video. So I got the cam, used VLC and its "send stream" option, and sent the stream to 192.168.1.9:64555, which is a virtual machine on the same computer, running CentOS. On the virtual machine I run the command ffmpeg -i 192.168.1.9:64555 output.mpg. The response is "unable to find file whatever". Can anyone tell me what I did wrong? Thank you and have a great day.
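
    For reference, ffmpeg's -i expects a file name or a URL with an explicit protocol; a bare ip:port is parsed as a (nonexistent) file name, which would explain the error. A hedged sketch, assuming VLC was configured to stream over UDP or HTTP to that port (the protocol must match whatever VLC is actually sending):

      # if VLC streams to the VM over UDP:
      ffmpeg -i udp://192.168.1.9:64555 output.mpg
      # if VLC streams over HTTP instead:
      ffmpeg -i http://192.168.1.9:64555 output.mpg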

    Read the article

  • How to find other end of unix socket connection?

    - by depesz
    I have a process (dbus-daemon) which has many open connection over UNIX sockets. One of these connections is fd #36: =$ ps uw -p 23284 USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND depesz 23284 0.0 0.0 24680 1772 ? Ss 15:25 0:00 /bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session =$ ls -l /proc/23284/fd/36 lrwx------ 1 depesz depesz 64 2011-03-28 15:32 /proc/23284/fd/36 -> socket:[1013410] =$ netstat -nxp | grep 1013410 (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) unix 3 [ ] STREAM CONNECTED 1013410 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD =$ netstat -nxp | grep dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1013953 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1013825 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1013726 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1013471 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1013410 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1012325 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1012302 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1012289 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1012151 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011957 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011937 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011900 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011775 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011771 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011769 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011766 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011663 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011635 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011627 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011540 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011480 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011349 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011312 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011284 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011250 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011231 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011155 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011061 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011049 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011035 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1011013 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1010961 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD unix 3 [ ] STREAM CONNECTED 1010945 23284/dbus-daemon @/tmp/dbus-3XDU4PYEzD Based on number connections, I assume that dbus-daemon is actually server. Which is OK. But how can I find which process is connected to it - using the connection that is 36th file handle in dbus-launcher? Tried lsof and even greps on /proc/net/unix but I can't figure out a way to find the client process.
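
    One avenue worth noting, hedged because it depends on the kernel and iproute2 versions available: newer ss builds print the peer socket's inode for connected UNIX sockets, which lets you chase the connection from one end to the other. Here <peer-inode> is just a placeholder for whatever number appears in the Peer column:

      ss -xp | grep 1013410        # the Peer column shows the inode of the other end
      ss -xp | grep <peer-inode>   # the line owning that inode names the client process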

    Read the article

  • Stream data with Node.js

    - by Saif Bechan
    I want to know if it is possible to stream data from the server to the client with Node.js. From what I can gather all over the internet, this should be possible, yet I fail to find a correct example or solution. What I want is a single HTTP POST to Node.js with AJAX, then to leave the connection open and continuously stream data to the client. The client will receive this stream and update the page continuously.
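
    It is possible; the usual pattern is to answer the request but never call end(), writing chunks as they become available. A minimal sketch using only Node's built-in http module (the interval and payload are placeholders):

      var http = require('http');

      http.createServer(function (req, res) {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        // Keep the connection open and push a chunk every second.
        var timer = setInterval(function () {
          res.write('update at ' + new Date().toISOString() + '\n');
        }, 1000);
        // Stop writing when the client goes away.
        req.on('close', function () { clearInterval(timer); });
      }).listen(8080);

    On the browser side, an XMLHttpRequest progress handler (or polling responseText) can consume the partial response as it grows, in browsers that expose partial responses.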

    Read the article

  • Concepts: Channel vs. Stream

    - by hotzen
    Hello, is there a conceptual difference between the terms "Channel" and "Stream"? Do the terms require/determine, for example, the allowed number of concurrent consumers or producers? I'm currently developing a Channel/Stream of DataFlowVariables which may be written by one producer and read by one consumer, as the implementation is destructive/mutable. Would this be a Channel or a Stream? Is there any difference at all? Thanks

    Read the article

  • HttpClient response handler always returns closed stream

    - by Alex Ciminian
    I'm new to Java development so please bear with me. Also, I hope I'm not the champion of tl;dr :). I'm using HttpClient to make requests over Http (duh!) and I'd gotten it to work for a simple servlet that receives an URL as a query string parameter. I realized that my code could use some refactoring, so I decided to make my own HttpResponseHandler, to clean up the code, make it reusable and improve exception handling. I currently have something like this: public class HttpResponseHandler implements ResponseHandler<InputStream>{ public InputStream handleResponse(HttpResponse response) throws ClientProtocolException, IOException { int statusCode = response.getStatusLine().getStatusCode(); InputStream in = null; if (statusCode != HttpStatus.SC_OK) { throw new HttpResponseException(statusCode, null); } else { HttpEntity entity = response.getEntity(); if (entity != null) { in = entity.getContent(); // This works // for (int i;(i = in.read()) >= 0;) System.out.print((char)i); } } return in; } } And in the method where I make the actual request: HttpClient httpclient = new DefaultHttpClient(); HttpGet httpget = new HttpGet(target); ResponseHandler<InputStream> httpResponseHandler = new HttpResponseHandler(); try { InputStream in = httpclient.execute(httpget, httpResponseHandler); // This doesn't work // for (int i;(i = in.read()) >= 0;) System.out.print((char)i); return in; } catch (HttpResponseException e) { throw new HttpResponseException(e.getStatusCode(), null); } The problem is that the input stream returned from the handler is closed. I don't have any idea why, but I've checked it with the prints in my code (and no, I haven't used them both at the same time :). While the first print works, the other one gives a closed stream error. I need InputStreams, because all my other methods expect an InputStream and not a String. Also, I want to be able to retrieve images (or maybe other types of files), not just text files. I can work around this pretty easily by giving up on the response handler (I have a working implementation that doesn't use it), but I'm pretty curious about the following: Why does it do what it does? How do I open the stream, if something closes it? What's the right way to do this, anyway :)? I've checked the docs and I couldn't find anything useful regarding this issue. To save you a bit of Googling, here's the Javadoc and here's the HttpClient tutorial (Section 1.1.8 - Response handlers). Thanks, Alex
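
    For context, and hedged as a reading of the HttpClient 4.x behaviour rather than a quote from its documentation: execute(request, handler) releases the connection and closes the underlying content stream as soon as the handler returns, which is why a raw entity.getContent() stream is already dead by the time the caller sees it. The usual pattern is to buffer inside the handler and return a stream over the copy; a sketch:

      import java.io.ByteArrayInputStream;
      import java.io.IOException;
      import java.io.InputStream;
      import org.apache.http.HttpEntity;
      import org.apache.http.HttpResponse;
      import org.apache.http.HttpStatus;
      import org.apache.http.client.HttpResponseException;
      import org.apache.http.client.ResponseHandler;
      import org.apache.http.util.EntityUtils;

      public class BufferingResponseHandler implements ResponseHandler<InputStream> {
          public InputStream handleResponse(HttpResponse response) throws IOException {
              int statusCode = response.getStatusLine().getStatusCode();
              if (statusCode != HttpStatus.SC_OK) {
                  throw new HttpResponseException(statusCode, null);
              }
              // Copy the entity into memory before execute() closes the connection,
              // then hand back a stream over the buffered bytes.
              HttpEntity entity = response.getEntity();
              byte[] body = (entity != null) ? EntityUtils.toByteArray(entity) : new byte[0];
              return new ByteArrayInputStream(body);
          }
      }

    For large bodies or images this trades memory for safety; the alternative is to do all of the reading inside the handler itself.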

    Read the article

  • Does isEmpty method in Stream evaluate the whole Stream?

    - by abhin4v
    In Scala, does calling the isEmpty method on an instance of the Stream class cause the stream to be evaluated completely? My code is like this: import Stream.cons private val odds: Stream[Int] = cons(3, odds.map(_ + 2)) private val primes: Stream[Int] = cons(2, odds filter isPrime) private def isPrime(n: Int): Boolean = n match { case 1 => false case 2 => true case 3 => true case 5 => true case 7 => true case x if n % 3 == 0 => false case x if n % 5 == 0 => false case x if n % 7 == 0 => false case x if (x + 1) % 6 == 0 || (x - 1) % 6 == 0 => true case x => primeDivisors(x) isEmpty } import Math.{sqrt, ceil} private def primeDivisors(n: Int) = primes takeWhile { _ <= ceil(sqrt(n))} filter {n % _ == 0 } So, does the call to isEmpty on the line case x => primeDivisors(x) isEmpty cause all the prime divisors to be evaluated or only the first one?
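
    The short version, as a description of standard library Streams in general rather than a guarantee about this exact code: determining emptiness of the filtered stream only requires finding out whether at least one element survives, so evaluation proceeds until the first surviving divisor is found or the takeWhile bound is exhausted; the whole stream is never forced. A tiny sketch showing how far evaluation goes (the counter and names are invented for the illustration):

      object LazinessDemo extends App {
        var forced = 0
        val nums: Stream[Int] = Stream.from(1).map { n => forced += 1; n }

        // Only elements up to the first one matching the predicate are forced;
        // the infinite stream is never fully evaluated.
        println(nums.filter(_ > 3).isEmpty)  // false
        println(forced)                      // 4
      }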

    Read the article

  • Put an object in Handler message

    - by Tsimmi
    Hi! I need to download an image from the internet, in a different thread, and then send that image object in the handler message, to the UI thread. I already have this: ... Message msg = Message.obtain(); Bundle b = new Bundle(); b.putParcelable("MyObject", (Parcelable) object); msg.setData(b); handler.sendMessage(msg); And when I receive this message, I want to extract the object: ... public void handleMessage(Message msg) { super.handleMessage(msg); MyObject objectRcvd = (MyObject) msg.getData().getParcelable("IpTile"); addToCache(ipTile); mapView.invalidate(); } But this is giving me: ...java.lang.ClassCastException... Can anyone help? And by the way, is this the most efficient way to pass an object to the UI Thread? Thank you all!
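
    The exception is consistent with MyObject not actually implementing Parcelable, given the explicit cast in the snippet above. Worth noting as a sketch rather than a definitive fix: when the sender and the handler live in the same process there is no need for a Bundle or Parcelable at all, because Message carries an obj field for exactly this. MyObject below stands in for whatever type is actually downloaded:

      // Worker thread: attach the object directly to the message.
      Message msg = handler.obtainMessage();
      msg.obj = myObject;              // no Parcelable needed within the same process
      handler.sendMessage(msg);

      // UI thread:
      @Override
      public void handleMessage(Message msg) {
          MyObject received = (MyObject) msg.obj;
          addToCache(received);
          mapView.invalidate();
      }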

    Read the article

  • I get an error when trying to write a response stream to a file

    - by MemphisDeveloper
    I am trying to test a REST web service, but when I do a POST and try to save the response stream to a file, I get an exception saying "Stream was not readable." What am I doing wrong? Public Sub PostAndRead() Dim flReader As FileStream = New FileStream("~\testRequest.xml", FileMode.Open, FileAccess.Read) Dim flWriter As FileStream = New FileStream("~\testResponse.xml", FileMode.Create, FileAccess.Write) Dim address As Uri = New Uri(restAddress) Dim req As HttpWebRequest = DirectCast(WebRequest.Create(address), HttpWebRequest) req.Method = "POST" req.ContentLength = flReader.Length req.AllowWriteStreamBuffering = True Dim reqStream As Stream = req.GetRequestStream() ' Get data from upload file to inData Dim inData(flReader.Length) As Byte flReader.Read(inData, 0, flReader.Length) ' put data into request stream reqStream.Write(inData, 0, flReader.Length) flReader.Close() reqStream.Close() ' Post Response req.GetResponse() ' Save results in a file Copy(req.GetRequestStream(), flWriter) End Sub
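
    The exception is consistent with the last line copying the request stream (already closed, and write-only in any case) rather than the response stream. A hedged sketch of how the end of that method might look instead (Copy is assumed to be the poster's own helper):

      ' Post and then copy the *response* stream, not the request stream
      Dim resp As WebResponse = req.GetResponse()
      Using respStream As Stream = resp.GetResponseStream()
          Copy(respStream, flWriter)
      End Using
      flWriter.Close()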

    Read the article
