Search Results

Search found 7128 results on 286 pages for 'httpcontext cache'.

Page 158 of 286

  • Running Long Process: Indexing 5GB docs with Lucene

    - by Robert Dondo
    Situation: I have an ASP.NET application that searches through documents using Lucene. I want to run the initial indexing once; after that the index will be updated incrementally, so there won't be any need to index the whole directory again. Currently I have about 5 GB of documents (45,000 files). Problem: my application times out before completing the process. I have raised the timeout like this: HttpContext.Current.Server.ScriptTimeout = 200000; but it still does not complete the process. How can I run the indexing? (See the background-worker sketch after this item.)

    Read the article
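
    One common way around the request timeout is to move the indexing off the request entirely: the page only starts the job and reports its status, while a background worker (or better, a Windows service or scheduled task, since IIS can recycle the worker process mid-run) does the crawling. A minimal sketch of the in-process variant; BuildIndex and the status strings are illustrative placeholders, not code from the question:

        using System;
        using System.Threading;

        public static class IndexingJob
        {
            private static int _running;                    // 0 = idle, 1 = indexing
            public static volatile string Status = "idle";  // polled by the page

            // Kicks off the long-running index build without tying up the request.
            public static void Start(string docsPath, string indexPath)
            {
                // Allow only one indexing run at a time.
                if (Interlocked.CompareExchange(ref _running, 1, 0) != 0)
                    return;

                ThreadPool.QueueUserWorkItem(_ =>
                {
                    try
                    {
                        Status = "indexing";
                        BuildIndex(docsPath, indexPath);    // hypothetical Lucene.Net loop
                        Status = "done";
                    }
                    catch (Exception ex)
                    {
                        Status = "failed: " + ex.Message;
                    }
                    finally
                    {
                        Interlocked.Exchange(ref _running, 0);
                    }
                });
            }

            private static void BuildIndex(string docsPath, string indexPath)
            {
                // Placeholder: open a Lucene.Net IndexWriter over indexPath and
                // add one Document per file found under docsPath.
            }
        }

    A "Rebuild index" button would call IndexingJob.Start(Server.MapPath("~/Docs"), Server.MapPath("~/Index")) and the page can poll IndexingJob.Status, so no single request ever has to outlive ScriptTimeout.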

  • ASP.NET request extension type

    - by Krishna
    Hello, I am working on a large web application from which I recently removed tons of .aspx pages. To avoid "page not found" errors, I added entries for these pages to an XML file, which came to around 300+ in count. I wrote an HTTP module that checks the request URL against the XML entries and, if a match is found, redirects the request to the respective new page. Everything works great, but my collection gets iterated for every request, I mean for each and every .jpg, .css, .js, .ico, .pdf etc. Is there any object or property in .NET that can tell me the type of resource the user requested, something like HttpContext.Request.type, so that I can skip the check for all unwanted file types? (See the extension-check sketch after this item.)

    Read the article
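
    A minimal sketch of one way to short-circuit the lookup: check the requested path's extension (Path.GetExtension; on newer framework versions there is also Request.CurrentExecutionFilePathExtension) at the top of the module and return immediately unless it is an .aspx request. The class name and the Redirects dictionary are illustrative stand-ins for the module and XML-backed table described in the question:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Web;

        public class LegacyRedirectModule : IHttpModule
        {
            // Hypothetical old-URL -> new-URL table loaded once from the XML file.
            private static readonly Dictionary<string, string> Redirects =
                new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

            public void Init(HttpApplication app)
            {
                app.BeginRequest += OnBeginRequest;
            }

            private static void OnBeginRequest(object sender, EventArgs e)
            {
                HttpContext context = ((HttpApplication)sender).Context;

                // Cheap check first: only .aspx requests can possibly need a redirect,
                // so .jpg/.css/.js/.ico/.pdf requests never touch the 300+ entry table.
                string extension = Path.GetExtension(context.Request.Url.AbsolutePath);
                if (!string.Equals(extension, ".aspx", StringComparison.OrdinalIgnoreCase))
                    return;

                string newUrl;
                if (Redirects.TryGetValue(context.Request.Url.AbsolutePath, out newUrl))
                    context.Response.Redirect(newUrl);
            }

            public void Dispose() { }
        }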

  • Amazon EC2 multiple servers share session state

    - by Theofanis Pantelides
    Hi everyone, I have a bunch of EC2 servers that are load balanced. Some of the servers are not sharing session state, and users keep getting logged in and out. How can I make all the servers share one session, possibly even using a partition resolver solution?

        public class PartitionResolver : System.Web.IPartitionResolver
        {
            private String[] partitions;

            public void Initialize()
            {
                // create the partition connection string table
                // web1, web2
                partitions = new String[] { "192.168.1.1" };
            }

            public String ResolvePartition(Object key)
            {
                String oHost = System.Web.HttpContext.Current.Request.Url.Host.ToLower().Trim();
                if (oHost.StartsWith("10.0.0") || oHost.Equals("localhost"))
                    return "tcpip=127.0.0.1:42424";

                String sid = (String)key;
                // hash the incoming session ID into
                // one of the available partitions
                Int32 partitionID = Math.Abs(sid.GetHashCode()) % partitions.Length;
                return ("tcpip=" + partitions[partitionID] + ":42424");
            }
        }

    -theo

    Read the article

  • Unable to locate essential development tools Ubuntu 11.04

    - by Anita 7
    I'm using Ubuntu 11.04 (in VMware). I aim to implement OpenMP, and I'm using the gcc 4.5 compiler. I tried to install it by using the command sudo apt-get install gcc 4.5. Afterwards I proceeded with gcc -fopenmp foo.c, BUT the output was: gcc: foo.c: No such file or directory gcc: no input files. Then I tried to install the package by using:

        ubuntu@ubuntu:~$ sudo apt-get install essential
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package essential

    I also tried apt-cache search essential and after that sudo apt-get install essential-dev, but got the same error again: E: Unable to locate package essential-dev. Any solution, please? Do I need to download any package? What should I do? Thank you in advance :))

    Read the article

  • Google I/O 2010 - GWT Linkers target HTML5 WebWorkers & more

    Google I/O 2010 - GWT Linkers target HTML5 Web Workers, Chrome Extensions, and more (GWT 301, Matt Mastracci). At its core GWT has a well-defined and customizable mechanism -- called Linkers -- that controls exactly how GWT's compiled JavaScript should be packaged, served, and run. This session will describe how to create linkers and explains some of the linkers we've created, including a linker that turns a GWT module into an HTML5 Web Worker and one that generates an HTML5 App Cache manifest automatically. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 59:59

    Read the article

  • How do I restrict the WCF service called by an ASP.NET AJAX page to only allow calls for that page?

    - by NovaJoe
    I have an AjaxControlToolkit DynamicPopulate control that is updated by calls to a WCF service. I know I can check the HttpContext in the service request to see whether the user of the page (and thus, the control) is authenticated. However, I don't want anyone clever to be able to call the service directly, even if they're logged in. I want access to the service to be allowed ONLY for requests that are made from the page. Mainly, I don't want anyone to be able to programmatically make a large number of calls and then reverse-engineer the algorithm that sits behind the service. Any clever ideas on how this can be done? Maybe I'm over-thinking this? Thanks in advance. (A token-check sketch follows this item.)

    Read the article
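
    There is no airtight way to stop a logged-in user from scripting the service, but a common mitigation is a per-session token plus server-side rate limiting: the page stores a random token in session state and renders it into its script, and the service rejects calls that don't echo it back. A minimal sketch under the assumption that the service runs with ASP.NET compatibility enabled (so HttpContext.Current and Session are available); the operation, parameters, and key name are all illustrative:

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Activation;
        using System.Web;

        [ServiceContract(Namespace = "")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]
        public class DynamicPopulateService
        {
            // The page stores the same random value under this key in Session
            // and renders it into the script that calls the service.
            public const string TokenKey = "ServiceCallToken";

            [OperationContract]
            public string GetItems(string token, string query)
            {
                HttpContext context = HttpContext.Current;
                string expected = (context != null && context.Session != null)
                    ? context.Session[TokenKey] as string
                    : null;

                // Reject anything that doesn't carry the token the page was given.
                if (string.IsNullOrEmpty(expected) || !string.Equals(expected, token))
                    throw new UnauthorizedAccessException("Call did not originate from the page.");

                // ... the real lookup the DynamicPopulate control needs ...
                return "result for " + query;
            }
        }

    The page would set Session[DynamicPopulateService.TokenKey] = Guid.NewGuid().ToString("N") on load and pass the value as part of the DynamicPopulate context key; counting calls per session on the server adds a throttle against bulk probing.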

  • MVC Custom Model Binder Binding Multiple Values

    - by BMD86
    Hello everyone, I have a scenario in which I have multiple sources to bind to my model. For one, I have a view tied to a strongly-typed model, but this scenario also entails posting data to this view from a third-party site. Essentially, what I believe I am after in the custom model binding is to inspect the form values in the Request object within the HttpContext to see if I have a field such as "postedFirstName". If so, I want to bind that value instead of the "FirstName" textbox in my view. I've done a good bit of searching but have not found anything that exactly addresses such a scenario. This link was close, I thought, but not quite: http://stackoverflow.com/questions/970335/asp-net-mvc-mixing-custom-and-default-model-binding. Any input is greatly appreciated! (A binder sketch follows this item.)

    Read the article
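
    A minimal sketch of that idea, derived from DefaultModelBinder so normal binding still applies to everything else. The Person model type is an assumption for illustration; only the "postedFirstName"/"FirstName" field names come from the question:

        using System.ComponentModel;
        using System.Web.Mvc;

        public class PersonModelBinder : DefaultModelBinder
        {
            protected override void BindProperty(
                ControllerContext controllerContext,
                ModelBindingContext bindingContext,
                PropertyDescriptor propertyDescriptor)
            {
                // Prefer the third party's "postedFirstName" field over the view's
                // own "FirstName" textbox whenever it is present in the posted form.
                if (propertyDescriptor.Name == "FirstName")
                {
                    string posted = controllerContext.HttpContext.Request.Form["postedFirstName"];
                    if (!string.IsNullOrEmpty(posted))
                    {
                        propertyDescriptor.SetValue(bindingContext.Model, posted);
                        return;
                    }
                }

                base.BindProperty(controllerContext, bindingContext, propertyDescriptor);
            }
        }

    Registering it in Application_Start with ModelBinders.Binders.Add(typeof(Person), new PersonModelBinder()) limits the override to that one model type.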

  • an asp.net routing issue

    - by Adam Right
    My route implementation in Global.asax:

        protected void Application_Start(object sender, EventArgs e)
        {
            this.intRoutes(RouteTable.Routes);
        }

        void intRoutes(RouteCollection Rts)
        {
            Rts.MapPageRoute("search", "{language}/{page}", "~/search.aspx");
            Rts.MapPageRoute("category", "{language}/{name}/{no}/{categoryname}", "~/category.aspx");
            Rts.MapPageRoute("product", "{language}/{name}/{no}/{productname}", "~/product.aspx");
        }

    The problem is: if I use the product routing on a hyperlink, like this:

        <asp:HyperLink ID="hyProduct" runat="server" Text="something"
            NavigateUrl='<%# HttpUtility.UrlDecode(((Page)HttpContext.Current.Handler).GetRouteUrl("product",
                new { language = getUIFromHelper(), name = getNameFromHelper(), no = Eval("code"),
                      productname = getProductNameFromHelper(Eval("name")) })) %>' />

    everything goes fine and the link is written as expected, like /en/products/06.008.001.150.0510/davis-fish-seeker-green, but when I click that link the category.aspx page runs instead of product.aspx. Am I missing something? (A route-disambiguation sketch follows this item.)

    Read the article
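
    Incoming URLs are matched against the route table in registration order, and the "category" and "product" routes have exactly the same four-segment shape, so "category" is registered first and always wins. Since the generated link already carries "products" in the {name} position, one fix is to make that segment a literal (or a per-route constraint). A sketch of both options; the literal values "products"/"categories" are taken from the example URL, the rest is illustrative:

        using System.Web.Routing;

        public static class RouteConfig
        {
            public static void RegisterRoutes(RouteCollection routes)
            {
                routes.MapPageRoute("search", "{language}/{page}", "~/search.aspx");

                // Distinct literal segments make the two URL shapes unambiguous.
                routes.MapPageRoute("category",
                    "{language}/categories/{no}/{categoryname}", "~/category.aspx");
                routes.MapPageRoute("product",
                    "{language}/products/{no}/{productname}", "~/product.aspx");

                // Alternative: keep the {name} segment but constrain it per route, e.g.
                // routes.MapPageRoute("product",
                //     "{language}/{name}/{no}/{productname}", "~/product.aspx",
                //     false, null,
                //     new RouteValueDictionary { { "name", "products" } });
            }
        }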

  • How to CURL and avoid timeout death (Twitter Down) [migrated]

    - by David
    Twitter is down right now, and one of my site's home pages relies on getting data from Twitter (relies is the problem -- it should be more of an accessory feature, as it just shows the follower count from its feed). Here's the code in question:

        function socials_Twitter_GetFollowerCount($username) {
            $method = function () use ($username) {
                return file_get_contents('https://api.twitter.com/1/users/show.json?screen_name='.$username.'&include_entities=true');
            };
            $json = cache('bmdtwitter', 3600, $method, false);
            $json = json_decode($json, true);
            return intval($json['followers_count']);
        }

    What is a good way to make it so that, if Twitter is down (or not responsive for some reasonable amount of time), our site doesn't appear to be down? I think the timeout may be defaulting to 30-60 seconds or more.

    Read the article

  • Is there an apt command to download a deb file from the repositories to the current directory?

    - by Lekensteyn
    I am often interested in the installation triggers (postinst, postrm) or in certain parts of packages (like /usr/share and /etc). Currently, I am running the following command to retrieve the source code: apt-get source [package-name]. The downside is that this download is often much bigger than the binary package and does not reflect the installation tree. Right now, I am downloading the packages through http://packages.ubuntu.com/: search for [package-name], select the package, click on amd64/i386 for download, then download the actual file. This takes too long for me, and as someone who really likes the shell, I would like to do something like the following (imaginary) command: apt-get get-deb-file [package-name]. I could not find anything like this in the apt-get manual page. The closest I found was the --download-only switch, but this puts the package in /var/cache/apt/archives (which requires root permissions) and not in the current directory.

    Read the article

  • Visual Studio Load Tests Virtual Users Simulation

    - by Eldar
    Hello, I'm currently working on writing a load testing application that takes advantage of Load Test in Visual Studio 2010. The load test will simulate 20 users on the same machine, and I need some data to be shared in memory between all simulated users. I was surprised I couldn't find documentation answering the following question: what separates each virtual user's running context from the others? Does each virtual user run its tests in its own process? Maybe in its own AppDomain? Or just on its own thread? I need to know because if each user runs tests in its own process, then the in-memory cache isn't shared and is created once per user instead of once for all of them, which is bad for me. (A diagnostic sketch follows this item.)

    Read the article
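
    One way to settle the question empirically is to have each virtual user log its process, AppDomain, and thread IDs, and to guard the shared data so it is built at most once per AppDomain. A sketch of that idea; LoadExpensiveData and the cache contents are placeholders, and nothing here is specific to the Visual Studio load test API:

        using System;
        using System.Collections.Generic;
        using System.Diagnostics;
        using System.Threading;

        public static class SharedCache
        {
            private static readonly object Sync = new object();
            private static volatile Dictionary<string, string> _data;

            // Built at most once per AppDomain; shared by all virtual users that
            // run as threads inside that AppDomain, duplicated otherwise.
            public static Dictionary<string, string> Data
            {
                get
                {
                    if (_data == null)
                    {
                        lock (Sync)
                        {
                            if (_data == null)
                                _data = LoadExpensiveData();
                        }
                    }
                    return _data;
                }
            }

            // Identifies the execution context of the caller.
            public static string WhereAmI()
            {
                return string.Format("pid={0} appdomain={1} thread={2}",
                    Process.GetCurrentProcess().Id,
                    AppDomain.CurrentDomain.Id,
                    Thread.CurrentThread.ManagedThreadId);
            }

            private static Dictionary<string, string> LoadExpensiveData()
            {
                // Placeholder for the real in-memory cache population.
                return new Dictionary<string, string>();
            }
        }

    Writing SharedCache.WhereAmI() to the test trace for every virtual user shows directly whether the pid/AppDomain values repeat (one shared cache) or differ (one copy per user).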

  • Are HTTP requests cached? [closed]

    - by nischayn22
    Many HTTP requests are sent repeatedly by browsers on almost every page load, such as the request for the jQuery .js file, etc. Since these files are already used on so many sites, don't modern browsers keep a cache for this? I am thinking of a system where the browser has a cached copy of a .js file that is used very frequently. On a new request for the .js file, it sends the server a request for a hash of the .js file (provided the server can reply to that) and compares the returned hash with the cached copy's hash... the rest is intuitive. (An ETag sketch follows this item.)

    Read the article
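
    Browsers already do something very close to this through standard HTTP caching: the server sends Cache-Control and ETag headers, and on revalidation the browser sends If-None-Match so the server can answer 304 Not Modified without resending the body. A minimal ASP.NET handler sketch of the server side, using a content hash as the ETag; the handler name and file path are illustrative:

        using System;
        using System.IO;
        using System.Security.Cryptography;
        using System.Web;

        public class ScriptHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                string path = context.Server.MapPath("~/Scripts/jquery.js");   // illustrative path
                byte[] bytes = File.ReadAllBytes(path);

                // A content hash makes a convenient ETag: it changes only when the file does.
                string etag;
                using (MD5 md5 = MD5.Create())
                    etag = "\"" + Convert.ToBase64String(md5.ComputeHash(bytes)) + "\"";

                if (context.Request.Headers["If-None-Match"] == etag)
                {
                    // The browser's cached copy is still valid: answer 304 with no body.
                    context.Response.StatusCode = 304;
                    context.Response.SuppressContent = true;
                    return;
                }

                context.Response.ContentType = "application/javascript";
                context.Response.Cache.SetCacheability(HttpCacheability.Public);
                context.Response.Cache.SetETag(etag);
                context.Response.Cache.SetMaxAge(TimeSpan.FromDays(7));
                context.Response.BinaryWrite(bytes);
            }

            public bool IsReusable { get { return true; } }
        }

    Serving common libraries from a shared CDN URL with a long max-age gets the cross-site effect the question is after, since the browser keys its cache on the full URL.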

  • JQuery Ajax Always Fires Error Function

    - by CccTrash
    AddFileToDB.ashx:

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "application/json";
            context.Response.Write("{ \"filename\": \"test.jpg\" }");
        }

    jQuery:

        $.ajax({
            url: 'AddFileToDB.ashx',
            dataType: 'json',
            success: function(data) {
                alert(data.filename);
            },
            error: function(data) {
                alert('error');
            },
        });

    This always results in the error function being called, and I don't know why. AddFileToDB.ashx gets hit, but success never runs. Thoughts?

    Read the article

  • Dash empty and blank Software Center

    - by fra_casula
    I have a fresh installation of Ubuntu 12.10.1 LTS x64. My dash is empty, search doesn't work, and the applications list is empty. If I run the Ubuntu Software Center it freezes on a blank screen. I tried: restarting; reinstalling the Unity packages; deleting the ~/.local/share/zeitgeist folder; other tricks found in other Ask Ubuntu answers. UPDATE: from ~/.xsession-errors:

        I/O warning : failed to load external entity "/home/francesco/.compiz/session/10699c9c27649d05db134910329935534100000027470033"
        Initializing session options...done
        (compiz:2813): GConf-CRITICAL **: gconf_client_add_dir: assertion `gconf_valid_key (dirname, NULL)' failed
        ** (zeitgeist-datahub:3070): WARNING **: zeitgeist-datahub.vala:227: Unable to get name "org.gnome.zeitgeist.datahub" on the bus!

    EDIT: Deleting the ~/.cache folder solves the Ubuntu Software Center blank-screen issue, thanks to jokerdino!!

    Read the article

  • How to get the computer name (hostname) in a web application?

    - by Filipe
    Hi, how can I get the client's computer name in a web application? The user is on a network. Regards. I already tried this option:

        string IP = System.Web.HttpContext.Current.Request.UserHostAddress;
        string compName = DetermineCompName(IP);
        System.Net.IPHostEntry teste = System.Net.Dns.GetHostEntry(IP);
        ssresult = IP + " - " + teste.HostName;

        // TODO: Write implementation for action
        private static string DetermineCompName(string IP)
        {
            IPAddress myIP = IPAddress.Parse(IP);
            IPHostEntry GetIPHost = Dns.GetHostEntry(myIP);
            string[] compName = GetIPHost.HostName.ToString().Split('.');
            return compName[0];
        }

    All of that gives me only the IP :/

    Read the article

  • Set modified date = created date or null on record creation?

    - by User
    I've been following the convention of adding created and modified columns to most of my database tables. I have also been leaving the modified column as null on record creation and only setting a value on an actual modification. The other alternative is to set the modified date equal to the created date on record creation. I've been doing it the former way, but I recently ran into one con which is seriously making me think of switching. I needed to set a database cache dependency to find out if any existing data has been changed or new data added. Instead of being able to do the following:

        SELECT MAX(modified) FROM customer

    I have to do this:

        SELECT GREATEST(MAX(created), MAX(modified)) FROM customer

    The negative is that it's a more complicated and slower query. Another thing is that file systems, I believe, usually use the second convention of setting modified date = created date on creation. What are the pros and cons of the two methods? That is, what are the issues to consider?

    Read the article

  • CSS not loading when site is viewed via Windows VPN

    - by Dreamling
    An internal site has recently been redesigned, but IE8 does not seem to load the new CSS rules, and only when the site is viewed via VPN. I really have no clue what to look for. I can't reproduce the problem, but it has apparently been affecting the client for the last month. I've suggested: reloading IE8, checking Internet permissions, and flushing the cache. I'm not really certain in which direction to search for the answer. Is it likely to be a server permissions issue? A VPN connection issue? A rare IE8 CSS bug?

    Read the article

  • apt-get could not open lock file

    - by user114373
    I am trying to get an NFS client running on a SheevaPlug running Debian 2.6.22. The host is Ubuntu 12.04 and claims (from showmount -e) to be exporting the desired directory. There is no showmount binary on the SheevaPlug, so I'm trying to install it from the nfs-common package: # apt-get install nfs-common. The response ends with:

        E: could not open lock file /var/cache/apt/archives/lock - open (no such file or directory)
        E: Unable to lock the download directory.

    I am root while doing this. Similar errors arise when trying to install other packages. How do I correct these errors so apt-get will do its work?

    Read the article

  • Navigation for ASP.NET Web Forms project published on codeplex

    Navigation for ASP.NET Web Forms manages movement and data passing between aspx Pages in a unit testable manner. There is no Client-side logic, so it works in all browsers, and no Server-side cache, so it works with the browser back button. Features include loosely coupled Pages, typed data passing, empty code-behinds, context-sensitive bread crumb trail, ASP.NET Data binding integration, automatic ASP.NET Ajax history navigation and many more. The source code, binaries and comprehensive documentation...

    Read the article

  • Why are the proposed BADSIG (on apt-get update) fixes secure?

    - by EvanED
    I'm running apt-get update, and I see errors like:

        W: GPG error: http://us.archive.ubuntu.com precise Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]>

    It's not hard to find instructions on how to fix these problems, for instance by asking for the new keys with apt-key adv --recv-keys or by rebuilding the cache, so I'm not asking how to fix them. But why is this the right thing to do? Why is "oh, I need new keys? Cool, go get new keys" not just defeating the purpose of having a signed repository in the first place? Are the keys signed by a master key that apt-key checks? Should we be doing some additional validation to ensure that we're getting legitimate keys?

    Read the article

  • How to implement proper identification and session management for JSON POST requests?

    - by IBr
    I have a minor messaging connection to the server from a website via JSON requests. I have a single endpoint which distributes requests according to identification data. I am using an asynchronous server and handle data as it comes in. Now I am thinking about extending the requests with some kind of session. What is the best way to define a session? Get a cookie when registering and then send a token with each request for as long as the session runs? Should I implement a timeout for the token? Are there alternative methods? Can I cache tokens for same-origin requests? What could I use on the client side (web browser)? How about safety? What techniques should I use to throw away requests with malformed data, or data that is too big, without choking the server? Should I worry? (A token-validation sketch follows this item.)

    Read the article
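
    One common pattern is an expiring, signed token: the server issues it at login (in a cookie or the response body), the client echoes it with each JSON request, and the server validates the signature and expiry statelessly. A minimal C# sketch of that idea; the secret key, expiry window, and token layout are all illustrative assumptions, not part of the question:

        using System;
        using System.Security.Cryptography;
        using System.Text;

        public static class SessionTokens
        {
            // In practice this key would come from configuration, not source code.
            private static readonly byte[] Secret = Encoding.UTF8.GetBytes("change-me-32-bytes-minimum------");
            private static readonly TimeSpan Lifetime = TimeSpan.FromMinutes(30);

            // Token layout: "<userId>|<expiresUtcTicks>|<hmac>" (assumes userId has no '|').
            public static string Issue(string userId)
            {
                string payload = userId + "|" + DateTime.UtcNow.Add(Lifetime).Ticks;
                return payload + "|" + Sign(payload);
            }

            public static bool TryValidate(string token, out string userId)
            {
                userId = null;
                if (string.IsNullOrEmpty(token)) return false;

                string[] parts = token.Split('|');
                if (parts.Length != 3) return false;                         // malformed

                string payload = parts[0] + "|" + parts[1];
                if (!SlowEquals(Sign(payload), parts[2])) return false;      // bad signature

                long ticks;
                if (!long.TryParse(parts[1], out ticks)) return false;
                if (new DateTime(ticks, DateTimeKind.Utc) < DateTime.UtcNow) return false;  // expired

                userId = parts[0];
                return true;
            }

            private static string Sign(string payload)
            {
                using (var hmac = new HMACSHA256(Secret))
                    return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
            }

            // Constant-time comparison to avoid leaking signature prefixes via timing.
            private static bool SlowEquals(string a, string b)
            {
                if (a.Length != b.Length) return false;
                int diff = 0;
                for (int i = 0; i < a.Length; i++) diff |= a[i] ^ b[i];
                return diff == 0;
            }
        }

    Oversized or malformed payloads are best rejected before any parsing, for example by capping the request body size at the web server, so validation work stays cheap per request.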

  • Google gets a new interface and makes its search smarter

    Google gets a new interface and makes its search smarter. Google has given itself a spring facelift. The search engine's home page is so plain and understated that the slightest change there does not go unnoticed. As of today, Google.com has a new layout. Next to a logo with brighter colors, the site is divided into three panels: on the left, the options; in the center, the results; and on the right, the advertising. The options panel, usually hidden by default, is now brought forward, a way to push users to try those search filters, which are not new but are still too little known by the general public. For example, "update": real-time search...

    Read the article

  • SQL SERVER - Data Pages in Buffer Pool - Data Stored in Memory Cache

    This will drop all the clean buffers so we will be able to start again from there. Now, run the following script and check the execution plan of the query. Have you ever wondered what types of data are there in your cache? During SQL Server Trainings, I am usually asked if there is any [...]

    Read the article

  • Accessing HttpApplication.Application variables from a class

    - by Young Ninja
    I set up various global parameters in Global.asax, like this:

        Application["PagePolicies"] = "~/Lab/Policies.aspx";
        Application["PageShare"] = "/Share.aspx";
        Application["FileSearchQueries"] = Server.MapPath("~/Resources/SearchQueries.xml");
        ...

    I have no problem accessing these variables from .ascx.cs or .aspx.cs files, i.e. files that are part of the Web content. However, I can't seem to access Application from basic class objects (i.e. standalone .cs files). I read somewhere to use a slight variation in .cs files, as follows, but it always throws an exception when used:

        String file = (String)System.Web.HttpContext.Current.Application["FileSearchQueries"];

    (A null-check sketch follows this item.)

    Read the article
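
    The usual cause of an exception here is that System.Web.HttpContext.Current is null (the code runs outside a request, e.g. on a background thread or before a context exists) or the key was never set. A small defensive helper, as a sketch; the class, method, and fallback behavior are illustrative:

        using System.Web;

        public static class AppSettingsHelper
        {
            // Returns the Application["..."] value, or null instead of throwing
            // when there is no current HTTP context or the key has not been set.
            public static string GetApplicationValue(string key)
            {
                HttpContext context = HttpContext.Current;
                if (context == null || context.Application == null)
                    return null;

                return context.Application[key] as string;
            }
        }

        // Usage from any standalone class:
        //   string file = AppSettingsHelper.GetApplicationValue("FileSearchQueries");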

  • How to download and install from the internet: Skype and RealPlayer

    - by ADAM
    I had corrupted Win7 with a blue screen of death, and it stopped rebooting, installing, or going into safe mode -- just dead. So I installed Linux (Ubuntu). I am still unable to install my wireless stick modem for surfing the internet wirelessly from my provider, vividwireless. I don't know where or how to download programs and apps from the net, like RealPlayer and Skype and others, or how to install them; I can't even find them after downloading. Can I revive, reboot or fix my dead Win7 OS from Linux? I can connect to the internet from a wireless network in the neighborhood called linksys!! What is linksys anyway? How do I use Linux Ubuntu the same way as Windows, i.e. installing, using the CD-ROM for installation, downloading, cleaning history and cache, and system restore? Please email me at: [email protected]. Any help, step by step. Many thanks.

    Read the article
