Search Results

Search found 20353 results on 815 pages for 'website review'.

Page 390/815

  • MVC Application Design

    - by Paul Brown
    Hello, I am about to create my first proper application in ASP.NET MVC 3. It is basically a jobs site with 3 levels:
    1) Users - no registration; can view all jobs posted on the website
    2) Posters - need to register and log in to post adverts
    3) Admin - need to register and log in to post adverts and review postings before they go live
    Would you suggest I use the same Jobs controller for the three levels mentioned above, with a LIST action to show jobs to "Users" and CREATE & EDIT actions for the "Posters" & "Admin"? Thanks, Paul
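
    A minimal sketch of how that single-controller layout could look, assuming MVC 3's built-in role-based authorization is in use (the role names, action bodies, and the Review action are illustrative, not from the question):

        using System.Web.Mvc;

        public class JobsController : Controller
        {
            // Level 1 - Users: no login needed to browse live adverts.
            public ActionResult Index()
            {
                return View(/* fetch live jobs */);
            }

            // Levels 2 and 3 - Posters and Admins create and edit adverts.
            [Authorize(Roles = "Poster, Admin")]
            public ActionResult Create()
            {
                return View();
            }

            [Authorize(Roles = "Poster, Admin")]
            public ActionResult Edit(int id)
            {
                return View(/* fetch advert by id */);
            }

            // Level 3 - Admins alone review postings before they go live.
            [Authorize(Roles = "Admin")]
            public ActionResult Review(int id)
            {
                return View(/* fetch posting awaiting approval */);
            }
        }

    With this shape, one controller owns all the job routes and the [Authorize] attribute does the per-level gating, rather than a controller per level.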

    Read the article

  • cookies handling on webrequest and response

    - by manish patel
    I have created an application that has a function, Mainpost, built to post data to HTTPS sites. I want to handle cookies in this function. How can I do this?

        public string Mainpost(string website, string content)
        {
            // this is what we are sending
            string post_data = content;

            // this is where we will send it
            string uri = website;

            // create a request
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
            request.KeepAlive = false;
            request.ProtocolVersion = HttpVersion.Version10;
            request.Method = "POST";

            // turn our request string into a byte stream
            byte[] postBytes = Encoding.ASCII.GetBytes(post_data);

            // this is important - make sure you specify type this way
            request.ContentType = "application/x-www-form-urlencoded";
            request.ContentLength = postBytes.Length;

            Stream requestStream = request.GetRequestStream();

            // now send it
            requestStream.Write(postBytes, 0, postBytes.Length);
            requestStream.Close();

            // grab the response and print it out to the console
            // along with the status code
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();
            string str = (new StreamReader(response.GetResponseStream()).ReadToEnd());
            Console.WriteLine(response.StatusCode);
            return str;
        }
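
    For the cookie question itself, the usual approach is a CookieContainer attached to each request; a minimal sketch (the class and field names are made up for illustration):

        using System;
        using System.Net;

        public class CookieAwarePoster
        {
            // One shared container: cookies set by a response are replayed
            // automatically on later requests created through it.
            private readonly CookieContainer cookies = new CookieContainer();

            public HttpWebRequest CreateRequest(string uri)
            {
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
                request.CookieContainer = cookies;   // the key line
                return request;
            }
        }

    With request.CookieContainer set, HttpWebResponse.Cookies is also populated, so the cookies a server sets can be inspected after GetResponse().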

    Read the article

  • How can I hide the taxonomy field for authenticated users but show it for other users in Drupal 6?

    - by Jaymie
    I have a Drupal (v6.17) Content Type which includes a Taxonomy field. I want to hide this from ordinary Authenticated Users, but keep it available to my Site Contributor role users, so they can review and then assign tags to user-created nodes. I've tried overriding the Node Add/Edit form in Panels 3 by creating a panel variant especially for Authenticated Users, which would exclude the Taxonomy field. However, the Taxonomy field is bundled in with the "General Form" controls - without showing this, I don't get the Title and Body fields. Is there a way I can either include the Title and Body fields without Taxonomy, OR hide just the Taxonomy field when the Authenticated User role creates a node? I realise there's a CCK field which might be able to help me out here, but how do I tie that to the Taxonomy module? Any help gratefully received.

    Read the article

  • How do I track down sporadic ASP.NET performance problems in a production environment?

    - by Steve Wortham
    I've had sporadic performance problems with my website for a while now. 90% of the time the site is very fast, but occasionally it is just really, really slow. I mean like 5-10 seconds load time kind of slow. I thought I had narrowed it down to the server I was on, so I migrated everything to a new dedicated server from a completely different web hosting company, but the problems continue. I guess what I'm looking for is a good tool that'll help me track down the problem, because it's clearly not the hardware. I'd like to be able to log certain events in my ASP.NET code and have that same logger also track server performance/resources at the time. I could then look back at the logs and see exactly what my website was doing at the time of extreme slowness. Is there a .NET logging system that'll allow me to make calls into it with code while simultaneously tracking performance? What would you recommend?
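
    As an illustration of the kind of hook that gives you both pieces in one place, here is a sketch of an IHttpModule that times every request and logs a coarse CPU reading for the slow ones (the module name and 3-second threshold are made up; the CPU counter is a rough system-wide figure, not a per-request one):

        using System.Diagnostics;
        using System.Web;

        public class SlowRequestLoggerModule : IHttpModule
        {
            private static readonly PerformanceCounter Cpu =
                new PerformanceCounter("Processor", "% Processor Time", "_Total");

            public void Init(HttpApplication app)
            {
                app.BeginRequest += (s, e) =>
                    app.Context.Items["requestTimer"] = Stopwatch.StartNew();

                app.EndRequest += (s, e) =>
                {
                    var sw = app.Context.Items["requestTimer"] as Stopwatch;
                    if (sw == null) return;
                    sw.Stop();
                    if (sw.ElapsedMilliseconds > 3000)   // log only the slow ones
                    {
                        Trace.WriteLine(string.Format(
                            "SLOW: {0} ms, CPU {1:F0}%, URL {2}",
                            sw.ElapsedMilliseconds,
                            Cpu.NextValue(),
                            app.Context.Request.RawUrl));
                    }
                };
            }

            public void Dispose() { }
        }

    The same EndRequest hook could hand the entry to whatever logger is already in use (log4net, NLog, etc.) instead of Trace.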

    Read the article

  • Problems compiling libjingle/gtk+-2.0 for Mac OS X

    - by mindthief
    Hi All, I'm trying to compile libjingle on Mac OS X Snow Leopard. The INSTALL file said to './configure', 'make' and 'make install', as usual, but make fails for me. Initially it gave some messages indicating that I didn't have pkg-config installed (I guess OS X doesn't come with it?), so I downloaded pkg-config from http://pkgconfig.freedesktop.org/releases/ Now I get this message:

        Package gtk+-2.0 was not found in the pkg-config search path.
        Perhaps you should add the directory containing `gtk+-2.0.pc'
        to the PKG_CONFIG_PATH environment variable
        No package 'gtk+-2.0' found

    I tried to install GTK by using the script at SourceForge: http://sourceforge.net/projects/gtk-osx/ (this is the site the GTK website points to). Running the script didn't really seem to do anything; here is the output:

        $ ./gtk-osx-build-setup.sh
        Checking out jhbuild (2.27.3) from git...
        From git://git.gnome.org/jhbuild
         * tag 2.27.3 -> FETCH_HEAD
        Installing jhbuild...
        Installing jhbuild configuration...
        Installing gtk-osx moduleset files...
        Done.
        $

    And I still get that error message about "Package gtk+-2.0 not found" while making libjingle. Help will be appreciated, thanks!

    Read the article

  • umbraco front end site stopped working suddenly

    - by Srilakshmi
    Hi All, I created a web application and placed the default.aspx page in the root folder of the Umbraco install (i.e., the httpdocs folder) and the application DLL in the bin folder. I used the name "Default.aspx" as other names were not working. The issue is that all pages are now redirecting to the default.aspx page (I haven't made any config changes anywhere in the Umbraco setup). Suspecting this as the root cause, I removed the default.aspx page and its DLL from the bin folder, and now every page shows:

        The resource cannot be found.
        Description: HTTP 404. The resource you are looking for (or one of its
        dependencies) could have been removed, had its name changed, or is
        temporarily unavailable. Please review the following URL and make sure
        that it is spelled correctly.
        Requested URL: /default.aspx

    I'm stuck here and struggling to resolve it. Please help me out on this. Thanks, Srilakshmi

    Read the article

  • Facebook new js api and cross-domain file

    - by vondip
    Hi all, I am building a simple Facebook iframe application. Since the code is separate from Facebook anyway, I've decided to create a Connect website as well. In my Connect website I'm trying to figure out the following: I am using Facebook's new API and I am calling the init function, but I can't seem to figure out where I plug in my cross-domain file. There's no mention of it in their documentation either: http://developers.facebook.com/docs/reference/javascript/FB.init I am referring to these lines of code:

        <div id="fb-root"></div>
        <script>
          window.fbAsyncInit = function() {
            FB.init({appId: 'your app id', status: true, cookie: true, xfbml: true});
          };
          (function() {
            var e = document.createElement('script');
            e.async = true;
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            document.getElementById('fb-root').appendChild(e);
          }());
        </script>

    Read the article

  • How to store multiple cookies through PHP Curl

    - by Ahmad
    'SOUP.IO' is not providing any API, so I am trying to use PHP cURL to log in and submit data. I am able to log in to the website successfully through cURL, but when I try to submit data, it gives me an 'invalid user' error. When I analysed the code and the website, I found that cURL is getting the values of only one or two cookies, whereas when I open the same page in Firefox, it shows me 6-7 cookies related to 'SOUP.IO'. Can someone guide me on how to get all of these cookie values?

    Cookie obtained by cURL: soup_session_id
    Cookies shown in Firefox (but not through cURL): __qca, __utma, __utmb, __utmc, __utmz

    Here is my cURL code:

        $cookie_file_path = getcwd()."/cookie/cookie.txt";
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, 'http://www.soup.io');
        curl_setopt($ch, CURLOPT_VERBOSE, 1);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
        curl_setopt($ch, CURLOPT_HEADER, TRUE);
        curl_setopt($ch, CURLOPT_ENCODING, 'gzip,deflate');
        curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file_path);
        curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie_file_path);
        curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) FirePHP/0.4');
        curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
        $result = curl_exec($ch);
        curl_close($ch);
        print_r($result);

    Can someone guide me on this? Thanks in advance.

    Read the article

  • PHP .htaccess issue, specific/dynamic keywords

    - by Kunal
    Here is my .htaccess file's content:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteRule ^online-products$ products.php?type=online
        RewriteRule ^land-products$ products.php?type=land
        RewriteRule ^payment-methods$ payment-methods.php
        RewriteRule ^withdrawal-methods$ withdrawal-methods.php
        RewriteRule ^deposit-methods$ deposit-methods.php
        RewriteRule ^product-bonuses$ product-bonuses.php
        RewriteRule ^law-and-regulations$ law-and-regulations.php
        RewriteRule ^product-news$ product-news.php
        RewriteRule ^product-games$ product-games.php
        RewriteRule ^no-products$ no-products.php
        RewriteRule ^page-not-found$ notfound.php
        RewriteCond %{SCRIPT_FILENAME} !-f
        RewriteCond %{SCRIPT_FILENAME} !-d
        RewriteRule ^casinos/(.*)$ product.php?id=$1
        RewriteCond %{SCRIPT_FILENAME} !-f
        RewriteCond %{SCRIPT_FILENAME} !-d
        RewriteRule ^(.*)$ cms.php?link=$1
        ErrorDocument 404 /notfound.php

    What I am trying to achieve is that the first set of rules applies to specific keywords that should map to hard-coded pages, while any other keyword should be passed to cms.php as a parameter, as you can see. The problem is that every keyword is getting redirected to cms.php, whereas only keywords not already hard-coded in the .htaccess file should go there. Example:

        www.sitename.com/online-products   -> www.sitename.com/products.php?type=online
        www.sitename.com/about-the-website -> www.sitename.com/cms.php?id=about-the-website
        www.sitename.com/product-news      -> www.sitename.com/product-news.php

    Another issue I am facing is that I cannot use any keyword with a space: "online-products" is fine, but I can't use "online products". Please help me out with your expert knowledge. Many thanks in advance for your kind help. Appreciate it.

    Read the article

  • How do I deploy my ASP MVC project to my Win7 system?

    - by MedicineMan
    Hi, I am deploying my first ASP.NET MVC project. The project runs just fine; I would like to take the next step and run it outside of my Visual Studio environment on my local IIS. I am running Windows 7 and Visual Studio 2008, and I have created a basic ASP.NET MVC project. In my solution, I find the project I would like to deploy, right-click, and select Publish. I have backed up C:\inetpub\wwwroot\ and would like to deploy there. I accept all defaults and click the "Publish" button. The Output Build window shows 1 project failed. Basically it says that it is unable to add any of the binaries to the site, copy files, or create new directories... Access is denied. When I click "Publish" at work, I don't get these errors. What do I have to do here to make the website available to the rest of my home network? Also, wwwroot appears to be read-only; unselecting that property in the folder's property dialog doesn't seem to help - it still appears to be read-only afterwards.

    Read the article

  • Identify machines behind a router uniquely based on ipaddress

    - by Amith George
    Some background first: I have a .NET client agent installed on each of the machines in the LAN. They interact with my central server [website], also on the same LAN. It is important for my website to figure out which machines can talk to each other. For example, machines on one subnet cannot directly talk to machines on another subnet without configuring the routers and such, but machines in the same subnet should be able to talk to each other directly. The problem I am facing is when the LAN is set up as in Figure 1. Because Comp1, Comp2 and Comp3 are behind a router, they have the IP addresses 192.168.1.2 through 192.168.1.4. My client agent on these machines reports these IP addresses back to the server. However, machines Comp4 and Comp5 also have the same IP addresses. Thus, as far as my server is concerned, there are two machines with the same IP address. Not just that: because the subnet mask is 255.255.255.0 for all machines, my server is fooled into thinking that Comp1 can directly talk to Comp5, which is not possible. So how do I solve this? What do I need to change in my client or in my server to support this scenario? These two are the only things in my control.
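
    For reference, the reachability test that gets fooled here is just a masked comparison; a small illustrative C# version:

        using System.Net;

        static class SubnetCheck
        {
            // Two addresses look directly reachable when they match under the
            // mask -- exactly the test the duplicated 192.168.1.x ranges defeat.
            public static bool SameSubnet(IPAddress a, IPAddress b, IPAddress mask)
            {
                byte[] ab = a.GetAddressBytes();
                byte[] bb = b.GetAddressBytes();
                byte[] mb = mask.GetAddressBytes();
                for (int i = 0; i < ab.Length; i++)
                {
                    if ((ab[i] & mb[i]) != (bb[i] & mb[i]))
                        return false;
                }
                return true;
            }
        }

    With the addresses in the question, Comp1's and Comp5's reported IPs compare equal under 255.255.255.0, which is why the address alone cannot distinguish the two machines.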

    Read the article

  • How to process AJAX requests more securely in PHP?

    - by animuson
    Ok, so I want to send AJAX requests to my website from my Flash games to process data, but I don't want people downloading the games, decompiling them, and then sending fake requests to be processed, so I'm trying to figure out the most secure way to handle this in the PHP files. My first idea was to use Apache's built-in Authorization module to require a username and password to access the pages on a separate subdomain of my website, but then I'd have to include that username and password in the AJAX request anyway, so that seems kind of pointless to even try. My current option looks pretty promising, but I want to make sure it will work. Basically it checks the IP address sent via REMOTE_ADDR to make sure it's an IP address that my server runs on:

        <?php
        $allowed = array("64.120.211.89", "64.120.211.90");
        if (!in_array($_SERVER['REMOTE_ADDR'], $allowed))
            header("HTTP/1.1 403 Forbidden");
        ?>

    Both of those IP addresses point to my server. Things I'm worried about:
    1) If I send a request from Flash/ActionScript, will that affect the IP address in any way?
    2) Is it possible for malicious users to change the IP address that is sent with REMOTE_ADDR to one of my IP addresses?
    Any other ways you would suggest that might be more secure?

    Read the article

  • CSS column height different on Opera/IE to FF

    - by Infiniti Fizz
    Hi all, Thanks to everyone who helped with my last question, but I've got a new browser-specific problem: For some reason, the image navigator (not yet functioning) on a website I'm working on is currently not displaying in the correct place in Firefox. It appears in the right place in IE8 and Opera, but Firefox seems to have a problem with it. As can be seen in the first image below, the imageContainer div (the image and the left/right arrows) appears on top of the footer; this is how it should look, i.e. how it looks in IE8 and Opera. But in the second image, the imageContainer div is cutting into the footer div for some reason, and I don't know why. imageContainer has a margin-top: 110px; to get it in the right place at the bottom of its column. There are 2 columns: the left housing the paragraphs and imageContainer, and the right housing the calendar and contact details. The footer div also has clear: both;. Also, it's not just the image that is falling into the footer - the arrows are as well, only they are the same colour as the footer, so this isn't immediately apparent. Any ideas why it isn't displaying correctly? Is there a better way of aligning the imageContainer to the bottom of its column (to keep the box shape of the website) other than using margin-top to position it? Thanks in advance, infinitifizz

    Read the article

  • Webcrawler, feedback?

    - by Jan Kuboschek
    Hey folks, every once in a while I need to automate data-collection tasks from websites. Sometimes I need a bunch of URLs from a directory, sometimes I need an XML sitemap (yes, I know there is lots of software for that, and online services). Anyway, as a follow-up to my previous question, I've written a little webcrawler that can visit websites:

    • Basic crawler class to easily and quickly interact with one website.
    • Override "doAction(String URL, String content)" to process the content further (e.g. store it, parse it).
    • Concept allows for multi-threading of crawlers; all class instances share the processed and queued lists of links.
    • Instead of keeping track of processed and queued links within the object, a JDBC connection could be established to store the links in a database.
    • Currently limited to one website at a time; could be expanded by adding an externalLinks stack and adding to it as appropriate.

    JCrawler is intended for quickly generating XML sitemaps or parsing websites for the information you want. It's lightweight. Is this a good/decent way to write the crawler, given the limitations above?

    http://pastebin.com/VtgC4qVE - Main.java
    http://pastebin.com/gF4sLHEW - JCrawler.java
    http://pastebin.com/VJ1grArt - HTMLUtils.java

    Thanks for your feedback in advance! :)

    Read the article

  • E-Commerce Security: Only Credit Card Fields Encrypted?!

    - by bizarreunprofessionalanddangerous
    I'd like your opinions on how a major bricks-and-mortar company is running the security for its shopping Web site. After a recent update, when you are logged into your shopping account, the session is now not secured. No 'https', no browser 'lock'. All the personal contact info, shopping history -- and if I'm not mistaken, submit and change password -- are being sent unencrypted. There is a small frame around the credit card fields that is https. There's a little notice: "Our website is secure. Our website uses frames and because of this the secure icon will not appear in your browser." On top of this, the most prominent login fields for the site are broken, and haven't been fixed for a week or longer (giving the distinct impression they have no clue what's going on and can't be trusted with anything). Now is it just me -- or is this simply incomprehensible for a billion-dollar company, a significant shopping site, in the year 2010? No lock. "We use frames" (maybe they forgot "Best viewed in IE4"). Customers are complaining, as you can see from their FAQ "explaining" why you aren't seeing https. I'm getting nowhere trying to convince customer service that they REALLY need to do something about this, and am about to head for the CEO. But I just want to make sure this is as BIZARRE and unprofessional and dangerous a situation as I think it is. (I'm trying to visualize what their Web technical team consists of. I'm getting A) some customer service reps who were given a 3-hour training course on Web site maintenance, B) a 14-year-old boy in his bedroom masquerading as a major technical services company, C) a guy in a hut in a jungle with an e-commerce book from 1996.)

    Read the article

  • WCF service consuming passively issued SAML token

    - by Neillyboy
    What is the best way to pass an existing SAML token from a website already authenticated via a passive STS? We have built an identity provider which is issuing passive claims to the website for authentication, and we have this working. Now we would like to add some WCF services into the mix, calling them from the context of the already-authenticated web application. Ideally we would just like to pass the SAML token on without doing anything to it (i.e. without adding new claims or re-signing). All of the examples I have seen require the ActAs STS implementation - but is this really necessary? It seems a bit bloated for what we want to achieve. I would have thought a simple implementation passing the bootstrap token into the channel - using the CreateChannelActingAs or CreateChannelWithIssuedToken mechanism (and setting ChannelFactory.Credentials.SupportInteractive = false) to call the WCF service with the correct binding (what would that be?) - would have been enough. We are using the Fabrikam example code as reference, but as I say, we think the ActAs functionality here is overkill for what we are trying to achieve.
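
    For what it's worth, a rough sketch of that bootstrap-token route, assuming WIF (Microsoft.IdentityModel), a federation binding on the client endpoint, and saveBootstrapTokens="true" in the web app's config; IMyService and the endpoint name are placeholders, and whether the relying service accepts the token without re-issuance is exactly the open question here:

        using System.ServiceModel;
        using System.Threading;
        using Microsoft.IdentityModel.Claims;
        using Microsoft.IdentityModel.Protocols.WSTrust;

        [ServiceContract]
        public interface IMyService   // placeholder contract
        {
            [OperationContract]
            void Ping();
        }

        public static class ServiceCaller
        {
            public static IMyService OpenChannel()
            {
                // The SAML token that authenticated this web session
                // (only present when saveBootstrapTokens="true").
                var identity = (IClaimsIdentity)Thread.CurrentPrincipal.Identity;
                var bootstrapToken = identity.BootstrapToken;

                // Wire WIF into a federation-bound factory and hand the
                // existing token straight through -- no ActAs round trip.
                var factory = new ChannelFactory<IMyService>("fedEndpoint");
                factory.ConfigureChannelFactory();
                factory.Credentials.SupportInteractive = false;
                return factory.CreateChannelWithIssuedToken(bootstrapToken);
            }
        }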

    Read the article

  • iframe form not submitting in IE7

    - by Lauren
    For some reason I can submit the form data on this Review and Submit page in Chrome and FF but not IE7: https://checkout.netsuite.com/s.nl?c=659197&n=1&sc=4&category=confirm Email:[email protected] Pass:test03 Click on "here" where it says "Your Third Party Shipper Numbers (To enter one, click here.)" I removed the JavaScript that automatically refreshes the page, to make sure it wasn't refreshing before anything was submitted. Could the difference in IE7 have to do with the fact that the domain of the form (forms.netsuite.com) is different from the domain of the parent page (checkout.netsuite.com) and it's being submitted over HTTPS?

    Read the article

  • Which is more secure: GET or POST for sending parameters with cURL in PHP?

    - by Steve
    I want to connect in a secure way to an API, and I am using cURL over HTTPS and SSL. Now, I was wondering what is better in terms of security: sending the data through GET or POST?

        $ch = curl_init("http://api.website.com/connect.php?user=xxx&pass=xxxx");
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
        $result = curl_exec($ch);
        curl_close($ch);

    Or:

        $param['user'] = 'xxxx';
        $param['pass'] = 'xxxx';
        $ch = curl_init("http://api.website.com/connect.php");
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $param);
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
        $result = curl_exec($ch);
        curl_close($ch);

    I also noticed that POST is much slower at retrieving the data.

    Read the article

  • JSoup - Select only one listobject

    - by Zyril
    I'm trying to extract certain data from a website using JSoup and Java. So far I've been successful in what I'm trying to achieve.

        <ul class="beverageFacts">
          <li><span>Årgång</span><strong>**2009**&nbsp;</strong></li>

    I want to extract what is inside the ** in the HTML above. I can do this with the following JSoup code:

        doc.select("ul.beverageFacts li:lt(1) strong");

    I'm using :lt(1) because several more list items follow that I want to omit. Now to my problem: there's an optional information tab on the site I'm extracting data from, and it also has a class called "beverageFacts". My code will at the moment extract that data too, which I don't want it to do. That markup is further down in the source of the website, and I've tried to use the :lt(1) indexer there as well, but it won't work:

        <div id="beverageMoreFacts" style="display: block">
          <ul class="beverageFacts"><li class="half">
            <span> Färg</span><strong> Ljusgul färg.</strong>

    My overall result is that I extract "2009 Ljusgul färg." instead of only "2009". How can I write my code so it will only extract the first part, which it successfully does, and omit the rest? EDIT: I get the same result using:

        doc.select("ul.beverageFacts li:eq(0) strong");

    Thanks, Z

    Read the article

  • Which syntax is better for return value?

    - by Omar Kooheji
    I've been doing a massive code review, and one pattern I notice all over the place is this:

        public bool MethodName()
        {
            bool returnValue = false;
            if (expression)
            {
                // do something
                returnValue = MethodCall();
            }
            else
            {
                // do something else
                returnValue = Expression;
            }
            return returnValue;
        }

    This is not how I would have done it; I would have just returned the value as soon as I knew what it was. Which of these two patterns is more correct? I stress that the logic always seems to be structured such that the return value is assigned in one place only and no code is executed after it's assigned.
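
    For comparison, the early-return version being described would look like this (same placeholder names as the snippet above, which stand in for real code):

        public bool MethodName()
        {
            if (expression)
            {
                // do something
                return MethodCall();
            }
            // do something else
            return Expression;
        }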

    Read the article

  • Adding a div element inside a panel?

    - by Bar Mako
    I'm working with GWT and I'm trying to add Google Maps to my website. Since I want to use Google Maps v3, I'm using JSNI. In order to display the map on my website, I need to create a div element with id="map" and fetch it in the map's initialization function. I did so, and it worked out fine, but its location on the page is odd, and I want it attached to a panel I'm creating in my code. So my question is: how can I do it? Can I somehow create a div with GWT inside a panel? I've tried to create a new HTMLPanel like this:

        runsPanel.add(new HTMLPanel("<div id=\"map\"></div>"));

    where runsPanel is the panel I want it attached to. Yet it fails to retrieve the div when I use the following initialization function:

        private native JavaScriptObject initializeMap() /*-{
            var latLng = new $wnd.google.maps.LatLng(31.974, 34.813); // around Rishon-LeTsiyon
            var mapOptions = {
                zoom : 14,
                center : latLng,
                mapTypeId : $wnd.google.maps.MapTypeId.ROADMAP
            };
            var mapDiv = $doc.getElementById('map');
            if (mapDiv == null) {
                alert("MapDiv is null!");
            }
            var map = new $wnd.google.maps.Map(mapDiv, mapOptions);
            return map;
        }-*/;

    (It pops the alert "MapDiv is null!") Any ideas? Thanks

    Read the article

  • How to deploy a number of disparate project types?

    - by niteice
    This question is similar to http://stackoverflow.com/questions/1900269/whats-the-best-way-to-deploy-an-executable-process-on-a-web-server. The situation is this: I'm developing a product that needs to be deployed to a web server. It consists of 4 website projects, a background service, a couple of command-line tools, and two assemblies shared by all of these components. Now, I also happen to administer the server that this product will be deployed on, so I'm familiar with everything that may need to be done to perform an update:

    • Copy website files
    • Replace the service binary
    • Install updated components in the GAC
    • Configure IIS
    • Update database schema

    After some research it seems that, to reduce deployment time and to be able to let the other sysadmins handle deployment, I want to deploy all of these as an MSI - except that I don't know a thing about installers. I know VS can generate web deployment projects, but where do I go from there? Being able to simply click Next a few times in an installer is my goal for deploying updates. It would also be nice to modularize it, so that, for example, I could distribute the four websites among multiple servers and have everything appear as individual components in the installer, and as one entity in Add/Remove Programs. Is all of this too much to ask of a single package?

    Read the article

  • Which is best Postfix Log analyzer?

    - by Anto Binish Kaspar
    Which is the best Postfix log analyzer? We are looking for a good log analyzer for Postfix. We need to analyze the following:

    • How many mails were queued?
    • How many mails were not delivered?
    • Why were mails not delivered?

    Also, is it possible to view the subject for each mail's status instead of the message ID? I mean, to review the status of a single mail. We are using the Sawmill analyzer now, but management is not satisfied with its reports, since they are missing the single-message status and subject.

    Read the article

  • C# - Google-like query engine

    - by MRFerocius
    Guys, hope you are fine. I have to make a web project (very simple). I will have a DB with 2 tables; one table has 2 fields. From the web page I need a Google-like search query. For example, I have Movie Title and Movie Review in the table, and I need to be able to search those 2 fields like this: "Best Movie" + Action. I will need to make a query to the DB that searches for the string "Best Movie" together, plus the optional word ACTION, across the 2 fields of the table. Am I clear??? :) Does somebody know if this has already been made, and if it's public and free, and where to get it? :) Thanks in advance. EDIT: My concern is how to translate the Google-like symbols ("", +, -, ~) to build a valid query.
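
    One rough sketch of that translation step in C# (the parsing is deliberately simplified: the output targets SQL Server full-text CONTAINS syntax, a bare + is treated like an unprefixed word, and ~ is ignored):

        using System;
        using System.Collections.Generic;
        using System.Text.RegularExpressions;

        static class QueryTranslator
        {
            // Turns e.g.  "Best Movie" + Action  into  "Best Movie" AND "Action"
            public static string ToContainsClause(string query)
            {
                var required = new List<string>();
                var excluded = new List<string>();
                // Matches "quoted phrases" or words optionally prefixed with + or -
                foreach (Match m in Regex.Matches(query, @"""([^""]+)""|([+-]?)(\w+)"))
                {
                    if (m.Groups[1].Success)
                        required.Add("\"" + m.Groups[1].Value + "\"");
                    else if (m.Groups[2].Value == "-")
                        excluded.Add("\"" + m.Groups[3].Value + "\"");
                    else
                        required.Add("\"" + m.Groups[3].Value + "\"");
                }
                string clause = string.Join(" AND ", required.ToArray());
                foreach (string term in excluded)
                    clause += " AND NOT " + term;
                return clause;
            }
        }

    The returned clause would then be passed as a parameter to something like SELECT ... WHERE CONTAINS((MovieTitle, MovieReview), @clause).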

    Read the article

  • asp.net Configuration Error on host

    - by zey
    I uploaded my ASP.NET site to a hosting provider, and the site browses correctly. But when I go to the login URL, it shows me this error:

        Configuration Error
        Description: An error occurred during the processing of a configuration
        file required to service this request. Please review the specific error
        details below and modify your configuration file appropriately.
        Parser Error Message: It is an error to use a section registered as
        allowDefinition='MachineToApplication' beyond application level. This
        error can be caused by a virtual directory not being configured as an
        application in IIS.

        Source Error:
        Line 23:   </assemblies>
        Line 24: </compilation>
        Line 25: <authentication mode="Forms">
        Line 26:   <forms loginUrl="~/Account/Login.aspx" timeout="2880" />
        Line 27: </authentication>

    How can I fix it?

    Read the article
