Search Results

Search found 16467 results on 659 pages for 'request filtering'.


  • Using Knockout.js to bind bootstrap daterange picker and parse span contents

    - by jmkr
    I'm new to Knockout and trying to get what should be a simple task up and running. I'm working on an ASP.NET MVC 4 app and want to bind a date range picker so that changing the range fires AJAX requests that update Highcharts graph data. I'm using Dan Grossman's bootstrap-themed date range picker and it's been great so far (https://github.com/dangrossman/bootstrap-daterangepicker). The basic goal is to watch the span that this jQuery date range picker updates, and then use Knockout to pass that value to another part of the app for the AJAX request. I've tried everything I can find online: valueUpdate: 'change' on the span, using some jQuery inside the binding to get the same goal done, and using a subscribe function to watch the value of the span before and after the date picker is used. Apparently that approach relies on the jQuery .change() event handler, which only works on inputs, selects, and textareas, not spans. Here's the fiddle of what I have so far: http://jsfiddle.net/eyygK/9/ Appreciate any help and input.
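
    One approach worth trying (a sketch only; the element id, URL and option names are illustrative, and the callback signature is what I recall from the plugin's README): rather than watching the span, feed the picker's own callback into Knockout observables and subscribe to those for the AJAX call.

        var vm = {
            startDate: ko.observable(),
            endDate: ko.observable()
        };

        // The plugin invokes this callback with the chosen start/end dates.
        $('#reportrange').daterangepicker({}, function (start, end) {
            vm.startDate(start);
            vm.endDate(end);
        });

        // Re-query the Highcharts data whenever either observable changes.
        ko.computed(function () {
            var from = vm.startDate(), to = vm.endDate();
            if (!from || !to) { return; }
            $.getJSON('/Charts/GetData', { from: from.toString(), to: to.toString() }, function (data) {
                // update the chart series here
            });
        });

        ko.applyBindings(vm);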

    Read the article

  • Google Charts Through CURL

    - by swt83
    I have a PHP class that helps me generate URLs for custom charts using the Google Chart service. These URLs work fine when I load them in my browser, but I'm trying to pull them using cURL so I can serve the charts on secure https websites. Whenever I try to pull a chart via cURL, I get an Error 400 Bad Request. Any idea how to get around this? Everything I have tried has failed.

        $url = urldecode($_GET['url']);
        $session = curl_init($url);                          // Open the cURL session
        curl_setopt($session, CURLOPT_HEADER, false);        // Don't return HTTP headers
        curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // Do return the contents of the call
        $image = curl_exec($session);                        // Make the call
        #header("Content-Type: image/png");                  // Set the content type appropriately
        curl_close($session);                                // And close the session
        die($image);
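
    A guess at the cause rather than a confirmed fix: the decoded chart URL contains characters such as '|', ':' and spaces that a browser encodes automatically but cURL sends verbatim, which Google can answer with 400 Bad Request. Re-encoding the query string, or POSTing it to the chart endpoint, often avoids the problem. A sketch (parameter values are illustrative):

        $url = urldecode($_GET['url']);
        list($base, $query) = explode('?', $url, 2);

        $session = curl_init('https://chart.apis.google.com/chart');
        curl_setopt($session, CURLOPT_POST, true);            // POST sidesteps URL-encoding/length issues
        curl_setopt($session, CURLOPT_POSTFIELDS, $query);    // e.g. "cht=p3&chs=250x100&chd=t:60,40"
        curl_setopt($session, CURLOPT_HEADER, false);
        curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
        $image = curl_exec($session);
        curl_close($session);

        header('Content-Type: image/png');
        die($image);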

    Read the article

  • Calling WebMethods / WebService using jquery is blocking

    - by Sir Psycho
    Hi, I'm generating a file on the server, which takes some time. For this, I have a hidden iframe whose .src attribute I set to an aspx page, i.e. iframe.src = "/downloadFile.aspx". While this is taking place, I'd like a call to a web service to return the progress. To do this, I thought I could use window.setInterval or window.setTimeout, but JavaScript seems to be blocked as soon as I set the iframe src attribute. Does anyone know how to get around this, or perhaps a different approach to try? I have also tried HTTP handlers, but the request never reaches the server, so I'm assuming it is a browser/JavaScript issue.
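
    One likely culprit to rule out (an assumption, not something stated in the question): ASP.NET serialises concurrent requests that need writable session state, so while downloadFile.aspx is busy generating the file, any progress call in the same session queues up behind it and looks "blocked". A sketch of a progress handler that opts out of the write lock:

        using System.Web;
        using System.Web.SessionState;

        // IReadOnlySessionState (or dropping session entirely) lets this run in
        // parallel with the long-running download page.
        public class ProgressHandler : IHttpHandler, IReadOnlySessionState
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                // The download page would write its progress somewhere shared,
                // e.g. Application state keyed by a token (illustrative).
                object progress = context.Application["downloadProgress"] ?? 0;
                context.Response.ContentType = "text/plain";
                context.Response.Write(progress.ToString());
            }
        }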

    Read the article

  • How to implement Comet server side with Python?

    - by Morgan Cheng
    I once tried to implement Comet in PHP. I soon found that PHP is not suitable for Comet, since each HTTP request occupies one process/thread; as a result, it doesn't scale well. I just installed mod_python in my XAMPP. I thought it would be easy to implement Comet with Python's asynchronous programming, but I still can't get a clue how to implement it. Any ideas on how to implement Comet with mod_python?
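
    mod_python is probably the wrong tool for the same reason PHP was: each request still ties up an Apache thread or process. A minimal long-polling sketch with an event-driven server such as Tornado instead (an alternative suggestion, not a mod_python recipe; URLs and the message format are made up):

        import tornado.ioloop
        import tornado.web

        waiters = []  # clients currently parked, waiting for an event

        class UpdatesHandler(tornado.web.RequestHandler):
            @tornado.web.asynchronous
            def get(self):
                # Hold the connection open without blocking a thread.
                waiters.append(self)

        class PublishHandler(tornado.web.RequestHandler):
            def post(self):
                message = self.get_argument("msg")
                for waiter in waiters:
                    waiter.write(message)
                    waiter.finish()
                del waiters[:]

        application = tornado.web.Application([
            (r"/updates", UpdatesHandler),
            (r"/publish", PublishHandler),
        ])

        if __name__ == "__main__":
            application.listen(8888)
            tornado.ioloop.IOLoop.instance().start()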

    Read the article

  • .NET HttpListener - no traffic when listening on "https://*:8080/" when the browser proxy is set?

    - by Greg
    Hi, Background: I can get HttpListener working fine for HTTP traffic; I'm having trouble with HTTPS traffic, however. QUESTION: How can I change the code below so that a browser request to an "https" URL will actually be picked up by my HttpListener? Notes: At the moment, with Firefox's proxy settings set to "localhost:8080", when I listen for traffic on port 8080 ("https://*:8080/") and enter an HTTPS URL in Firefox, I get no traffic picked up. (When I listen for just http and enter normal http URLs, it works fine.)

        _httpListener = new HttpListener();
        _httpListener.Prefixes.Add("https://*:8080/");
        _httpListener.Start();

    thanks
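
    Two things worth checking (assumptions based on how HttpListener and browser proxies normally behave, not a verified fix for this exact setup): an https:// prefix only receives traffic once a server certificate is bound to the port, and a browser told to use localhost:8080 as a proxy sends CONNECT tunnelling requests for https URLs, which HttpListener never surfaces, so "no traffic" is expected in the proxy scenario. The certificate binding looks roughly like this (thumbprint and appid are placeholders):

        netsh http add sslcert ipport=0.0.0.0:8080 certhash=<certificate thumbprint> appid={00112233-4455-6677-8899-AABBCCDDEEFF}

    (On Windows XP / Server 2003 the equivalent tool is httpcfg.exe.)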

    Read the article

  • Browser Detection and Zend MVC

    - by Vincent
    I have a PHP application using the Zend MVC framework. The entry point for every request to the application is /public/index.php. I have a Browser class with functions that check whether the user's browser is compatible with the application. My dilemma is that index.php is executed for every controller call, so there is a chance this check runs multiple times within the same page, and the redirection becomes an issue. What's the best way to solve the looping issue? Thanks
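
    A sketch of one way to avoid the loop (class, method and URL names are illustrative): do the check once per request in a front controller plugin rather than in index.php, and skip the redirect when the request is already for the warning page.

        <?php
        class My_Plugin_BrowserCheck extends Zend_Controller_Plugin_Abstract
        {
            public function preDispatch(Zend_Controller_Request_Abstract $request)
            {
                // Already on the "unsupported browser" page - nothing to do.
                if ($request->getControllerName() === 'unsupported') {
                    return;
                }

                if (!Browser::isCompatible()) {   // the question's own Browser class, method name invented
                    $this->getResponse()
                         ->setRedirect('/unsupported')
                         ->sendResponse();
                    exit;
                }
            }
        }

        // registered once in the bootstrap (ZF 1.x):
        // $frontController->registerPlugin(new My_Plugin_BrowserCheck());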

    Read the article

  • Importing Thawte trial certificates into a Java keystore

    - by lindelof
    Hello, I'm trying to configure a Tomcat server with SSL. I've generated a key pair thus:

        $ keytool -genkeypair -alias tomcat -keyalg RSA -keystore keys

    Next I generate a certificate signing request:

        $ keytool -certreq -keyalg RSA -alias tomcat -keystore keys -file tomcat.csr

    Then I copy and paste the contents of tomcat.csr into a form on Thawte's website, asking for a trial SSL certificate. In return I get two certificates delimited with -----BEGIN ... -----END, which I save as tomcat.crt and thawte.crt. (Thawte calls the second certificate a 'Thawte Test CA Root' certificate.) When I try to import either of them, it fails:

        $ keytool -importcert -alias tomcat -file tomcat.crt -keystore keys
        Enter keystore password:
        keytool error: java.lang.Exception: Failed to establish chain from reply

        $ keytool -importcert -alias thawte -file thawtetest.crt -keystore keys
        Enter keystore password:
        keytool error: java.lang.Exception: Input not an X.509 certificate

    Adding the -trustcacerts option to either of these commands doesn't change anything. Any idea what I am doing wrong here?
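
    For what it's worth, the usual sequence is to import the test CA root first, under its own alias and with -trustcacerts, and only then import the server certificate into the alias that holds the key pair. A sketch (file names as in the question; whether this clears the "Input not an X.509 certificate" error depends on the file itself):

        $ keytool -importcert -trustcacerts -alias thawte-test-ca -file thawte.crt -keystore keys
        $ keytool -importcert -alias tomcat -file tomcat.crt -keystore keys

    If the root import still reports "Input not an X.509 certificate", the pasted file may contain stray characters or the wrong encoding; re-saving it as plain text containing only the -----BEGIN CERTIFICATE----- / -----END CERTIFICATE----- block, or converting it with openssl x509 -in thawte.crt -out thawte.pem -outform PEM, is worth trying.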

    Read the article

  • Oracle BI Server Modeling, Part 1- Designing a Query Factory

    - by bob.ertl(at)oracle.com
      Welcome to Oracle BI Development's BI Foundation blog, focused on helping you get the most value from your Oracle Business Intelligence Enterprise Edition (BI EE) platform deployments.  In my first series of posts, I plan to show developers the concepts and best practices for modeling in the Common Enterprise Information Model (CEIM), the semantic layer of Oracle BI EE.  In this segment, I will lay the groundwork for the modeling concepts.  First, I will cover the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing. Oracle BI EE Query Cycle The purpose of the Oracle BI Server is to bridge the gap between the presentation services and the data sources.  There are typically a variety of data sources in a variety of technologies: relational, normalized transaction systems; relational star-schema data warehouses and marts; multidimensional analytic cubes and financial applications; flat files, Excel files, XML files, and so on. Business datasets can reside in a single type of source, or, most of the time, are spread across various types of sources. Presentation services users are generally business people who need to be able to query that set of sources without any knowledge of technologies, schemas, or how sources are organized in their company. They think of business analysis in terms of measures with specific calculations, hierarchical dimensions for breaking those measures down, and detailed reports of the business transactions themselves.  Most of them create queries without knowing it, by picking a dashboard page and some filters.  Others create their own analysis by selecting metrics and dimensional attributes, and possibly creating additional calculations. The BI Server bridges that gap from simple business terms to technical physical queries by exposing just the business focused measures and dimensional attributes that business people can use in their analyses and dashboards.   After they make their selections and start the analysis, the BI Server plans the best way to query the data sources, writes the optimized sequence of physical queries to those sources, post-processes the results, and presents them to the client as a single result set suitable for tables, pivots and charts. The CEIM is a model that controls the processing of the BI Server.  It provides the subject areas that presentation services exposes for business users to select simplified metrics and dimensional attributes for their analysis.  It models the mappings to the physical data access, the calculations and logical transformations, and the data access security rules.  The CEIM consists of metadata stored in the repository, authored by developers using the Administration Tool client.     Presentation services and other query clients create their queries in BI EE's SQL-92 language, called Logical SQL or LSQL.  The API simply uses ODBC or JDBC to pass the query to the BI Server.  Presentation services writes the LSQL query in terms of the simplified objects presented to the users.  The BI Server creates a query plan, and rewrites the LSQL into fully-detailed SQL or other languages suitable for querying the physical sources.  For example, the LSQL on the left below was rewritten into the physical SQL for an Oracle 11g database on the right. 
Logical SQL:

    SELECT "D0 Time"."T02 Per Name Month" saw_0,
           "D4 Product"."P01  Product" saw_1,
           "F2 Units"."2-01  Billed Qty  (Sum All)" saw_2
    FROM "Sample Sales"
    ORDER BY saw_0, saw_1

Physical SQL:

    WITH SAWITH0 AS (
        select T986.Per_Name_Month as c1,
               T879.Prod_Dsc as c2,
               sum(T835.Units) as c3,
               T879.Prod_Key as c4
        from Product T879 /* A05 Product */ ,
             Time_Mth T986 /* A08 Time Mth */ ,
             FactsRev T835 /* A11 Revenue (Billed Time Join) */
        where ( T835.Prod_Key = T879.Prod_Key and T835.Bill_Mth = T986.Row_Wid)
        group by T879.Prod_Dsc, T879.Prod_Key, T986.Per_Name_Month
    )
    select SAWITH0.c1 as c1, SAWITH0.c2 as c2, SAWITH0.c3 as c3
    from SAWITH0
    order by c1, c2

Probably everybody reading this blog can write SQL or MDX.  However, the trick in designing the CEIM is that you are modeling a query-generation factory.  Rather than hand-crafting individual queries, you model behavior and relationships, thus configuring the BI Server machinery to manufacture millions of different queries in response to random user requests.  This mass production requires a different mindset and approach than when you are designing individual SQL statements in tools such as Oracle SQL Developer, Oracle Hyperion Interactive Reporting (formerly Brio), or Oracle BI Publisher.

The Structure of the Common Enterprise Information Model (CEIM)

The CEIM has a unique structure specifically for modeling the relationships and behaviors that fill the gap from logical user requests to physical data source queries and back to the result.  The model divides the functionality into three specialized layers, called Presentation, Business Model and Mapping, and Physical, as shown below. Presentation services clients can generally only see the presentation layer, and the objects in the presentation layer are normally the only ones used in the LSQL request.  When a request comes into the BI Server from presentation services or another client, the relationships and objects in the model allow the BI Server to select the appropriate data sources, create a query plan, and generate the physical queries.  That's the left to right flow in the diagram below.  When the results come back from the data source queries, the right to left relationships in the model show how to transform the results and perform any final calculations and functions that could not be pushed down to the databases.

Business Model

Think of the business model as the heart of the CEIM you are designing.  This is where you define the analytic behavior seen by the users, and the superset library of metric and dimension objects available to the user community as a whole.  It also provides the baseline business-friendly names and user-readable dictionary.  For these reasons, it is often called the "logical" model--it is a virtual database schema that persists no data, but can be queried as if it is a database. The business model always has a dimensional shape (more on this in future posts), and its simple shape and terminology hides the complexity of the source data models. Besides hiding complexity and normalizing terminology, this layer adds most of the analytic value, as well.  This is where you define the rich, dimensional behavior of the metrics and complex business calculations, as well as the conformed dimensions and hierarchies.  
It contributes to the ease of use for business users, since the dimensional metric definitions apply in any context of filters and drill-downs, and the conformed dimensions enable dashboard-wide filters and guided analysis links that bring context along from one page to the next.  The conformed dimensions also provide a key to hiding the complexity of many sources, including federation of different databases, behind the simple business model. Note that the expression language in this layer is LSQL, so that any expression can be rewritten into any data source's query language at run time.  This is important for federation, where a given logical object can map to several different physical objects in different databases.  It is also important to portability of the CEIM to different database brands, which is a key requirement for Oracle's BI Applications products. Your requirements process with your user community will mostly affect the business model.  This is where you will define most of the things they specifically ask for, such as metric definitions.  For this reason, many of the best-practice methodologies of our consulting partners start with the high-level definition of this layer. Physical Model The physical model connects the business model that meets your users' requirements to the reality of the data sources you have available. In the query factory analogy, think of the physical layer as the bill of materials for generating physical queries.  Every schema, table, column, join, cube, hierarchy, etc., that will appear in any physical query manufactured at run time must be modeled here at design time. Each physical data source will have its own physical model, or "database" object in the CEIM.  The shape of each physical model matches the shape of its physical source.  In other words, if the source is normalized relational, the physical model will mimic that normalized shape.  If it is a hypercube, the physical model will have a hypercube shape.  If it is a flat file, it will have a denormalized tabular shape. To aid in query optimization, the physical layer also tracks the specifics of the database brand and release.  This allows the BI Server to make the most of each physical source's distinct capabilities, writing queries in its syntax, and using its specific functions. This allows the BI Server to push processing work as deep as possible into the physical source, which minimizes data movement and takes full advantage of the database's own optimizer.  For most data sources, native APIs are used to further optimize performance and functionality. The value of having a distinct separation between the logical (business) and physical models is encapsulation of the physical characteristics.  This encapsulation is another enabler of packaged BI applications and federation.  It is also key to hiding the complex shapes and relationships in the physical sources from the end users.  Consider a routine drill-down in the business model: physically, it can require a drill-through where the first query is MDX to a multidimensional cube, followed by the drill-down query in SQL to a normalized relational database.  The only difference from the user's point of view is that the 2nd query added a more detailed dimension level column - everything else was the same. Mappings Within the Business Model and Mapping Layer, the mappings provide the binding from each logical column and join in the dimensional business model, to each of the objects that can provide its data in the physical layer.  
When there is more than one option for a physical source, rules in the mappings are applied to the query context to determine which of the data sources should be hit, and how to combine their results if more than one is used.  These rules specify aggregate navigation, vertical partitioning (fragmentation), and horizontal partitioning, any of which can be federated across multiple, heterogeneous sources.  These mappings are usually the most sophisticated part of the CEIM. Presentation You might think of the presentation layer as a set of very simple relational-like views into the business model.  Over ODBC/JDBC, they present a relational catalog consisting of databases, tables and columns.  For business users, presentation services interprets these as subject areas, folders and columns, respectively.  (Note that in 10g, subject areas were called presentation catalogs in the CEIM.  In this blog, I will stick to 11g terminology.)  Generally speaking, presentation services and other clients can query only these objects (there are exceptions for certain clients such as BI Publisher and Essbase Studio). The purpose of the presentation layer is to specialize the business model for different categories of users.  Based on a user's role, they will be restricted to specific subject areas, tables and columns for security.  The breakdown of the model into multiple subject areas organizes the content for users, and subjects superfluous to a particular business role can be hidden from that set of users.  Customized names and descriptions can be used to override the business model names for a specific audience.  Variables in the object names can be used for localization. For these reasons, you are better off thinking of the tables in the presentation layer as folders than as strict relational tables.  The real semantics of tables and how they function is in the business model, and any grouping of columns can be included in any table in the presentation layer.  In 11g, an LSQL query can also span multiple presentation subject areas, as long as they map to the same business model. Other Model Objects There are some objects that apply to multiple layers.  These include security-related objects, such as application roles, users, data filters, and query limits (governors).  There are also variables you can use in parameters and expressions, and initialization blocks for loading their initial values on a static or user session basis.  Finally, there are Multi-User Development (MUD) projects for developers to check out units of work, and objects for the marketing feature used by our packaged customer relationship management (CRM) software.   The Query Factory At this point, you should have a grasp on the query factory concept.  When developing the CEIM model, you are configuring the BI Server to automatically manufacture millions of queries in response to random user requests. You do this by defining the analytic behavior in the business model, mapping that to the physical data sources, and exposing it through the presentation layer's role-based subject areas. While configuring mass production requires a different mindset than when you hand-craft individual SQL or MDX statements, it builds on the modeling and query concepts you already understand. The following posts in this series will walk through the CEIM modeling concepts and best practices in detail.  
We will initially review dimensional concepts so you can understand the business model, and then present a pattern-based approach to learning the mappings from a variety of physical schema shapes and deployments to the dimensional model.  Along the way, we will also present the dimensional calculation template, and learn how to configure the many additivity patterns.

    Read the article

  • issues with horizontal scrolling using jQuery.ScrollTo / jQuery.SerialScroll

    - by kapil.israni
    Hi, I am trying to develop auto horizontal scrolling for our website using jQuery.ScrollTo / jQuery.SerialScroll. I am not sure if this is the best jQuery library for the job, but if there's something better, please let me know. Here's the behavior I want: check out foursquare's "Recent Activity" list. The data that refreshes will come from an AJAX request that I make every few seconds using window.setInterval. I am not really a CSS/JavaScript guy, so I haven't been able to figure out jQuery.SerialScroll. Here's the website - look at the "Live job Feeds" list. Currently the list does refresh with the data coming from the AJAX call, but I don't see the effect, the animation; in fact, I don't even think serialScroll is being used. Right now I am doing a $('#feed-ticker').prepend(content) to prepend the newly arrived data. You can do a view source to look at the current code. Any help would be really appreciated. Thanks.
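
    If serialScroll turns out to be more than is needed, here is a plugin-free sketch of the foursquare-style effect (the URL and numbers are illustrative): hide each newly arrived item before prepending it, then slide it into view.

        window.setInterval(function () {
            jQuery.get('/feeds/latest', function (content) {
                jQuery(content).hide()
                    .prependTo('#feed-ticker')
                    .slideDown(400);

                // keep the ticker from growing without bound
                jQuery('#feed-ticker').children().slice(20).remove();
            });
        }, 5000);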

    Read the article

  • ASP.NET MVC controllers and widgets

    - by Josemalive
    Hi, I have some doubts about ASP.NET MVC, and I would like to ask a few questions. If I understand correctly, a controller/action is selected from an HTTP request. As one request is used to get one web page, could we call these controllers "page controllers"? My other question is about widgets and the RenderPartial method. If a widget represents a classic ASP.NET web control or user control, and I want to render this widget on a lot of pages, how can I avoid repeating the logic of the widget if that logic lives in the "page controller"? Any help would be appreciated. Thanks in advance. Best Regards. Jose
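
    On the widget question, one common pattern (a sketch; the controller, repository and view names are invented, and Html.RenderAction requires ASP.NET MVC 2 or the MVC 1 Futures assembly): give the widget its own controller action marked [ChildActionOnly] and embed it with Html.RenderAction, so the logic never has to live in any page controller.

        public class WidgetsController : Controller
        {
            [ChildActionOnly]
            public ActionResult RecentItems(int count)
            {
                var model = _repository.GetRecentItems(count);   // hypothetical data access
                return PartialView(model);                       // Views/Widgets/RecentItems.ascx
            }
        }

        <%-- in any view that needs the widget: --%>
        <% Html.RenderAction("RecentItems", "Widgets", new { count = 10 }); %>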

    Read the article

  • iPhone GameKit Picker Fundamental Connection Issues

    - by Kyle
    Hello. This is one of the more interesting things I've seen in iPhone development. The following question has nothing to do with code, because I'm using an SDK example from Apple (the Tanks example). I have a 3GS iPhone and a 3G iPhone, both showing the GameKit picker screen. Both will eventually show the other phone in range just fine (it does take about 25 seconds, though). If I pick the 3G iPhone with my 3GS, the 3G will get a connection request and a connection can be made. However, it absolutely will not work the other way around. Both phones have Bluetooth switched on, and both phones are running the latest OS version. The simple fact is I'm using the SDK example, and it's just not working when the 3G tries to initiate the connection. Is there any way to explain this extremely odd behavior? Thanks a lot for reading!

    Read the article

  • CodePlex Daily Summary for Tuesday, May 04, 2010

    CodePlex Daily Summary for Tuesday, May 04, 2010New ProjectsAlbum photo de club - Club's Photos Album: Un album photos permettant d'afficher les photos et le détails des membres d'un club - Photo album allowing to view photos and details of the membersBlog.Net Blogging Components: Blog.Net server-side blogging components to add a blog to your current ASP.NET website.FilePirate - Really Advanced LAN File Sharing: Really Advanced, yet super easy, LAN Party File Sharing written using the .Net Framework and C#. Ditch DirectConnect or Windows File Sharing at y...Fisiogest: Programa de gestión de una clínica de fisioterapiaIdeaNMR: An online repository of NMR experiment automated setups with wiki type documentation library and client program providing automated experiment setu...Introducción a Unity: Código de ejemplo del uso de Unity en diferentes situaciones. - Registro de clases, instancias e interfaces. - Resolución de clases, instancias e...Iowa City .NET Developers: This is the project site for the Iowa City .NET Developers.isanywhere: A command line utility to see if one or more files (given a filemask) are to be found anywhere inside a specific directory, or elsewhere inside one...LczCode: lczLog4net udp logs viewer: UdpLogViewer is a .NET 4 WinForm application that receives udp messages from log4net and shows them in a grid. It is possible to filter them or sh...New Silverlight XPS Viewer (In Sl4): New Silverlight XPS viewer Novuz: Novuz is a usenet indexer and reporter. It's developed both in Visual Studio 2010 and MonoDevelop, one of the key features of Novuz is that it sho...PodSnatch: PodSnatch is a podcast client that makes it easy to download rss-enclosures. Multiple simultaneous downloads enabled by threading. GUI is built wi...Robot Shootans: A simple top down shooter game where the player has to kill robots running at them. Written in C++ using SDL with various extentionsSharePoint Rsync List: This program will syncronize files and directories from and unc/local/sharepoint to a SharePoint 2007 or 2010 server. Supports of to 2GB files and ...SignInAndStorageLib: SignInAndStorageLib makes properly handling both sign in and storage issues in Xbox 360 XBLIG XNA games simple. Written in C#, SignInAndStorageLib...SilverBBS: ANSI-style bbs experience delivered via Silverlight. Silverlight flip-down counter: A Silverlight widget that enables you to count down towards a preconfigured event on a configured date.SmartieFly: Smartie Fly is a quiz software program written in C# using Silverlight. It uses SQL Server as a backend database. VS2010 Framework Driven Testing: CodedUITests generate a lot of code, and they break on every change to the object under test. Goals: - write new tests manually, but with as litt...WMediaCatalog: Advanced multimedia cataloguer. Allows users to keep their musical collections well organized and provides flexible methods of filtering, serarching WPathFinder: A simple path management application for windows. Functionality includes: - Add/remove/change path entries easily. - Search for all instances of a...Yasminoku: Yasminoku is an open source "Sudoku" alike game totally written in DHTML (JavaScript, CSS and HTML) that uses mouse. Includes sudoku solver. 
This c...New ReleasesAlbum photo de club - Club's Photos Album: App - version 0.4: version 0.4 - Critère d'affichage des membres : nom, année, ville - Navigation entre les images d'un membres - Navigation entre les membres - Affi...Album photo de club - Club's Photos Album: Code - Version 0.4: Code source de la version 0.4BigDecimal: Concept Evaluation Release 2 (BigDecimals): This in the second updates release of BigDecimals. It has the four simple arithmetic rules Addition, Subtract, Multiple and Division.CBM-Command: 2010-05-03: New features in this build Keyboard Shortcuts Panel Swapping Panel Toggling On/Off Toggling 40/80 Columns Confirming Quit Confirming GO64...Directory Linker: Directory Linker 2: This release introduces Undo Support and Symbolic File Link support. More details can be found here http://www.humblecoder.co.uk/?p=141DotNetNuke Skins Pack: DotNetNuke 80 Skins Pack: This released is the first for DNN 4 & 5 with Skin Token Design (legacy skin support on DNN 4 & 5)DTLoggedExec: 1.0.0.0: -FIRST NON-BETA RELEASE! :) -Code cleaned up -Added SetPackageInfo method to ILogProvider interface to make easier future improvements -Deprecated...GenerateTypedBamApi: Version 2.1: Changes in this release: NEW: Support for Office Data Connectivity Components 2010 NEW: Include both x86 and x64 EXE's due to lack of support in ...HobbyBrew Mobile: Beta 1 Refresh: Risolto bug circa il salvataggio di ricette (veniva impostato scorrettamente che si trattava di Mash Design "infusione" se ri-aperte con hobbyBrew)...Home Access Plus+: v4.2: Version 4.2 Added Overrides into the Booking System Some slight CSS changes to the Help Desk Updated the config tool to work anywhere on the LA...Hubble.Net - Open source full-text search database: V0.8.3.0: V0.8.3.0 Show server version in about dialog. Fix a bug of deleting querycache files. V0.8.2.9 Change sql client to support userid and password Ch...IdeaNMR: IdeaNMR Client: This is a client program with an example package.kdar: KDAR 0.0.21: KDAR - Kernel Debugger Anti Rootkit - signature's bases updated - usability increased - NDIS6 MINIPORT_BLOCK checks addedLightWeight Application Server: 0.4.1: One step further to beta - yet another release for c# developers audience only. Changes: 1. API - added a LWAS.Infrastructure.Storage service to d...Log4net udp logs viewer: UdpLogViewer 1.0: First release of UdpLogViewer, version 1.0.MDownloader: MDownloader-0.15.11.58370: Fixed minor bugs.Metabolite Enterprise Libraries for EPiServer CMS using Page Type Builder: Metabolite Enterprise Libraries 1.2 Beta 2: This is the beta release of the Metabolite Enterprise Libraries 1.2 Beta 2 for use with EPiServer 6 and Page Type Builder 1.2 Beta 2.Microsoft Silverlight Analytics Framework: Version 1.4.3 Installer: Pre-release Installer for Visual Studio 2010 and Expression Blend 4 RCSupports both Silverlight 3 and Silverlight 4 Release NotesFixed null referen...MultipointTUIO: Multipoint SDK v1.5 Release: Rebuilt against v1.5 of the Microsoft Multipoint SDK, this mean Windows 7 support (and 64bit I think!)My Notepad: My Notepad: This is the status of My Notepad until now. This is many built in features but has to undergo a lot of modifications. The release does not include ...New Silverlight XPS Viewer (In Sl4): Silverlight XPS Viewer: Background: During my development last week I was working on a Silverlight based XPS viewer. 
During this viewer we came to a situation in which the...NSIS Autorun: NSIS Autorun 0.1.6: This release includes source code, executable binary, files and example materials.Open Diagram: Open Diagram 5.0 Beta May 2010: This is the first beta release of Open Diagram 5.0. Select Crainiate.Diagramming.Examples.Forms as the startup project to view the current Class D...Pocket Wiki: PC Wiki (zip) 1.0.1: PC Version of Pocket Wiki. Unzip and run. Requires .NET Framework 2.0Pocket Wiki: Pocket Wiki 1.0.1 (cab): Pocket Wiki cab installation - requires DotNet 2.0 or greater. Default wiki language is "slash" - a syntax I created that is easy to type on keyboa...Pocket Wiki: Pocket Wiki.sbp: Pocket Wiki Source Code (version .72) - Basic4PPCPublish to Photo Frame: 1.0.2.0: This version adds: add borders to portrait images, for photo frames that crop them incorrectly.Reflection Studio: Reflection Studio 0.1: First download release, it contains a lot of things but allways in beta version. Hope you will like the preview.SharePoint 2010 PowerShell Scripts & Utilities: PSSP2010 Utils 0.1: This is the initial release with SPInstallUtils.psm1 module. This module includes Get-SPPrerequisites and New-SPInstallPackage cmdlets. Refer to th...Silverlight 4.0 Popup Menu: Context Menu for Silverlight 4.0 v1.1 Beta: Multilevel menus are now supported. Added design time support for the PopupMenuItem elements. The project is now under Subversion.Silverlight flip-down counter: FlipDownCounter v1.0: The final release of the Silverlight flip-down counter. Please refer to the included readme file for information on how to use the counter.Stratosphere: Stratosphere 1.0.0.1: Moved scalable block file system implementation to Stratosphere.FileSystemSystem.AddIn Pipeline Builder: Pipeline Builder 1.2: Lots of improvements from the CTP, version 1.0: - Added dialogue for possible overwrite if the file has changed: possibility of ignoring changes (p...ThoughtWorks Cruise Notification Interceptor: 1.0.1: Fixed an issue with the regex that parses the incoming notification. This issue would send failure messages when the build was "fixed".ThreadSafeControls: ThreadSafeControls v0.1: This is the first binary release of the ThreadSafeControls library. I'll call it a pre-alpha release.TracerX Logger/Viewer for .NET: 4.0: View this CodeProject article for documentation on how to use the latest version of the Logger. About the DownloadsVersion: 4.0.1005.1163 Changese...VCC: Latest build, v2.1.30503.0: Automatic drop of latest buildVisual Studio DSite: Lottery Game (Visual C++ 2008): An advanced lottery game made in visual c 2008.VivoSocial: VivoSocial 7.1.3: Version 7.1.3 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.1.3 rel...Xrns2XMod: Xrns2XMod 1.0: Features added Conversion of all possible convertible features between Renoise and MOD / XM. 
FlacBox lib updated (thanks to Yuri) NAudio lib in...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: Databasepatterns & practices – Enterprise LibrarySilverlight ToolkitiTuner - The iTunes CompanionWindows Presentation Foundation (WPF)ASP.NETDotNetNuke® Community EditionMost Active ProjectsIonics Isapi Rewrite Filterpatterns & practices – Enterprise LibraryRawrHydroServer - CUAHSI Hydrologic Information System ServerAJAX Control Frameworkpatterns & practices: Azure Security GuidanceNB_Store - Free DotNetNuke Ecommerce Catalog ModuleBlogEngine.NETTinyProjectDambach Linear Algebra Framework

    Read the article

  • TempData Error in ASP.NET MVC 2

    - by Mauro
    Hi, I'm looking for a description of the reasons why this error is generated and what the possible fixes are. In my ASP.NET MVC 2 controller I just added data passing via TempData.

        Server Error in '/' Application.
        The SessionStateTempDataProvider class requires session state to be enabled.
        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
        Exception Details: System.InvalidOperationException: The SessionStateTempDataProvider class requires session state to be enabled.
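
    The usual cause is simply that session state is unavailable to the controller, since SessionStateTempDataProvider stores TempData in the session. Two hedged options: re-enable session state in web.config, or, if session genuinely has to stay off, swap in your own ITempDataProvider. A sketch (the no-op provider below throws TempData away, which is only acceptable if you don't actually rely on it surviving a redirect):

        <!-- web.config -->
        <system.web>
          <sessionState mode="InProc" />
          <pages enableSessionState="true" />
        </system.web>

        // or, in code:
        using System.Collections.Generic;
        using System.Web.Mvc;

        public class NoSessionTempDataProvider : ITempDataProvider
        {
            public IDictionary<string, object> LoadTempData(ControllerContext controllerContext)
            {
                return new Dictionary<string, object>();
            }

            public void SaveTempData(ControllerContext controllerContext, IDictionary<string, object> values)
            {
                // discarded on purpose
            }
        }

        public class BaseController : Controller
        {
            protected override ITempDataProvider CreateTempDataProvider()
            {
                return new NoSessionTempDataProvider();
            }
        }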

    Read the article

  • Whoosh (full text search) index problem

    - by Rama Vadakattu
    I am having the following problem with the Whoosh full-text search engine. 1. After syncdb I create the initial index from the database objects. 2. It works fine; I can search the data and see the results. 3. After that, in one of my views, I add another document (via signals) to the index (during a request/response). 4. That's it - from then on I cannot search for any data, even data that I successfully found before adding the new document (before step 3).

        ix = storage.open_index()
        writer = ix.writer()
        writer.add_document(.............)

    I have tried hard to resolve this, but I could not. Any ideas on how to resolve this problem?
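
    A guess based on the snippet as shown: the writer is opened but never committed, so the new document is never persisted, and the still-open writer can also hold the index lock and interfere with later operations. A sketch of the usual pattern (the field names are illustrative and must match your schema):

        ix = storage.open_index()
        writer = ix.writer()
        writer.add_document(title=u"...", content=u"...")   # fields must match the schema
        writer.commit()                                      # without this the document is never saved

        # searches after the commit should use a freshly opened searcher
        searcher = ix.searcher()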

    Read the article

  • How to run a servlet program in the NetBeans IDE?

    - by Venkats
    I am new to Java servlets and am learning from the basics. I have a simple servlet program, but I don't know how to run it.

        import java.io.*;
        import javax.servlet.*;
        import javax.servlet.http.*;

        public class HelloWorld extends HttpServlet {
            public void doGet(HttpServletRequest request, HttpServletResponse response)
                    throws ServletException, IOException {
                PrintWriter out = response.getWriter();
                out.println("Hello World");
            }
        }

    How can I run the above program in NetBeans? I am using NetBeans 6.8. What procedures do I have to follow? Thanks in advance.
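
    A sketch of the usual steps in NetBeans 6.8 (menu names from memory, so treat them as approximate): create a "Java Web > Web Application" project with Tomcat or GlassFish as the server, put HelloWorld.java under Source Packages, register it in WEB-INF/web.xml (Servlet 2.5 style, since @WebServlet is not available yet), then right-click the project, choose Run, and browse to the mapped URL, e.g. http://localhost:8080/YourProject/hello.

        <servlet>
            <servlet-name>HelloWorld</servlet-name>
            <servlet-class>HelloWorld</servlet-class>
        </servlet>
        <servlet-mapping>
            <servlet-name>HelloWorld</servlet-name>
            <url-pattern>/hello</url-pattern>
        </servlet-mapping>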

    Read the article

  • ASP.NET MVC Ajax Form: Is enctype correct? Why doesn't file get uploaded?

    - by Fabio Milheiro
    In the case that the user doesn't have JavaScript enabled, I begin the form this way:

        <% using (Html.BeginForm("Create", "Language", FormMethod.Post, new { enctype = "multipart/form-data" })) { %>

    If the user has JavaScript enabled, the following code is used instead:

        <% using (Ajax.BeginForm("Create", "Language", new AjaxOptions { UpdateTargetId = "CommonArea" }, new { enctype = "multipart/form-data" })) { %>

    The problem is this: in the first case, I can get the uploaded file using the following instruction in the business layer:

        // Get the uploaded file
        HttpPostedFile Flag = HttpContext.Current.Request.Files["Flag"];

    In the second case, this instruction doesn't work. How do I upload that file using Ajax.BeginForm? Is the code right? Can anyone with more experience advise on using a jQuery plug-in to upload the file before the form submission? Thank you
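
    One widely used workaround (a sketch; the form id, URL and option values are illustrative): the MVC Ajax helper posts with XMLHttpRequest, which cannot include file inputs in this era, whereas the jQuery Form plugin falls back to a hidden iframe so the file does arrive in Request.Files.

        $(function () {
            $('#createLanguageForm').ajaxForm({
                url: '/Language/Create',
                type: 'post',
                // the plugin detects the file input and switches to iframe transport
                success: function (responseHtml) {
                    $('#CommonArea').html(responseHtml);
                }
            });
        });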

    Read the article

  • Handling redirected URL within Flex app?

    - by fortpointuiguy
    We have a Flex client and a server that uses the Spring/BlazeDS project. After the user logs in and is authenticated, the Spring Security layer sends a redirect to a new URL, which is where our main application is located. However, within the Flex client I'm currently using HTTPService for the initial request, and I get the redirected page sent back to me in its entirety. How can I just get the URL, so that I can use navigateToURL to send the app where it needs to go? Any help would be greatly appreciated. Thanks!

    Read the article

  • Check ReturnUrl is valid before redirecting

    - by Josh
    I'm using ASP.NET Membership and Forms Authentication, and before redirecting to the returnURL I want to validate it. For those unfamiliar with the workflow: if you request a page that requires authentication, you are redirected to a login page, and in the URL you'll see a parameter called ReturnUrl, e.g. http://example.com/login.aspx?ReturnUrl=%2fprotected%2fdefault.aspx. Whether you use this in a redirect such as Response.Redirect(returnURL) or indirectly through the FormsAuthentication.RedirectFromLoginPage method, it is used without validation. FormsAuthentication.RedirectFromLoginPage does have a security check that the redirect isn't leaving the domain, but that still doesn't stop someone from putting in enough random characters to cause an error. I tried using System.IO.File.Exists(Server.MapPath(returnURL)), but given enough illegal characters, Server.MapPath itself throws an error. Note: URL encoding doesn't help, because we are not cleaning a parameter but the primary URL. Any other suggestions for validating or cleaning the returnURL value?
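
    One defensive sketch (under the assumption that only same-application, relative URLs should ever be honoured; the helper name is invented):

        private static bool IsSafeReturnUrl(string returnUrl)
        {
            if (string.IsNullOrEmpty(returnUrl))
                return false;

            // reject absolute ("http://evil.example") and protocol-relative ("//evil") URLs
            if (!returnUrl.StartsWith("/") || returnUrl.StartsWith("//") || returnUrl.StartsWith("/\\"))
                return false;

            // reject anything that is not a well-formed relative reference, which catches the
            // "enough random characters to cause an error" case before Server.MapPath ever sees it
            return Uri.IsWellFormedUriString(returnUrl, UriKind.Relative);
        }

        // usage on the login page:
        // string target = IsSafeReturnUrl(returnUrl) ? returnUrl : FormsAuthentication.DefaultUrl;
        // Response.Redirect(target);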

    Read the article

  • Hello Operator, My Switch Is Bored

    - by Paul White
    This is a post for T-SQL Tuesday #43 hosted by my good friend Rob Farley. The topic this month is Plan Operators. I haven’t taken part in T-SQL Tuesday before, but I do like to write about execution plans, so this seemed like a good time to start. This post is in two parts. The first part is primarily an excuse to use a pretty bad play on words in the title of this blog post (if you’re too young to know what a telephone operator or a switchboard is, I hate you). The second part of the post looks at an invisible query plan operator (so to speak). 1. My Switch Is Bored Allow me to present the rare and interesting execution plan operator, Switch: Books Online has this to say about Switch: Following that description, I had a go at producing a Fast Forward Cursor plan that used the TOP operator, but had no luck. That may be due to my lack of skill with cursors, I’m not too sure. The only application of Switch in SQL Server 2012 that I am familiar with requires a local partitioned view: CREATE TABLE dbo.T1 (c1 int NOT NULL CHECK (c1 BETWEEN 00 AND 24)); CREATE TABLE dbo.T2 (c1 int NOT NULL CHECK (c1 BETWEEN 25 AND 49)); CREATE TABLE dbo.T3 (c1 int NOT NULL CHECK (c1 BETWEEN 50 AND 74)); CREATE TABLE dbo.T4 (c1 int NOT NULL CHECK (c1 BETWEEN 75 AND 99)); GO CREATE VIEW V1 AS SELECT c1 FROM dbo.T1 UNION ALL SELECT c1 FROM dbo.T2 UNION ALL SELECT c1 FROM dbo.T3 UNION ALL SELECT c1 FROM dbo.T4; Not only that, but it needs an updatable local partitioned view. We’ll need some primary keys to meet that requirement: ALTER TABLE dbo.T1 ADD CONSTRAINT PK_T1 PRIMARY KEY (c1);   ALTER TABLE dbo.T2 ADD CONSTRAINT PK_T2 PRIMARY KEY (c1);   ALTER TABLE dbo.T3 ADD CONSTRAINT PK_T3 PRIMARY KEY (c1);   ALTER TABLE dbo.T4 ADD CONSTRAINT PK_T4 PRIMARY KEY (c1); We also need an INSERT statement that references the view. Even more specifically, to see a Switch operator, we need to perform a single-row insert (multi-row inserts use a different plan shape): INSERT dbo.V1 (c1) VALUES (1); And now…the execution plan: The Constant Scan manufactures a single row with no columns. The Compute Scalar works out which partition of the view the new value should go in. The Assert checks that the computed partition number is not null (if it is, an error is returned). The Nested Loops Join executes exactly once, with the partition id as an outer reference (correlated parameter). The Switch operator checks the value of the parameter and executes the corresponding input only. If the partition id is 0, the uppermost Clustered Index Insert is executed, adding a row to table T1. If the partition id is 1, the next lower Clustered Index Insert is executed, adding a row to table T2…and so on. In case you were wondering, here’s a query and execution plan for a multi-row insert to the view: INSERT dbo.V1 (c1) VALUES (1), (2); Yuck! An Eager Table Spool and four Filters! I prefer the Switch plan. My guess is that almost all the old strategies that used a Switch operator have been replaced over time, using things like a regular Concatenation Union All combined with Start-Up Filters on its inputs. Other new (relative to the Switch operator) features like table partitioning have specific execution plan support that doesn’t need the Switch operator either. This feels like a bit of a shame, but perhaps it is just nostalgia on my part, it’s hard to know. Please do let me know if you encounter a query that can still use the Switch operator in 2012 – it must be very bored if this is the only possible modern usage! 2. 
Invisible Plan Operators The second part of this post uses an example based on a question Dave Ballantyne asked using the SQL Sentry Plan Explorer plan upload facility. If you haven’t tried that yet, make sure you’re on the latest version of the (free) Plan Explorer software, and then click the Post to SQLPerformance.com button. That will create a site question with the query plan attached (which can be anonymized if the plan contains sensitive information). Aaron Bertrand and I keep a close eye on questions there, so if you have ever wanted to ask a query plan question of either of us, that’s a good way to do it. The problem The issue I want to talk about revolves around a query issued against a calendar table. The script below creates a simplified version and adds 100 years of per-day information to it: USE tempdb; GO CREATE TABLE dbo.Calendar ( dt date NOT NULL, isWeekday bit NOT NULL, theYear smallint NOT NULL,   CONSTRAINT PK__dbo_Calendar_dt PRIMARY KEY CLUSTERED (dt) ); GO -- Monday is the first day of the week for me SET DATEFIRST 1;   -- Add 100 years of data INSERT dbo.Calendar WITH (TABLOCKX) (dt, isWeekday, theYear) SELECT CA.dt, isWeekday = CASE WHEN DATEPART(WEEKDAY, CA.dt) IN (6, 7) THEN 0 ELSE 1 END, theYear = YEAR(CA.dt) FROM Sandpit.dbo.Numbers AS N CROSS APPLY ( VALUES (DATEADD(DAY, N.n - 1, CONVERT(date, '01 Jan 2000', 113))) ) AS CA (dt) WHERE N.n BETWEEN 1 AND 36525; The following query counts the number of weekend days in 2013: SELECT Days = COUNT_BIG(*) FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; It returns the correct result (104) using the following execution plan: The query optimizer has managed to estimate the number of rows returned from the table exactly, based purely on the default statistics created separately on the two columns referenced in the query’s WHERE clause. (Well, almost exactly, the unrounded estimate is 104.289 rows.) There is already an invisible operator in this query plan – a Filter operator used to apply the WHERE clause predicates. We can see it by re-running the query with the enormously useful (but undocumented) trace flag 9130 enabled: Now we can see the full picture. The whole table is scanned, returning all 36,525 rows, before the Filter narrows that down to just the 104 we want. Without the trace flag, the Filter is incorporated in the Clustered Index Scan as a residual predicate. It is a little bit more efficient than using a separate operator, but residual predicates are still something you will want to avoid where possible. The estimates are still spot on though: Anyway, looking to improve the performance of this query, Dave added the following filtered index to the Calendar table: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear) WHERE isWeekday = 0; The original query now produces a much more efficient plan: Unfortunately, the estimated number of rows produced by the seek is now wrong (365 instead of 104): What’s going on? The estimate was spot on before we added the index! Explanation You might want to grab a coffee for this bit. Using another trace flag or two (8606 and 8612) we can see that the cardinality estimates were exactly right initially: The highlighted information shows the initial cardinality estimates for the base table (36,525 rows), the result of applying the two relational selects in our WHERE clause (104 rows), and after performing the COUNT_BIG(*) group by aggregate (1 row). 
All of these are correct, but that was before cost-based optimization got involved :) Cost-based optimization When cost-based optimization starts up, the logical tree above is copied into a structure (the ‘memo’) that has one group per logical operation (roughly speaking). The logical read of the base table (LogOp_Get) ends up in group 7; the two predicates (LogOp_Select) end up in group 8 (with the details of the selections in subgroups 0-6). These two groups still have the correct cardinalities as trace flag 8608 output (initial memo contents) shows: During cost-based optimization, a rule called SelToIdxStrategy runs on group 8. It’s job is to match logical selections to indexable expressions (SARGs). It successfully matches the selections (theYear = 2013, is Weekday = 0) to the filtered index, and writes a new alternative into the memo structure. The new alternative is entered into group 8 as option 1 (option 0 was the original LogOp_Select): The new alternative is to do nothing (PhyOp_NOP = no operation), but to instead follow the new logical instructions listed below the NOP. The LogOp_GetIdx (full read of an index) goes into group 21, and the LogOp_SelectIdx (selection on an index) is placed in group 22, operating on the result of group 21. The definition of the comparison ‘the Year = 2013’ (ScaOp_Comp downwards) was already present in the memo starting at group 2, so no new memo groups are created for that. New Cardinality Estimates The new memo groups require two new cardinality estimates to be derived. First, LogOp_Idx (full read of the index) gets a predicted cardinality of 10,436. This number comes from the filtered index statistics: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH STAT_HEADER; The second new cardinality derivation is for the LogOp_SelectIdx applying the predicate (theYear = 2013). To get a number for this, the cardinality estimator uses statistics for the column ‘theYear’, producing an estimate of 365 rows (there are 365 days in 2013!): DBCC SHOW_STATISTICS (Calendar, theYear) WITH HISTOGRAM; This is where the mistake happens. Cardinality estimation should have used the filtered index statistics here, to get an estimate of 104 rows: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH HISTOGRAM; Unfortunately, the logic has lost sight of the link between the read of the filtered index (LogOp_GetIdx) in group 22, and the selection on that index (LogOp_SelectIdx) that it is deriving a cardinality estimate for, in group 21. The correct cardinality estimate (104 rows) is still present in the memo, attached to group 8, but that group now has a PhyOp_NOP implementation. Skipping over the rest of cost-based optimization (in a belated attempt at brevity) we can see the optimizer’s final output using trace flag 8607: This output shows the (incorrect, but understandable) 365 row estimate for the index range operation, and the correct 104 estimate still attached to its PhyOp_NOP. This tree still has to go through a few post-optimizer rewrites and ‘copy out’ from the memo structure into a tree suitable for the execution engine. One step in this process removes PhyOp_NOP, discarding its 104-row cardinality estimate as it does so. To finish this section on a more positive note, consider what happens if we add an OVER clause to the query aggregate. 
This isn’t intended to be a ‘fix’ of any sort, I just want to show you that the 104 estimate can survive and be used if later cardinality estimation needs it: SELECT Days = COUNT_BIG(*) OVER () FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; The estimated execution plan is: Note the 365 estimate at the Index Seek, but the 104 lives again at the Segment! We can imagine the lost predicate ‘isWeekday = 0’ as sitting between the seek and the segment in an invisible Filter operator that drops the estimate from 365 to 104. Even though the NOP group is removed after optimization (so we don’t see it in the execution plan) bear in mind that all cost-based choices were made with the 104-row memo group present, so although things look a bit odd, it shouldn’t affect the optimizer’s plan selection. I should also mention that we can work around the estimation issue by including the index’s filtering columns in the index key: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear, isWeekday) WHERE isWeekday = 0 WITH (DROP_EXISTING = ON); There are some downsides to doing this, including that changes to the isWeekday column may now require Halloween Protection, but that is unlikely to be a big problem for a static calendar table ;)  With the updated index in place, the original query produces an execution plan with the correct cardinality estimation showing at the Index Seek: That’s all for today, remember to let me know about any Switch plans you come across on a modern instance of SQL Server! Finally, here are some other posts of mine that cover other plan operators: Segment and Sequence Project Common Subexpression Spools Why Plan Operators Run Backwards Row Goals and the Top Operator Hash Match Flow Distinct Top N Sort Index Spools and Page Splits Singleton and Range Seeks Bitmaps Hash Join Performance Compute Scalar © 2013 Paul White – All Rights Reserved Twitter: @SQL_Kiwi

    Read the article

  • RequestDispatcher forward between Tomcat instances

    - by MontyBongo
    I have a scenario with a single entry-point Servlet and further Servlets, which requests are forwarded to, that undertake heavy processing. I am looking at options to distribute this load, and I would like to know whether it is possible, using Tomcat or another platform, to forward requests between Servlets sitting on different servers using a cluster-type configuration or similar. I have found some documentation on clustering Servlets and Tomcat, but none of it indicates whether Servlet request forwarding across servers is possible, from what I can see. http://java.sun.com/blueprints/guidelines/designing_enterprise_applications_2e/web-tier/web-tier5.html http://tomcat.apache.org/tomcat-5.5-doc/cluster-howto.html
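
    As far as I know, RequestDispatcher.forward cannot cross servlet-container boundaries, so the usual options are a front-end load balancer (mod_jk / mod_proxy) routing to the worker Tomcats, a messaging layer, or having the entry servlet make a real HTTP call to a worker and relay the response. A rough sketch of that last option (the worker URL and parameter handling are illustrative):

        import java.io.*;
        import java.net.*;
        import javax.servlet.*;
        import javax.servlet.http.*;

        public class EntryPointServlet extends HttpServlet {
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws ServletException, IOException {
                URL worker = new URL("http://worker1.internal:8080/heavy/process?job="
                        + URLEncoder.encode(req.getParameter("job"), "UTF-8"));

                HttpURLConnection conn = (HttpURLConnection) worker.openConnection();
                resp.setContentType(conn.getContentType());

                // relay the worker's response to the original client
                InputStream in = conn.getInputStream();
                OutputStream out = resp.getOutputStream();
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
                in.close();
            }
        }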

    Read the article

  • Where to place ClientAccessPolicy.xml for Local WCF Service?

    - by cam
    I'm trying to create a basic WCF service and Silverlight client. I've followed this tutorial: http://channel9.msdn.com/shows/Endpoint/Endpoint-Screencasts-Creating-Your-First-WCF-Client/ Since Silverlight 4 is incompatible with WSHttpBinding, I changed it to BasicHttpBinding. Unfortunately, I now keep getting this error: "An error occurred while trying to make a request to URI'**'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a cross-domain policy file and to ensure it allows SOAP-related HTTP headers to be sent." I placed clientaccesspolicy.xml in the root directory of the WCF project (which is in the same solution as the Silverlight client), but this did not solve the problem. What do I need to do?
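
    The file has to be reachable at the root of the domain the service runs on (e.g. http://localhost:1234/clientaccesspolicy.xml), not merely sitting somewhere in the project, so for a web-hosted WCF service it belongs in the root of the web project that hosts the .svc. A commonly used, development-friendly policy looks roughly like this (tighten the domain list for production):

        <?xml version="1.0" encoding="utf-8"?>
        <access-policy>
          <cross-domain-access>
            <policy>
              <allow-from http-request-headers="SOAPAction">
                <domain uri="*"/>
              </allow-from>
              <grant-to>
                <resource path="/" include-subpaths="true"/>
              </grant-to>
            </policy>
          </cross-domain-access>
        </access-policy>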

    Read the article

  • Error while configuring Sql server 2005 for full text search

    - by San
    I am trying to index a table for full-text search on SQL Server 2005. When I select Automatic change tracking and click the Next button, I get the following error:

        TITLE: Microsoft SQL Server
        This wizard will close because it encountered the following error:
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server+Management+Studio&ProdVer=9.00.4035.00&EvtSrc=Microsoft.SqlServer.Management.UI.WizardFrameworkErrorSR&EvtID=UncaughtException&LinkId=20476
        ------------------------------
        ADDITIONAL INFORMATION:
        Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
        An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
        The EXECUTE permission was denied on the object 'sp_help_category', database 'msdb', schema 'dbo'.
        The SELECT permission was denied on the object 'sysjobs_view', database 'msdb', schema 'dbo'. (Microsoft SQL Server, Error: 229)
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.4035&EvtSrc=MSSQLServer&EvtID=229&LinkId=20476
        ------------------------------
        BUTTONS: OK
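
    The last two lines point at missing rights in msdb rather than anything full-text specific. A hedged sketch of one way an administrator could unblock the wizard (substitute the real login; adding the login to msdb's SQLAgentReaderRole is a broader alternative to the narrow grants):

        USE msdb;
        GO
        CREATE USER [DOMAIN\FullTextUser] FOR LOGIN [DOMAIN\FullTextUser];
        GO
        GRANT EXECUTE ON dbo.sp_help_category TO [DOMAIN\FullTextUser];
        GRANT SELECT  ON dbo.sysjobs_view     TO [DOMAIN\FullTextUser];
        GO
        -- or: EXEC sp_addrolemember N'SQLAgentReaderRole', N'DOMAIN\FullTextUser';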

    Read the article

  • Can't disable jQuery cache

    - by robert_d
    Update I figured out that it must be caching problem but I can't turn cache off. Here is my changed script: <script src="../../Scripts/jquery-1.4.1.js" type="text/javascript"></script> <script type="text/javascript"> jQuery.ajaxSetup({ // Disable caching of AJAX responses cache: false }); jQuery("#button1").click(function (e) { window.setInterval(refreshResult, 3000); }); function refreshResult() { jQuery("#divResult").load("/Home/Refresh"); } </script> It updates part of a web page every 3 sec. It works only once after clearing web browser cache, after that it doesn't work - requests are made to /Home/Refresh without interval of 3 seconds and nothing is displayed on the web page; subsequent requests send cookie ASP.NET_SessionId=wrkx1avgvzwozcn1frsrb2yh. I am using ASP.NET MVC 2 and c#. I have a problem with jQuery, here is how my web app works Search.aspx web page which contains a form and jQuery script posts data to Search() action in Home controller after user clicks button1 button. Search.aspx: <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<GLSChecker.Models.WebGLSQuery>" %> <asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server"> Title </asp:Content> <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server"> <h2>Search</h2> <% Html.EnableClientValidation(); %> <% using (Html.BeginForm()) {%> <fieldset> <div class="editor-label"> <%: Html.LabelFor(model => model.Url) %> </div> <div class="editor-field"> <%: Html.TextBoxFor(model => model.Url, new { size = "50" } ) %> <%: Html.ValidationMessageFor(model => model.Url) %> </div> <div class="editor-label"> <%: Html.LabelFor(model => model.Location) %> </div> <div class="editor-field"> <%: Html.TextBoxFor(model => model.Location, new { size = "50" } ) %> <%: Html.ValidationMessageFor(model => model.Location) %> </div> <div class="editor-label"> <%: Html.LabelFor(model => model.KeywordLines) %> </div> <div class="editor-field"> <%: Html.TextAreaFor(model => model.KeywordLines, 10, 60, null)%> <%: Html.ValidationMessageFor(model => model.KeywordLines)%> </div> <p> <input id ="button1" type="submit" value="Search" /> </p> </fieldset> <% } %> <script src="../../Scripts/jquery-1.4.1.js" type="text/javascript"></script> <script type="text/javascript"> jQuery("#button1").click(function (e) { window.setInterval(refreshResult, 5000); }); function refreshResult() { jQuery("#divResult").load("/Home/Refresh"); } </script> <div id="divResult"> </div> </asp:Content> [HttpPost] public ActionResult Search(WebGLSQuery queryToCreate) { if (!ModelState.IsValid) return View("Search"); queryToCreate.Remote_Address = HttpContext.Request.ServerVariables["REMOTE_ADDR"]; Session["Result"] = null; SearchKeywordLines(queryToCreate); Thread.Sleep(15000); return View("Search"); }//Search() After button1 button is clicked the above script from Search.aspx web page runs. Search() action in controller runs for longer period of time. I simulate this in testing by putting Thread.Sleep(15000); in Search()action. 5 sec. after Submit button was pressed, the above jQuery script calls Refresh() action in Home controller. 
        public ActionResult Refresh()
        {
            ViewData["Result"] = DateTime.Now;
            return PartialView();
        }

    Refresh() renders this partial view:

        <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl" %>
        <%= ViewData["Result"] %>

    The problem is that in Internet Explorer 8 there is only one request to /Home/Refresh; in Firefox 3.6.3 all the requests to /Home/Refresh are made, but nothing is displayed on the web page. Another problem with Firefox is that the requests to /Home/Refresh are made every second, not every 5 seconds. I noticed that after I clear the Firefox cache, the script works well the first time button1 is pressed, but after that it doesn't work. I would be grateful for any helpful suggestions.
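
    Two things that often explain exactly these symptoms (assumptions, not a verified diagnosis): every click of button1 registers another interval, and IE happily serves the first GET response from its cache. A sketch that keeps a single timer and makes each URL unique:

        var refreshTimer = null;

        jQuery("#button1").click(function () {
            if (refreshTimer !== null) {
                window.clearInterval(refreshTimer);   // don't stack intervals on repeated clicks
            }
            refreshTimer = window.setInterval(refreshResult, 5000);
        });

        function refreshResult() {
            // the timestamp defeats the browser cache even if ajaxSetup is ignored by .load()
            jQuery("#divResult").load("/Home/Refresh?nocache=" + new Date().getTime());
        }

    Decorating the Refresh action with [OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")] is a belt-and-braces option on the server side.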

    Read the article

  • wcf data service security configuration

    - by Daniel Pratt
    I'm in the process of setting up a WCF Data Services web service and I'm trying to sort out the security configuration. Although there's quite a lot of documentation out there for configuring WCF security, a lot of it seems to be outmoded or does not apply to my scenario. Ultimately, I am planning on managing authorization of operations via change interceptors. Thus, all I really need is the simplest way to permit a client to pass credentials along with a request and to be able to authenticate those credentials against either AD or an ASP.NET membership provider (I'd much prefer the latter unless it makes things much more complicated). I'm intending to manage encryption at the transport level (i.e. HTTPS). I'm hoping that the eventual solution does not involve a huge web.config. Likewise, I'd much prefer to avoid writing custom code for the purpose of authentication.
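
    For the authorization half, here is a sketch of what the interceptor side can look like once authentication is handled in front of the service (assumptions: the service runs with ASP.NET compatibility so HttpContext carries the authenticated principal, IIS/HTTPS plus Basic or Forms authentication is mapped onto the membership provider, and the entity set, entity and role names are invented):

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class CatalogService : DataService<CatalogEntities>
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("Products", EntitySetRights.All);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }

            [ChangeInterceptor("Products")]
            public void OnChangeProduct(Product item, UpdateOperations operation)
            {
                var user = HttpContext.Current.User;   // populated by the authentication step
                if (user == null || !user.Identity.IsAuthenticated || !user.IsInRole("Editors"))
                {
                    throw new DataServiceException(401, "Not authorized to modify products.");
                }
            }
        }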

    Read the article

  • New Facebook FQL table vs. Graph API

    - by PanosJee
    Hello everyone. I just read the new User FQL table fields at http://developers.facebook.com/docs/reference/fql/user As I can see, a lot of the fields have been deprecated, such as work_history, books and movies. It is quite essential for my app to get all those fields for my user's friends in a single FQL query. If I am not wrong, the only way to do this now is to fetch those extra fields through the Graph API by requesting them separately for every friend of my user. Is there a more efficient way to do this, without so many calls? Can I subscribe to real-time updates for the requested fields of my user's friends (I do not care about the logged-in user's own data)? Thank you a lot.
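
    On the real-time updates half, the Graph API does let an application subscribe to changes in user fields via the subscriptions endpoint, though (as far as I recall from the documentation of the time) it only covers users who have authorized the app, not arbitrary friends, and only certain fields. A hedged sketch of the subscription call (the IDs, token, field list and callback URL are placeholders):

        curl -F "object=user" \
             -F "fields=work,education,movies,books" \
             -F "callback_url=https://example.com/facebook/updates" \
             -F "verify_token=SOME_TOKEN" \
             "https://graph.facebook.com/APP_ID/subscriptions?access_token=APP_ACCESS_TOKEN"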

    Read the article
