Search Results

Search found 19950 results on 798 pages for 'url scheme'.

Page 575/798

  • CodePlex Daily Summary for Tuesday, September 25, 2012

    CodePlex Daily Summary for Tuesday, September 25, 2012Popular ReleasesRawr: Rawr 5.0.0: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Coevery - Free CRM: Coevery 1.0.0.26: The zh-CN issue has been solved. We also add a project management module.VidCoder: 1.4.1 Beta: Updated to HandBrake 4971. This should fix some issues with stuck PGS subtitles. Fixed build break which prevented pre-compiled XML serializers from showing up. Fixed problem where a preset would get errantly marked as modified when re-opening the encode settings window or importing a new preset.D3 Loot Tracker: 1.3: Added the ability to reload a previous session to be able to resume it. Removed goblin detection, let's keep this an item tracking utility only. Fixed a bug with crafting sound setting not working properly. Completely re-styled the UI.JSLint for Visual Studio 2010: 1.4.0: VS2012 support is alphaBlackJumboDog: Ver5.7.2: 2012.09.23 Ver5.7.2 (1)InetTest?? (2)HTTP?????????????????100???????????Player Framework by Microsoft: Player Framework for Windows 8 (Preview 6): IMPORTANT: List of breaking changes from preview 5 Added separate samples download with .vsix dependencies instead of source dependencies Support for FreeWheel SmartXML ad responses Support for Smooth Streaming SDK DownloaderPlugins Support for VMAP and TTML polling for live scenarios Support for custom smooth streaming byte stream and scheme handlers Support for new play time and position tracking plugin Added IsLiveChanged event Added AdaptivePlugin.MaxBitrate property Add...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.8: Version: 2.5.0.8 (Milestone 8): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Mark the class DataModel as serializable. InfoMan: Minor improvements. InfoMan: Add unit tests for all modules. Othe...LogicCircuit: LogicCircuit 2.12.9.20: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionToolbars on text note dialog are more flexible now. You can select font face, size, color, and background of text you are typing. RAM now can be initialized to one of the following: random va...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.2020.421): New features: Disable a specific part of SiteMap to keep the data without displaying them in the CRM application. 
It simply comments XML part of the sitemap (thanks to rboyers for this feature request) Right click an item and click on "Disable" to disable it Items disabled are greyed and a suffix "- disabled" is added Right click an item and click on "Enable" to enable it Refresh list of web resources in the web resources pickerHigLabo: HigLabo_20120919: Add XXXAsync method to all Client class for async await pattern. (HttpClient,BoxNetClient,DropboxClient,FacebookClient,FtpClient,RssClient,SugarSyncClient,TwitterClient,WindowsLiveClient) Add all api to HigLabo.Net.Ftp project. Add strong name to all assembly. Add HttpBodyMultipartFormData to provide upload multipart form data with http protocol. Add HttpBodyFormUrlEncodedData to provide form url encoded post data with http protocol. FacebookClient,RssClient,WindowsLiveClient,BoxNetClient cl...AJAX Control Toolkit: September 2012 Release: AJAX Control Toolkit Release Notes - September 2012 Release Version 60919September 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ...Sense/Net CMS - Enterprise Content Management: SenseNet 6.1.2 Community Edition: Sense/Net 6.1.2 Community EditionMain new featuresOur current release brings a lot of bugfixes, including the resolution of js/css editing cache issues, xlsx file handling from Office, expense claim demo workspace fixes and much more. Besides fixes 6.1.2 introduces workflow start options and other minor features like a reusable Reject client button for approval scenarios and resource editor enhancements. We have also fixed an issue with our install package to bring you a flawless installation...Solution Extender for Microsoft Dynamics CRM 2011: Solution Extender (2.0.0.6): Fix a problem when serializing entity records (this fix the problem when exporting queues)Visual C++ Directories Editor: VC++ Directories 2012 Editor v1.0 ML (x32-x64): version 1.0 ML for Visual C++ 2012WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Python Tools for Visual Studio: 1.5 RC: PTVS 1.5RC Available! We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RC. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. The primary new feature for the 1.5 release is Django including Azure support! 
The http://www.djangoproject.com is a pop...Launchbar: Lanchbar 4.0.0: This application requires .NET 4.5 which you can find here: www.microsoft.com/visualstudio/downloadsAssaultCube Reloaded: 2.5.4 -: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: New logo Improved airstrike! Reset nukes...Extended WPF Toolkit: Extended WPF Toolkit - 1.7.0: Want an easier way to install the Extended WPF Toolkit?The Extended WPF Toolkit is available on Nuget. What's new in the 1.7.0 Release?New controls Zoombox Pie New features / bug fixes PropertyGrid.ShowTitle property added to allow showing/hiding the PropertyGrid title. Modifications to the PropertyGrid.EditorDefinitions collection will now automatically be applied to the PropertyGrid. Modifications to the PropertyGrid.PropertyDefinitions collection will now be reflected automaticaly...New ProjectsAffine transformations (Iterated function system): Small app for generating fractals using iterated function system.aspnet mvc store: Lite ASP.NET MVC CMSAugmented Reality: .Autocomplete: AutocompleteBCS to provide stock information in SharePoint 2013: This .Net assembly BCS external system provides live, read only data on Dow Jones 30 stocks details from MSN money webservices.Blood Alcohol Measurement Tool: Az alkalmazás kijelzi a felhasználó véralkoholszint változását az elfogyasztott alkoholos italok függvényében.Busqueda Incremental con un TEXTBOX: Hola, aqui estoy de nuevo con un aporte mas para la comunidad, en muchos foros he visto que estan buscando como hacer una busqueda incremental en un TEXTBOX.CRM 2011: Reassign or Transfer Personal Views: This Project allows CRM Administrator to quickly transfer (i.e. assign) advanced find views from one CRM user to another CRM user. ctripITSM doc: this is a documentation share for ctripITSM projectet Sprint 3: etsprint3Finance App for Windows 8: This is an WinJS Windows 8 application that computes various financial metrics.gadgets: Windows Sidebar Gargetsjprj: jprjKerosene ORM: Kerosene is a self-adaptive and configuration-less ORM library, with a SQL syntax based on C# dynamics, WCF, and Entity Framework capabilities for POCO objects.Lobster: ?? ?? ?????????.My Google Map: MyGoogleMap est un outils de génération de carte. Onestop.Contrib.CustomAdmin: Onestop.Contrib.CustomAdmin is a Theme for Orchard CMS providing for changing the admin dashboard elements such as Title and Logo.Onestop.Contrib.Disqus: Onestop.Contrib.Disqus is an advanced commenting module for Orchard CMS that uses Disqus.Onestop.Contrib.LayoutSelector: Onestop.Contrib.LayoutSelector is a simple part for switching to different versions of Layout.cshtml when editing Orchard content items.Onestop.Contrib.Navigation: Onestop.Contrib.Navigation is an advanced Menu Management system designed for Orchard CMS. 
Onestop.Contrib.Seo: Onestop.Contrib.Seo is an advanced Search Engine Optimization module for Orchard CMSOnestop.Contrib.SlideShow: Onestop.Contrib.SlideShow is an advanced module for managing slide animations on a page.pf2012: Simple HTML5 game as PF2012 electronic greeting.PHP-2012: this is for php for schoolPruebaProyecto: Carmen Asencio AmbrosioPulawJS: MVC Platform for JavaScript. Inspired by Zend FrameworkRevolution Emulator: Habbo Hotel flash emulator targeting the .NET 4.5 VM, written in C#.Ruby Rookie: This Project is for learning Ruby purposesSISLOG_Proy2: SISLOGSkyShellEx: SkyShellEx allows to sync any folder to SkyDrive via a simple ShellExtension. The sync option appears on the context menu of folders where applicable. sosoft: Sosoft Project.C# WinForm.Include Alarm Clock.sound9: sound9testddgit09242012: dTFS Agile Work Item Rollover: This is a command line utility used to rollover incomplete work from one sprint to the next. The tool has both interactive and silent modes, allowing you to seTFS Deployment Studio: Helps deploying applications built using TFS to the servers where they belong.TreeView In-place Editing in MVVM: This project demonstrates a clean way of doing the in-place editing in the WPF TreeView controlUniSoft: Teste source controlValidation Framework for .NET: Framework for validation of method paramters and return values.WarOfDev: war of developer, make your coding interesting . Waterhouse: A C# console application that takes various text inputs and converts it to Morse Code by blinking the numlock indicator.Web Package Pro: in dev

    Read the article

  • Possible to register Selenium RC's with the Hudson Selenium Grid Hub w/o the RC's being slaves in th

    - by Rodreegez
    I am trying to get Hudson to run my Ruby-based Selenium tests. I have installed the Selenium Grid plugin, but I don't want to have the RCs running as slaves in a Hudson cluster. The reason for this is I don't want to waste the next six years of my life trying to configure each of my projects in various Windows environments. Hudson currently pulls each project from Github and builds it just fine. With a regular Selenium Grid setup, I am able to edit the grid_configuration.yml file to represent the various environments I wish to test against, then pass environment variables to the rake task that runs the tests, i.e. which browser/platform to run on and the URL of the application under test -- usually a port on the hub machine running in a specific environment. In this way, the machines on which the RCs run don't need to know anything about the source code of my apps, they just need to have selenium-grid installed and have registered with the hub. Is there a way of elegantly emulating this with Hudson?
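
    A minimal sketch of the kind of rake-driven setup described above, assuming the selenium-client gem; the host and environment variable names (GRID_HUB_HOST, SELENIUM_BROWSER, APP_URL) are made up here, and the Hudson job would simply export them before calling the rake task:

        require "rubygems"
        require "selenium/client"

        # Browser/platform and application URL come from the CI job's environment,
        # so the RC machines never need to know anything about the project source.
        browser = Selenium::Client::Driver.new(
          :host    => ENV["GRID_HUB_HOST"] || "localhost",   # the grid hub, not an RC
          :port    => 4444,
          :browser => ENV["SELENIUM_BROWSER"] || "*firefox",
          :url     => ENV["APP_URL"],
          :timeout_in_second => 120)

        browser.start_new_browser_session
        # ... run the actual test steps ...
        browser.close_current_browser_session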

    Read the article

  • Row Number Transformation

    The Row Number Transformation calculates a row number for each row, and adds this as a new output column to the data flow. The row number is a sequential number, based on a seed value. Each row receives the next number in the sequence, based on the defined increment value. The final row number can be stored in a variable for later analysis, and can be used as part of a process to validate the integrity of the data movement. The Row Number transform has a variety of uses, such as generating surrogate keys, or as the basis for a data partitioning scheme when combined with the Conditional Split transformation.

    Properties
    Seed (Int32) - The first row number or seed value.
    Increment (Int32) - The value added to the previous row number to make the next row number.
    OutputVariable (String) - The name of the variable into which the final row number is written post execution. (Optional)

    The three properties have been configured to support expressions, or they can be set directly in the normal manner. Expressions on components are only visible on the hosting Data Flow task, not at the individual component level. Sometimes the data type of a property is incorrectly set when the properties are created; see the Troubleshooting section below for details on how to fix this.

    Installation
    The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. For 2005/2008 only - finally, you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Row Number transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component? We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations; this component requires a minimum of SQL Server 2005 Service Pack 1.

    Downloads
    The Row Number Transformation is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.
    Row Number Transformation for SQL Server 2005
    Row Number Transformation for SQL Server 2008
    Row Number Transformation for SQL Server 2012

    Version History
    SQL Server 2012
    Version 3.0.0.6 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008
    Version 2.0.0.5 - SQL Server 2008 release. (15 Oct 2008)
    SQL Server 2005
    Version 1.2.0.34 - Updated installer. (25 Jun 2008)
    Version 1.2.0.7 - SQL Server 2005 RTM Refresh. SP1 Compatibility Testing. Added the ability to reuse an existing column to hold the generated row number, as an alternative to the default of adding a new column to the output. (18 Jun 2006)
    Version 1.0.0.0 - Public Release for SQL Server 2005 IDW 15 June CTP (29 Aug 2005)

    Code Sample
    The following code sample demonstrates using the Data Generator Source and Row Number Transformation programmatically in a very simple package.

        Package package = new Package();
        package.Name = "Data Generator & Row Number";

        // Add the Data Flow Task
        Executable taskExecutable = package.Executables.Add("STOCK:PipelineTask");

        // Get the task host wrapper, and the Data Flow task
        TaskHost taskHost = taskExecutable as TaskHost;
        MainPipe dataFlowTask = (MainPipe)taskHost.InnerObject;

        // Add Data Generator Source
        IDTSComponentMetaData100 componentSource = dataFlowTask.ComponentMetaDataCollection.New();
        componentSource.Name = "Data Generator";
        componentSource.ComponentClassID = "Konesans.Dts.Pipeline.DataGenerator.DataGenerator, Konesans.Dts.Pipeline.DataGenerator, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b";
        CManagedComponentWrapper instanceSource = componentSource.Instantiate();
        instanceSource.ProvideComponentProperties();
        instanceSource.SetComponentProperty("RowCount", 10000);

        // Add Row Number Tx
        IDTSComponentMetaData100 componentRowNumber = dataFlowTask.ComponentMetaDataCollection.New();
        componentRowNumber.Name = "FlatFileDestination";
        componentRowNumber.ComponentClassID = "Konesans.Dts.Pipeline.RowNumberTransform.RowNumberTransform, Konesans.Dts.Pipeline.RowNumberTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b";
        CManagedComponentWrapper instanceRowNumber = componentRowNumber.Instantiate();
        instanceRowNumber.ProvideComponentProperties();
        instanceRowNumber.SetComponentProperty("Increment", 10);

        // Connect the two components together
        IDTSPath100 path = dataFlowTask.PathCollection.New();
        path.AttachPathAndPropagateNotifications(componentSource.OutputCollection[0], componentRowNumber.InputCollection[0]);

        #if DEBUG
        // Save package to disk, DEBUG only
        new Application().SaveToXml(String.Format(@"C:\Temp\{0}.dtsx", package.Name), package, null);
        #endif

        package.Execute();

        foreach (DtsError error in package.Errors)
        {
            Console.WriteLine("ErrorCode : {0}", error.ErrorCode);
            Console.WriteLine(" SubComponent : {0}", error.SubComponent);
            Console.WriteLine(" Description : {0}", error.Description);
        }

        package.Dispose();

    Troubleshooting
    Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012. If you get an error when you try to use the component along the lines of The component could not be added to the Data Flow task. Please verify that this component is properly installed.  ... The data flow object "Konesans ..." is not installed correctly on this computer, this usually indicates that the internal cache of SSIS components needs to be updated. This is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages.
    Once installation is complete you need to manually add the task to the toolbox before you will see it and be able to add it to packages - How do I install a task or transform component? Please also make sure you have installed a minimum of SP1 for SQL 2005. The IDtsPipelineEnvironmentService was added in SQL Server 2005 Service Pack 1 (SP1) (see http://support.microsoft.com/kb/916940). If you get the error Could not load type 'Microsoft.SqlServer.Dts.Design.IDtsPipelineEnvironmentService' from assembly 'Microsoft.SqlServer.Dts.Design, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'. when trying to open the user interface, it implies that your development machine has not had SP1 applied. Very occasionally we get a problem to do with the properties not being created with the correct data type. Since there is no way to programmatically define the data type of a pipeline component property, SSIS can only infer it. Whilst we set an integer value as we create the property, sometimes SSIS decides to define it as a decimal. This is often highlighted when you use a property expression against the property and get an error similar to Cannot convert System.Int32 to System.Decimal. Unfortunately this is beyond our control and there appears to be no pattern as to when this happens. If you do have more information we would be happy to hear it. To fix this issue you can manually edit the package file. In Visual Studio right-click the package file in Solution Explorer and select View Code, which will open the package as raw XML. You can now search for the properties by name or by the component name. You can then change the incorrect property data type shown below from Decimal to Int32.

        <component id="37" name="Row Number Transformation" componentClassID="{BF01D463-7089-41EE-8F05-0A6DC17CE633}" … >
            <properties>
                <property id="38" name="UserComponentTypeName" …>
                <property id="41" name="Seed" dataType="System.Int32" ...>10</property>
                <property id="42" name="Increment" dataType="System.Decimal" ...>10</property>
                ...

    If you are still having issues then contact us, but please provide as much detail as possible about the error, as well as which version of the task you are using and details of the SSIS tools installed.

    Read the article

  • Spring security access with multiple roles

    - by Evgeny Makarov
    I want to define access to some pages for users who have one of the following roles (ROLE1 or ROLE2). I'm trying to configure this in my Spring Security XML file as follows:

        <security:http entry-point-ref="restAuthenticationEntryPoint" access-decision-manager-ref="accessDecisionManager"
                       xmlns="http://www.springframework.org/schema/security" use-expressions="true">
            <!-- skipped configuration -->
            <security:intercept-url pattern="/rest/api/myUrl*" access="hasRole('ROLE1') or hasRole('ROLE2')" />
            <!-- skipped configuration -->
        </security:http>

    I've tried various forms like:

        access="hasRole('ROLE1, ROLE2')"
        access="hasRole('ROLE1', 'ROLE2')"
        access="hasAnyRole('[ROLE1', 'ROLE2]')"

    etc., but nothing seems to be working. I keep getting the exception java.lang.IllegalArgumentException: Unsupported configuration attributes: or java.lang.IllegalArgumentException: Failed to parse expression 'hasAnyRole(['ROLE1', 'ROLE2'])'. How should it be configured? Thanks
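
    For reference, the expression form below is the documented way to allow either role. If it still fails with "Unsupported configuration attributes", the custom accessDecisionManager referenced above is a likely culprit: with use-expressions="true" it needs a WebExpressionVoter rather than a plain RoleVoter. A sketch (bean names are assumptions, and exact class locations depend on the Spring Security version):

        <security:intercept-url pattern="/rest/api/myUrl*" access="hasAnyRole('ROLE1','ROLE2')" />

        <bean id="accessDecisionManager" class="org.springframework.security.access.vote.AffirmativeBased">
            <property name="decisionVoters">
                <list>
                    <!-- understands hasRole(...)/hasAnyRole(...) expressions -->
                    <bean class="org.springframework.security.web.access.expression.WebExpressionVoter" />
                    <bean class="org.springframework.security.access.vote.AuthenticatedVoter" />
                </list>
            </property>
        </bean>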

    Read the article

  • Deploying Asp.Net MVC 2 /C# 4.0 application on IIS 6

    - by Mose
    Hi, I got a problem migrating from VS.Net 2008 / MVC 1 to VS.NET 2010 (+C# 4.0) / MVC 2 The web.config has been updated, the site runs well in Cassini, but my problem now is deploying on IIS 6. I updated the web site to run using ASP.Net 4, but whatever URL I try, I always have a 404 error. It's as if the routing was not taken into account (yes, the wildcard mapping has been done). I do not understand this mess and could not google anything interesting... Thanks for your suggestions ! O.
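
    Two things worth checking, offered as hedged guesses since the symptom is just a blanket 404: after moving the app pool to ASP.NET 4, the IIS 6 wildcard mapping must point at the v4.0 aspnet_isapi.dll (under C:\Windows\Microsoft.NET\Framework\v4.0.30319\), and ASP.NET v4.0 must be set to Allowed under Web Service Extensions. If changing the wildcard map is not an option, the classic fallback is to give the routes an .aspx extension so IIS 6 hands the requests to ASP.NET anyway; a sketch with the default route names:

        public static void RegisterRoutes(RouteCollection routes)
        {
            routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

            // IIS 6 fallback: the ".aspx" token makes the URL map to the ASP.NET
            // ISAPI extension without needing a wildcard mapping.
            routes.MapRoute(
                "DefaultAspx",
                "{controller}.aspx/{action}/{id}",
                new { controller = "Home", action = "Index", id = UrlParameter.Optional }
            );
        }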

    Read the article

  • Add note model in Rails

    - by dannymcc
    Hi Everyone, I am following the 15 minute blog tutorial on RubyonRails.com: http://media.rubyonrails.org/video/rails_blog_2.mov and am stumbling into some issues. I am using the following alterations to the names in the tutorial: posts = kases, comments = notes. I have set up the models as follows:

        class Kase < ActiveRecord::Base
          validates_presence_of :jobno

          has_many :notes

          belongs_to :company # foreign key: company_id
          belongs_to :person # foreign key in join table
          belongs_to :surveyor, :class_name => "Company", :foreign_key => "appointedsurveyor_id"
          belongs_to :surveyorperson, :class_name => "Person", :foreign_key => "surveyorperson_id"

          def to_param
            jobno
          end
        end

    and...

        class Note < ActiveRecord::Base
          belongs_to :kase
        end

    The Notes controller looks like this:

        # POST /notes
        # POST /notes.xml
        def create
          @kase = Kase.find(params[:kase_id])
          @note = @kase.notes.build(params[:note])
          redirect_to @kase
        end

    and the database schema for notes looks like this:

        create_table "notes", :force => true do |t|
          t.integer  "kase_id"
          t.text     "body"
          t.datetime "created_at"
          t.datetime "updated_at"
        end

    and for kases...

        create_table "kases", :force => true do |t|
          t.string   "jobno"
          t.date     "dateinstructed"
          t.string   "clientref"
          t.string   "clientcompanyname"
          t.text     "clientcompanyaddress"
          t.string   "clientcompanyfax"
          t.string   "casehandlername"
          t.string   "casehandlertel"
          t.string   "casehandleremail"
          t.text     "casesubject"
          t.string   "transport"
          t.string   "goods"
          t.string   "claimantname"
          t.string   "claimantaddressline1"
          t.string   "claimantaddressline2"
          t.string   "claimantaddressline3"
          t.string   "claimantaddresscity"
          t.string   "claimantaddresspostcode"
          t.string   "claimantcontact"
          t.string   "claimanttel"
          t.string   "claimantmob"
          t.string   "claimantemail"
          t.string   "claimanturl"
          t.string   "lyingatlocationname"
          t.string   "lyingatlocationaddressline1"
          t.string   "lyingatlocationaddressline2"
          t.string   "lyingatlocationaddressline3"
          t.string   "lyingatlocationaddresscity"
          t.string   "lyingatlocationaddresspostcode"
          t.string   "lyingatlocationcontactname"
          t.string   "lyingattel"
          t.string   "lyingatmobile"
          t.string   "lyingatlocationurl"
          t.text     "comments"
          t.string   "invoicenumber"
          t.string   "netamount"
          t.string   "vat"
          t.string   "grossamount"
          t.date     "dateclosed"
          t.date     "datepaid"
          t.datetime "filecreated"
          t.string   "avatar_file_name"
          t.string   "avatar_content_type"
          t.integer  "avatar_file_size"
          t.datetime "avatar_updated_at"
          t.datetime "created_at"
          t.datetime "updated_at"
          t.string   "kase_status"
          t.string   "invoice_date"
          t.integer  "surveyorperson_id"
          t.integer  "appointedsurveyor_id"
          t.integer  "person_id"
          t.string   "company_id"
          t.string   "dischargeamount"
          t.string   "dishchargeheader"
          t.text     "highrisesubject"
        end

    Whenever I enter a note into the kase show view's note entry form:

        <h2>Notes</h2>
        <div id="sub-notes">
          <%= render :partial => @kase.notes %>
        </div>

        <% form_for [@kase, Note.new] do |f| %>
          <p>
            <%= f.label :body, "New Note" %><br />
            <%= f.text_area :body %>
          </p>
          <p><%= f.submit "Add Note" %></p>
        <% end %>

    partial:

        <% div_for note do %>
          <p>
            <strong>Created <%= time_ago_in_words(note.created_at) %> ago</strong><br />
            <%= h(note.body) %>
          </p>
        <% end %>

    I get the following error:

        ActiveRecord::RecordNotFound in NotesController#create
        Couldn't find Kase with ID=Test Case

    I have tried removing the def to_param jobno end from the kase model, but the same error shows. Any ideas what I'm missing? Thanks, Danny
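
    One likely cause, given the to_param override above: because Kase#to_param returns jobno, the nested form posts the job number ("Test Case") as params[:kase_id], while Kase.find does a primary-key lookup. A sketch of the create action looking the record up by jobno instead (the bang finder raises RecordNotFound if nothing matches):

        # app/controllers/notes_controller.rb
        def create
          # params[:kase_id] carries the jobno because of Kase#to_param
          @kase = Kase.find_by_jobno!(params[:kase_id])
          @note = @kase.notes.build(params[:note])
          @note.save
          redirect_to @kase
        end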

    Read the article

  • missing elements from pcap?

    - by Matthew
    When I check the attributes available to the module pcap, I expect to see something like ['DLT_AIRONET_HEADER', 'DLT_APPLE_IP_OVER_IEEE1394', 'DLT_ARCNET', 'DLT_ARCNET_LINUX', 'DLT_ATM_CLIP', 'DLT_ATM_RFC1483', 'DLT_AURORA', 'DLT_AX25', 'DLT_CHAOS', 'DLT_CISCO_IOS', 'DLT_C_HDLC', 'DLT_DOCSIS', 'DLT_ECONET', 'DLT_EN10MB', 'DLT_EN3MB', 'DLT_ENC', 'DLT_FDDI', 'DLT_FRELAY', 'DLT_IEEE802', 'DLT_IEEE802_11', 'DLT_IEEE802_11_RADIO', 'DLT_IEEE802_11_RADIO_AVS', 'DLT_IPFILTER', 'DLT_IP_OVER_FC', 'DLT_JUNIPER_ATM1', 'DLT_JUNIPER_ATM2', 'DLT_JUNIPER_ES', 'DLT_JUNIPER_GGSN', 'DLT_JUNIPER_MFR', 'DLT_JUNIPER_MLFR', 'DLT_JUNIPER_MLPPP', 'DLT_JUNIPER_MONITOR', 'DLT_JUNIPER_SERVICES', 'DLT_LINUX_IRDA', 'DLT_LINUX_SLL', 'DLT_LOOP', 'DLT_LTALK', 'DLT_NULL', 'DLT_PFLOG', 'DLT_PPP', 'DLT_PPP_BSDOS', 'DLT_PPP_ETHER', 'DLT_PPP_SERIAL', 'DLT_PRISM_HEADER', 'DLT_PRONET', 'DLT_RAW', 'DLT_RIO', 'DLT_SLIP', 'DLT_SLIP_BSDOS', 'DLT_SUNATM', 'DLT_SYMANTEC_FIREWALL', 'DLT_TZSP', '__builtins__', '__doc__', '__file__', '__name__', '_newclass', '_object', '_pcap', '_swig_getattr', '_swig_setattr', 'aton', 'dltname', 'dltvalue', 'findalldevs', 'lookupdev', 'lookupnet', 'ntoa', 'pcapObject', 'pcapObjectPtr']. Note the pcapObject entry. However, all I get when running dir(pcap) is ['DLT_ARCNET', 'DLT_AX25', 'DLT_CHAOS', 'DLT_EN10MB', 'DLT_EN3MB', 'DLT_FDDI', 'DLT_IEEE802', 'DLT_LINUX_SLL', 'DLT_LOOP', 'DLT_NULL', 'DLT_PFLOG', 'DLT_PFSYNC', 'DLT_PPP', 'DLT_PRONET', 'DLT_RAW', 'DLT_SLIP', '__author__', '__builtins__', '__copyright__', '__doc__', '__file__', '__license__', '__name__', '__url__', '__version__', 'bpf', 'dltoff', 'ex_name', 'lookupdev', 'pcap', 'sys']. Note the lack of pcapObject. Why is this? What could cause this?
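
    The two listings look like two different bindings that both install under the name pcap (the first resembles pylibpcap, which exposes pcapObject; the second resembles pypcap). A quick way to see which module is actually being imported:

        import pcap

        # Shows which installed package the "pcap" name resolves to.
        print pcap.__file__
        print getattr(pcap, '__version__', 'no __version__ attribute')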

    Read the article

  • Firefox Extension needs to get cookie from PHP redirected external page.

    - by Tyler
    I am writing a Firefox extension that interacts with a JSON server interface. I receive a URL to the server, which then redirects to the client site that provides the cookie. I need to be able to set this cookie in the user's browser without physically loading the page in the browser. Is this possible through an AJAX call? I tried using a hidden iframe, however Firefox does not seem to like a PHP redirect in the iframe. My current solution is to load the site in a second tab that never gains focus and then auto-close it when the cookie is set. This is very messy and I would prefer something more streamlined. Any thoughts?

    Read the article

  • Accessing and binding events to an ASP.NET Grid view using Jquery

    - by Sayem Ahmed
    I have an ASP.NET page which displays a text box when it loads. It takes an input number, sends it to the server through a postback, and then displays some records in a GridView. After a number is entered into the box, the server fetches some data from a database and adds records to the GridView. The grid also contains a link column whose URL is set to "#", so that the page isn't redirected when it is clicked. Now I want to bind a jQuery "click" event to that link. How can I do that? I have tried to do that myself but failed, because the link is not available when the DOM is loaded (the grid only contains rows after a number is entered through the box), and it is being modified through an ASP.NET AJAX postback.
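
    A sketch of a delegated handler that survives the partial postback, assuming the link is given a CSS class (detail-link here) in the grid's markup; .on() needs jQuery 1.7+, older versions can use .delegate() or .live() the same way:

        $(function () {
            // The handler lives on the document, so links re-created by the
            // ASP.NET AJAX postback are still covered.
            $(document).on('click', 'a.detail-link', function (e) {
                e.preventDefault();
                var row = $(this).closest('tr');
                // handle the clicked row here
            });
        });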

    Read the article

  • Encoding in python with lxml - complex solution

    - by Vojtech R.
    Hi, I need to download and parse a webpage with lxml and build UTF-8 XML output. I think a schema in pseudocode is more illustrative:

        from lxml import etree
        webfile = urllib2.urlopen(url)
        root = etree.parse(webfile.read(), parser=etree.HTMLParser(recover=True))
        txt = my_process_text(etree.tostring(root.xpath('/html/body'), encoding=utf8))
        output = etree.Element("out")
        output.text = txt
        outputfile.write(etree.tostring(output, encoding=utf8))

    So webfile can be in any encoding (lxml should handle this). The output file has to be in UTF-8. I'm not sure where to use encoding/decoding. Is this schema OK? (I can't find a good tutorial about lxml and encoding, but I can find many problems with it...) I need a robust, approved solution, so I'm asking you seniors. Many thanks
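
    A working sketch of the same flow, keeping the question's placeholders (url, my_process_text, outputfile): pass the file object to etree.parse so lxml can sniff the page's own encoding, keep the intermediate text as unicode, and encode only once when writing the final document.

        from lxml import etree
        import urllib2

        webfile = urllib2.urlopen(url)
        parser = etree.HTMLParser(recover=True)
        root = etree.parse(webfile, parser)          # pass the file object, not .read()

        body = root.xpath('/html/body')[0]
        txt = my_process_text(etree.tostring(body, encoding=unicode))  # a unicode string

        output = etree.Element("out")
        output.text = txt
        outputfile.write(etree.tostring(output, encoding='UTF-8', xml_declaration=True))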

    Read the article

  • Client Side Includes on HTML pages

    - by Roy Rico
    Previously I had thought that there were only two ways to get content from external URLs into your page: use an IFRAME or JavaScript to include it into your pages. I've just learned of a new way using the <object> tag. <object type="text/html" frameborder="0" data="http://Server/URL/"></object> I have found some content online that confirms this ability, but it doesn't talk much about features such as Accessibility and SEO of the page. Does anyone have any experience with this? Is there any information available regarding using this method and how it affects Accessibility and SEO?

    Read the article

  • Java Design Questions - Class, Function, Access Modifiers

    - by Ron
    I am a newbie to Java. I have some design questions. Say I have a crawler application that does the following:
    1. Crawls a URL and gets its content
    2. Parses the contents
    3. Displays the contents
    How do you decide between implementing a function or a class?
    -- Should the parser be a function of the crawler class, or should it be a class in itself, so it can be used by other applications as well?
    -- If it should be a class, should it be a protected or public class? How do you decide between implementing a public or protected class?
    -- If I had to create a class to generate stats from the parsed contents, for example, should that class be protected (so only the crawler class can access it) or should it be public?
    Thanks Ron
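
    One common way to slice it, sketched with illustrative names and toy logic: the parser gets its own public class so other applications can reuse it, while the stats helper stays package-private. Note that a top-level class in Java can only be public or package-private (default); protected applies only to members and nested classes.

        // Parser.java - public so it can be reused outside this application
        public class Parser {
            public String parse(String html) {
                return html.replaceAll("<[^>]*>", " ").trim();   // toy "parsing" for the sketch
            }
        }

        // StatsGenerator.java - package-private (no modifier): visible only within this package
        class StatsGenerator {
            int wordCount(String text) {
                return text.isEmpty() ? 0 : text.split("\\s+").length;
            }
        }

        // Crawler.java - composes the other two
        public class Crawler {
            private final Parser parser = new Parser();
            private final StatsGenerator stats = new StatsGenerator();

            public String crawl(String url) throws java.io.IOException {
                java.util.Scanner in = new java.util.Scanner(new java.net.URL(url).openStream(), "UTF-8");
                String html = in.useDelimiter("\\A").hasNext() ? in.next() : "";
                in.close();
                String text = parser.parse(html);
                System.out.println("words: " + stats.wordCount(text));
                return text;
            }
        }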

    Read the article

  • Strange HtmlUnit behavior (bug?)

    - by roddik
    Hello. Take a look at this:

        WebClient client = new WebClient();
        WebRequestSettings wrs = new WebRequestSettings(new URL("http://stackoverflow.com/ping/?what-the-duck?"), HttpMethod.HEAD);
        client.getPage(wrs);

    Running this code results in a FileNotFoundException being thrown, because the HTTP status code of the page is 404, and the same page is then fetched again with the GET method, with User-Agent set to Java-.... Why does it GET the page (it doesn't happen with "normal" status codes)? Is this a bug? Thanks. Here is the entire server response:

        HTTP/1.1 404 Not Found
        Cache-Control: private
        Content-Length: 7502
        Content-Type: text/html; charset=utf-8
        Server: Microsoft-IIS/7.5
        Date: Thu, 11 Feb 2010 14:12:11 GMT

    Where does it tell the client to GET something? And how can I force WebClient to ignore it? Here's a screenshot from HTTPDebugger: the problem is that I don't understand why the second request is being sent, and why it is sent with a different user agent.
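
    It doesn't explain why the follow-up GET happens, but the HtmlUnit 2.x WebClient can at least be told not to turn 4xx/5xx responses into exceptions, so the 404 response can be inspected directly; a sketch:

        WebClient client = new WebClient();
        client.setThrowExceptionOnFailingStatusCode(false);   // keep the 404 instead of throwing

        WebRequestSettings wrs = new WebRequestSettings(
                new URL("http://stackoverflow.com/ping/?what-the-duck?"), HttpMethod.HEAD);
        Page page = client.getPage(wrs);
        System.out.println(page.getWebResponse().getStatusCode());   // 404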

    Read the article

  • How do you unit test web page authorization using ASP.NET MVC?

    - by Kevin Pang
    Let's say you have a profile page that can only be accessed by the owner of that profile. This profile page is located at: User/Profile/{userID} Now, I imagine in order to prevent access to this page by other users, you could structure your UserController class's Profile function to check the current session's identity: HttpContext.Current.User.Identity.Name If the id matches the one in the url, then you proceed. Otherwise you redirect to some sort of error page. My question is how do you unit test something like this? I'm guessing that you need to use some sort of dependency injection instead of the HttpContext in the controller to do the check on, but I am unclear what the best way to do that is. Any advice would be helpful.
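
    A sketch of the usual approach with a mocking library (Moq and NUnit assumed; UserController/Profile mirror the question): instead of reading HttpContext.Current, let the action use the User property it inherits from Controller, then stub the ControllerContext in the test:

        [Test]
        public void Profile_Redirects_When_Current_User_Is_Not_The_Owner()
        {
            var controller = new UserController();

            // Stub the identity the controller sees instead of touching HttpContext.Current.
            var context = new Mock<ControllerContext>();
            context.Setup(c => c.HttpContext.User.Identity.Name).Returns("someoneElse");
            controller.ControllerContext = context.Object;

            var result = controller.Profile("ownerUserId");

            Assert.IsInstanceOf<RedirectToRouteResult>(result);   // sent away from the profile
        }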

    Read the article

  • How to embed CGI in HTML

    - by neversaint
    I have no problem executing a CGI file under the normal URL like this: http://www.myhost.com/mydir/cgi-bin/test.cgi However, when I try to embed it into an HTML file (called index.html) like this:

        <HTML>
        <BODY>
        <P>Here's the output from my program:
        <FORM ACTION="/var/www/mydir/cgi-bin/test.cgi" METHOD=POST>
        </FORM>
        </P>
        </BODY>
        </HTML>

    the CGI doesn't get executed when I open: http://www.myhost.com/mydir/index.html The CGI file (test.cgi) simply looks like this:

        #!/usr/bin/perl -wT
        use CGI::Carp qw(fatalsToBrowser);
        print "Test cgi!\n";

    What's the right way to do it?
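
    Two things stand out, offered as a sketch rather than a definitive fix: the form's ACTION has to be the URL path the web server maps to the script (not its filesystem path), and the script has to print a Content-Type header before any other output. If the goal is to inline the CGI output into the page rather than show it after a form submit, that is what server-side includes are for (requires SSI to be enabled, e.g. a .shtml extension or an Apache mod_include setup).

        <!-- index.html: point the form at the URL, not /var/www/... -->
        <FORM ACTION="/mydir/cgi-bin/test.cgi" METHOD="POST">
          <INPUT TYPE="submit" VALUE="Run test.cgi">
        </FORM>

        <!-- or, in an SSI-enabled page (often .shtml), inline the output directly -->
        <!--#include virtual="/mydir/cgi-bin/test.cgi" -->

        #!/usr/bin/perl -wT
        # test.cgi must send a header before its body
        use CGI::Carp qw(fatalsToBrowser);
        print "Content-type: text/html\n\n";
        print "Test cgi!\n";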

    Read the article

  • Lightbox for embeddable JavaScript widget? Like Feedback tabs for UserVoice/GetSatisfaction

    - by Eliot Sykes
    There are so many lightboxes to choose from; I'm looking for a very lightweight one to use in an embedded JavaScript widget that would be used on a number of different web sites. This would work in a similar way to the GetSatisfaction/UserVoice feedback tab. Here are the requirements for the lightbox:
    - Very small JavaScript download (animation not needed)
    - Self-contained, not dependent on any libraries such as jQuery, etc.
    - Works in major browsers
    - Lightbox displays HTML content from a given URL
    - Close button (like GetSatisfaction or UserVoice)
    - Dims background
    - Avoids JavaScript namespace conflicts (or can easily be made to avoid them)
    - CSS styling of lightbox does not interfere with site styling
    Have you used an existing lightbox script for this same purpose with similar requirements? Did you roll your own? Insights welcome! Thanks, Eliot

    Read the article

  • Search Route in ASP.NET MVC

    - by olst
    Hi. I have a simple search form in my master page and a search controller and view. I'm trying to get the following route for the string search term "myterm" (for example): root/search/myterm

    The form in the master page:

        <% using (Html.BeginForm("SearchResults", "Search", FormMethod.Post, new { id = "search_form" })) { %>
            <input name="searchTerm" type="text" class="textfield" />
            <input name="search" type="submit" value="search" class="button" />
        <%} %>

    The controller action:

        public ActionResult SearchResults(string searchTerm){...}

    The route I'm using:

        routes.MapRoute(
            "Search",
            "search/{term}",
            new { controller = "Search", action = "SearchResults", term = (string)null }
        );

        routes.MapRoute(
            "Default",
            "{controller}/{action}",
            new { controller = "Home", action = "Index" }
        );

    I'm always getting the URL "root/search" without the search term, no matter what search term I enter. Thanks.
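
    Two details stand out: the route token is {term} while the action parameter is searchTerm, so they never line up, and a POSTed form field never becomes part of the URL on its own. A common sketch is to accept the POST, then redirect to the GET form of the URL so the "Search" route can build root/search/myterm:

        [HttpPost]
        public ActionResult SearchResults(string searchTerm)
        {
            // Put the posted value into the URL by bouncing to the GET action.
            return RedirectToRoute("Search", new { term = searchTerm });
        }

        [HttpGet, ActionName("SearchResults")]
        public ActionResult SearchResultsGet(string term)   // name matches the {term} token
        {
            // run the search with "term" here
            return View("SearchResults");
        }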

    Read the article

  • HMAC SHA1 ColdFusion

    - by Chris
    Please help! I have been pulling out my hair over this one. :) I have a site that I need to HMAC SHA1 for authentication. It currently works with another language but now I need to move it to ColdFusion. For the life of me I cannot get the strings to match. Any assistance would be much appreciated. Data: https%3A%2F%2Fwww%2Etestwebsite%2Ecom%3Fid%3D5447 Key: 265D5C01D1B4C8FA28DC55C113B4D21005BB2B348859F674977B24E0F37C81B05FAE85FB75EA9CF53ABB9A174C59D98C7A61E2985026D2AA70AE4452A6E3F2F9 Correct answer: WJd%2BKxmFxGWdbw4xQJZXd3%2FHkFQ%3d My answer: knIVr6wIt6%2Fl7mBJPTTbwQoTIb8%3d Both are Base64 encoded and then URL encoded.
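
    ColdFusion 9 has no built-in hmac(), so the usual route is the JVM's crypto classes. Whether the result matches the other platform depends on two guesses flagged in the comments: that the hex key should be fed to the HMAC as raw bytes, and that the string being signed is exactly the URL-encoded data shown above; a mismatch in either is the usual reason two platforms produce different signatures. A sketch:

        <cfscript>
            data   = "https%3A%2F%2Fwww%2Etestwebsite%2Ecom%3Fid%3D5447";   // as in the question
            keyHex = "265D5C01D1B4C8FA28DC55C113B4D21005BB2B348859F674977B24E0F37C81B05FAE85FB75EA9CF53ABB9A174C59D98C7A61E2985026D2AA70AE4452A6E3F2F9";

            // Assumption: the key is hex-encoded bytes; if the other platform uses the
            // literal ASCII string as the key, use charsetDecode(keyHex, "utf-8") instead.
            keySpec = createObject("java", "javax.crypto.spec.SecretKeySpec")
                          .init(binaryDecode(keyHex, "hex"), "HmacSHA1");
            mac = createObject("java", "javax.crypto.Mac").getInstance("HmacSHA1");
            mac.init(keySpec);

            signature = toBase64(mac.doFinal(charsetDecode(data, "utf-8")));
            result    = urlEncodedFormat(signature);   // compare this against the expected value
        </cfscript>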

    Read the article

  • Is it possible to tell IIS 7 to process the request queue in parallel?

    - by Uwe Keim
    Currently we are developing an ASMX, ASP.NET 2.0, IIS 7 web service that does some calculations (and returns a dynamically generated document) and will take approx. 60 seconds to run. Since we have a big machine with multiple cores and lots of RAM, I expected that IIS tries its best to route the requests that arrive in its request queue to all available threads of the app pool's thread pool. But we experience quite the opposite: when we issue requests to the ASMX web service URL from multiple different clients, IIS seems to process these requests serially. I.e. request 1 arrives and is processed, then request 2 is processed, then request 3, etc. Question: Is it possible (without changing the C# code of the web service) to configure IIS to process requests in parallel, if enough threads are available? If yes: how should I do it? If no: any workarounds/tips? Thanks Uwe
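
    Without touching the calculation itself, one thing to rule out is ASP.NET session locking: requests that share a session cookie and have read-write session access are deliberately serialized. If the test clients happen to reuse a session (or the service enables it), marking the web method sessionless removes that lock; GenerateDocument/BuildDocument below are placeholder names:

        [WebMethod(EnableSession = false)]   // no session means no per-session serialization
        public byte[] GenerateDocument(string parameters)
        {
            return BuildDocument(parameters);   // the existing 60-second calculation
        }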

    Read the article

  • How do you position a background image inside a <div>?

    - by Giffyguy
    My code currently looks like this: <div style="position: fixed; width: 35.25%; height: 6.75%; left: 0%; top: 4.625%; right: 64.75%; bottom: 88.625%; color: #D1E231; text-align: center; background-color: #666666; background-image: url('FleurTR.png'); background-position: right top;"> <div> The <div> shows up just fine, with the grey background color, but the background image won't show up at all. What am I missing here? There's no reason I should have to specify background-attachment or background-repeat, right? (I don't want it to repeat.)

    Read the article

  • Windows Azure: Major Updates for Mobile Backend Development

    - by ScottGu
    This week we released some great updates to Windows Azure that make it significantly easier to develop mobile applications that use the cloud. These new capabilities include: Mobile Services: Custom API support Mobile Services: Git Source Control support Mobile Services: Node.js NPM Module support Mobile Services: A .NET API via NuGet Mobile Services and Web Sites: Free 20MB SQL Database Option for Mobile Services and Web Sites Mobile Notification Hubs: Android Broadcast Push Notification Support All of these improvements are now available to use immediately (note: some are still in preview).  Below are more details about them. Mobile Services: Custom APIs, Git Source Control, and NuGet Windows Azure Mobile Services provides the ability to easily stand up a mobile backend that can be used to support your Windows 8, Windows Phone, iOS, Android and HTML5 client applications.  Starting with the first preview we supported the ability to easily extend your data backend logic with server side scripting that executes as part of client-side CRUD operations against your cloud back data tables. With today’s update we are extending this support even further and introducing the ability for you to also create and expose Custom APIs from your Mobile Service backend, and easily publish them to your Mobile clients without having to associate them with a data table. This capability enables a whole set of new scenarios – including the ability to work with data sources other than SQL Databases (for example: Table Services or MongoDB), broker calls to 3rd party APIs, integrate with Windows Azure Queues or Service Bus, work with custom non-JSON payloads (e.g. Windows Periodic Notifications), route client requests to services back on-premises (e.g. with the new Windows Azure BizTalk Services), or simply implement functionality that doesn’t correspond to a database operation.  The custom APIs can be written in server-side JavaScript (using Node.js) and can use Node’s NPM packages.  We will also be adding support for custom APIs written using .NET in the future as well. Creating a Custom API Adding a custom API to an existing Mobile Service is super easy.  Using the Windows Azure Management Portal you can now simply click the new “API” tab with your Mobile Service, and then click the “Create a Custom API” button to create a new Custom API within it: Give the API whatever name you want to expose, and then choose the security permissions you’d like to apply to the HTTP methods you expose within it.  You can easily lock down the HTTP verbs to your Custom API to be available to anyone, only those who have a valid application key, only authenticated users, or administrators.  Mobile Services will then enforce these permissions without you having to write any code: When you click the ok button you’ll see the new API show up in the API list.  Selecting it will enable you to edit the default script that contains some placeholder functionality: Today’s release enables Custom APIs to be written using Node.js (we will support writing Custom APIs in .NET as well in a future release), and the Custom API programming model follows the Node.js convention for modules, which is to export functions to handle HTTP requests. The default script above exposes functionality for an HTTP POST request. To support a GET, simply change the export statement accordingly.  
Below is an example of some code for reading and returning data from Windows Azure Table Storage using the Azure Node API: After saving the changes, you can now call this API from any Mobile Service client application (including Windows 8, Windows Phone, iOS, Android or HTML5 with CORS). Below is the code for how you could invoke the API asynchronously from a Windows Store application using .NET and the new InvokeApiAsync method, and data-bind the results to control within your XAML:     private async void RefreshTodoItems() {         var results = await App.MobileService.InvokeApiAsync<List<TodoItem>>("todos", HttpMethod.Get, parameters: null);         ListItems.ItemsSource = new ObservableCollection<TodoItem>(results);     }    Integrating authentication and authorization with Custom APIs is really easy with Mobile Services. Just like with data requests, custom API requests enjoy the same built-in authentication and authorization support of Mobile Services (including integration with Microsoft ID, Google, Facebook and Twitter authentication providers), and it also enables you to easily integrate your Custom API code with other Mobile Service capabilities like push notifications, logging, SQL, etc. Check out our new tutorials to learn more about to use new Custom API support, and starting adding them to your app today. Mobile Services: Git Source Control Support Today’s Mobile Services update also enables source control integration with Git.  The new source control support provides a Git repository as part your Mobile Service, and it includes all of your existing Mobile Service scripts and permissions. You can clone that git repository on your local machine, make changes to any of your scripts, and then easily deploy the mobile service to production using Git. This enables a really great developer workflow that works on any developer machine (Windows, Mac and Linux). To use the new support, navigate to the dashboard for your mobile service and select the Set up source control link: If this is your first time enabling Git within Windows Azure, you will be prompted to enter the credentials you want to use to access the repository: Once you configure this, you can switch to the configure tab of your Mobile Service and you will see a Git URL you can use to use your repository: You can use this URL to clone the repository locally from your favorite command line: > git clone https://scottgutodo.scm.azure-mobile.net/ScottGuToDo.git Below is the directory structure of the repository: As you can see, the repository contains a service folder with several subfolders. Custom API scripts and associated permissions appear under the api folder as .js and .json files respectively (the .json files persist a JSON representation of the security settings for your endpoints). Similarly, table scripts and table permissions appear as .js and .json files, but since table scripts are separate per CRUD operation, they follow the naming convention of <tablename>.<operationname>.js. Finally, scheduled job scripts appear in the scheduler folder, and the shared folder is provided as a convenient location for you to store code shared by multiple scripts and a few miscellaneous things such as the APNS feedback script. 
Lets modify the table script todos.js file so that we have slightly better error handling when an exception occurs when we query our Table service: todos.js tableService.queryEntities(query, function(error, todoItems){     if (error) {         console.error("Error querying table: " + error);         response.send(500);     } else {         response.send(200, todoItems);     }        }); Save these changes, and now back in the command line prompt commit the changes and push them to the Mobile Services: > git add . > git commit –m "better error handling in todos.js" > git push Once deployment of the changes is complete, they will take effect immediately, and you will also see the changes be reflected in the portal: With the new Source Control feature, we’re making it really easy for you to edit your mobile service locally and push changes in an atomic fashion without sacrificing ease of use in the Windows Azure Portal. Mobile Services: NPM Module Support The new Mobile Services source control support also allows you to add any Node.js module you need in the scripts beyond the fixed set provided by Mobile Services. For example, you can easily switch to use Mongo instead of Windows Azure table in our example above. Set up Mongo DB by either purchasing a MongoLab subscription (which provides MongoDB as a Service) via the Windows Azure Store or set it up yourself on a Virtual Machine (either Windows or Linux). Then go the service folder of your local git repository and run the following command: > npm install mongoose This will add the Mongoose module to your Mobile Service scripts.  After that you can use and reference the Mongoose module in your custom API scripts to access your Mongo database: var mongoose = require('mongoose'); var schema = mongoose.Schema({ text: String, completed: Boolean });   exports.get = function (request, response) {     mongoose.connect('<your Mongo connection string> ');     TodoItemModel = mongoose.model('todoitem', schema);     TodoItemModel.find(function (err, items) {         if (err) {             console.log('error:' + err);             return response.send(500);         }         response.send(200, items);     }); }; Don’t forget to push your changes to your mobile service once you are done > git add . > git commit –m "Switched to use Mongo Labs" > git push Now our Mobile Service app is using Mongo DB! Note, with today’s update usage of custom Node.js modules is limited to Custom API scripts only. We will enable it in all scripts (including data and custom CRON tasks) shortly. New Mobile Services NuGet package, including .NET 4.5 support A few months ago we announced a new pre-release version of the Mobile Services client SDK based on portable class libraries (PCL). Today, we are excited to announce that this new library is now a stable .NET client SDK for mobile services and is no longer a pre-release package. Today’s update includes full support for Windows Store, Windows Phone 7.x, and .NET 4.5, which allows developers to use Mobile Services from ASP.NET or WPF applications. You can install and use this package today via NuGet. Mobile Services and Web Sites: Free 20MB Database for Mobile Services and Web Sites Starting today, every customer of Windows Azure gets one Free 20MB database to use for 12 months free (for both dev/test and production) with Web Sites and Mobile Services. 
When creating a Mobile Service or a Web Site, simply chose the new “Create a new Free 20MB database” option to take advantage of it: You can use this free SQL Database together with the 10 free Web Sites and 10 free Mobile Services you get with your Windows Azure subscription, or from any other Windows Azure VM or Cloud Service. Notification Hubs: Android Broadcast Push Notification Support Earlier this year, we introduced a new capability in Windows Azure for sending broadcast push notifications at high scale: Notification Hubs. In the initial preview of Notification Hubs you could use this support with both iOS and Windows devices.  Today we’re excited to announce new Notification Hubs support for sending push notifications to Android devices as well. Push notifications are a vital component of mobile applications.  They are critical not only in consumer apps, where they are used to increase app engagement and usage, but also in enterprise apps where up-to-date information increases employee responsiveness to business events.  You can use Notification Hubs to send push notifications to devices from any type of app (a Mobile Service, Web Site, Cloud Service or Virtual Machine). Notification Hubs provide you with the following capabilities: Cross-platform Push Notifications Support. Notification Hubs provide a common API to send push notifications to iOS, Android, or Windows Store at once.  Your app can send notifications in platform specific formats or in a platform-independent way.  Efficient Multicast. Notification Hubs are optimized to enable push notification broadcast to thousands or millions of devices with low latency.  Your server back-end can fire one message into a Notification Hub, and millions of push notifications can automatically be delivered to your users.  Devices and apps can specify a number of per-user tags when registering with a Notification Hub. These tags do not need to be pre-provisioned or disposed, and provide a very easy way to send filtered notifications to an infinite number of users/devices with a single API call.   Extreme Scale. Notification Hubs enable you to reach millions of devices without you having to re-architect or shard your application.  The pub/sub routing mechanism allows you to broadcast notifications in a super-efficient way.  This makes it incredibly easy to route and deliver notification messages to millions of users without having to build your own routing infrastructure. Usable from any Backend App. Notification Hubs can be easily integrated into any back-end server app, whether it is a Mobile Service, a Web Site, a Cloud Service or an IAAS VM. It is easy to configure Notification Hubs to send push notifications to Android. 
Create a new Notification Hub within the Windows Azure Management Portal (New->App Services->Service Bus->Notification Hub): Then register for Google Cloud Messaging using https://code.google.com/apis/console and obtain your API key, then simply paste that key on the Configure tab of your Notification Hub management page under the Google Cloud Messaging Settings: Then just add code to the OnCreate method of your Android app’s MainActivity class to register the device with Notification Hubs: gcm = GoogleCloudMessaging.getInstance(this); String connectionString = "<your listen access connection string>"; hub = new NotificationHub("<your notification hub name>", connectionString, this); String regid = gcm.register(SENDER_ID); hub.register(regid, "myTag"); Now you can broadcast notification from your .NET backend (or Node, Java, or PHP) to any Windows Store, Android, or iOS device registered for “myTag” tag via a single API call (you can literally broadcast messages to millions of clients you have registered with just one API call): var hubClient = NotificationHubClient.CreateClientFromConnectionString(                   “<your connection string with full access>”,                   "<your notification hub name>"); hubClient.SendGcmNativeNotification("{ 'data' : {'msg' : 'Hello from Windows Azure!' } }", "myTag”); Notification Hubs provide an extremely scalable, cross-platform, push notification infrastructure that enables you to efficiently route push notification messages to millions of mobile users and devices.  It will make enabling your push notification logic significantly simpler and more scalable, and allow you to build even better apps with it. Learn more about Notification Hubs here on MSDN . Summary The above features are now live and available to start using immediately (note: some of the services are still in preview).  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using them today.  Visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Create signed urls for CloudFront with Ruby

    - by wiseleyb
    History:
    1. I created a key and pem file on Amazon.
    2. I created a private bucket.
    3. I created a public distribution and used the origin id to connect to the private bucket: works.
    4. I created a private distribution and connected it the same as #3 - now I get access denied: expected.
    I'm having a really hard time generating a URL that will work. I've been trying to follow the directions described here: http://docs.amazonwebservices.com/AmazonCloudFront/latest/DeveloperGuide/index.html?PrivateContent.html This is what I've got so far... it doesn't work though - still getting access denied:

        def url_safe(s)
          s.gsub('+','-').gsub('=','_').gsub('/','~').gsub(/\n/,'').gsub(' ','')
        end

        def policy_for_resource(resource, expires = Time.now + 1.hour)
          %({"Statement":[{"Resource":"#{resource}","Condition":{"DateLessThan":{"AWS:EpochTime":#{expires.to_i}}}}]})
        end

        def signature_for_resource(resource, key_id, private_key_file_name, expires = Time.now + 1.hour)
          policy = url_safe(policy_for_resource(resource, expires))
          key = OpenSSL::PKey::RSA.new(File.readlines(private_key_file_name).join(""))
          url_safe(Base64.encode64(key.sign(OpenSSL::Digest::SHA1.new, (policy))))
        end

        def expiring_url_for_private_resource(resource, key_id, private_key_file_name, expires = Time.now + 1.hour)
          sig = signature_for_resource(resource, key_id, private_key_file_name, expires)
          "#{resource}?Expires=#{expires.to_i}&Signature=#{sig}&Key-Pair-Id=#{key_id}"
        end

        resource = "http://d27ss180g8tp83.cloudfront.net/iwantu.jpeg"
        key_id = "APKAIS6OBYQ253QOURZA"
        pk_file = "doc/pk-APKAIS6OBYQ253QOURZA.pem"

        puts expiring_url_for_private_resource(resource, key_id, pk_file)

    Can anyone tell me what I'm doing wrong here?
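
    One thing that stands out in signature_for_resource above: the policy is run through url_safe before signing, so the '/' characters in the resource URL are turned into '~' and the signed string no longer matches the policy CloudFront reconstructs from the URL. A sketch of the method signing the raw policy instead (only the finished signature gets the URL-safe treatment), keeping the other helpers as they are:

        def signature_for_resource(resource, key_id, private_key_file_name, expires = Time.now + 1.hour)
          policy = policy_for_resource(resource, expires)                # sign the raw JSON
          key = OpenSSL::PKey::RSA.new(File.read(private_key_file_name))
          url_safe(Base64.encode64(key.sign(OpenSSL::Digest::SHA1.new, policy)))
        end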

    Read the article

  • How do I determine if truncation is being applied in my style through JS?

    - by Avry
    I am applying truncation using CSS styles:

        .yui-skin-sam td:not(.yui-dt-editable) .yui-dt-liner {
            white-space: nowrap;
            overflow: hidden;
            text-overflow: ellipsis;
            -ms-text-overflow: ellipsis;
            -o-text-overflow: ellipsis;
            -moz-binding: url('ellipsis.xml#ellipsis');
        }

        .yui-skin-sam td[class~=yui-dt-editable] .yui-dt-liner {
            white-space: nowrap;
            overflow: hidden;
            text-overflow: ellipsis;
            -ms-text-overflow: ellipsis;
            -o-text-overflow: ellipsis;
        }

    (Sidenote: I'm not sure if this is the best way to write my CSS. This is a Firefox-specific workaround, since truncation on Firefox only sort-of works.) I want a tool-tip to appear over text that is truncated. How do I detect if text is truncated so that I can display a tool-tip?
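
    Ellipsis truncation isn't reported by CSS itself, but it can be detected from script by comparing the element's scroll width with its visible width. A sketch that adds a native browser tooltip (title) only to the truncated cells, with the selector taken from the styles above:

        function isTruncated(el) {
            // When text-overflow kicks in, the full content is wider than the visible box.
            return el.scrollWidth > el.clientWidth;
        }

        var cells = document.querySelectorAll('.yui-dt-liner');
        for (var i = 0; i < cells.length; i++) {
            if (isTruncated(cells[i])) {
                cells[i].title = cells[i].textContent || cells[i].innerText;
            }
        }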

    Read the article

  • Serializing an object into the body of a WCF request using webHttpBinding

    - by Bert
    I have a WCF service exposed with a webHttpBinding endpoint.

        [OperationContract(IsOneWay = true)]
        [WebInvoke(Method = "POST", RequestFormat = WebMessageFormat.Json, BodyStyle = WebMessageBodyStyle.Bare,
            UriTemplate = "/?action=DoSomething&v1={value1}&v2={value2}")]
        void DoSomething(string value1, string value2, MySimpleObject value3);

    In theory, if I call this, the first two parameters (value1 & value2) are taken from the URI and the final one (value3) should be deserialized from the body of the request. Assuming I am using JSON as the RequestFormat, what is the best way of serialising an instance of MySimpleObject into the body of the request before I send it? This, for instance, does not seem to work:

        HttpWebRequest sendRequest = (HttpWebRequest)WebRequest.Create(url);
        sendRequest.ContentType = "application/json";
        sendRequest.Method = "POST";

        using (var sendRequestStream = sendRequest.GetRequestStream())
        {
            DataContractJsonSerializer jsonSerializer = new DataContractJsonSerializer(typeof(MySimpleObject));
            jsonSerializer.WriteObject(sendRequestStream, obj);
            sendRequestStream.Close();
        }

        sendRequest.GetResponse().Close();
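
    A small debugging sketch rather than a definitive fix: serialize the object to a string first so the JSON that actually goes on the wire can be compared with what the service expects (the property names must match the [DataMember] names on MySimpleObject, and the request URL must really hit that UriTemplate). Needs System.IO and System.Text:

        string json;
        var serializer = new DataContractJsonSerializer(typeof(MySimpleObject));
        using (var ms = new MemoryStream())
        {
            serializer.WriteObject(ms, obj);
            json = Encoding.UTF8.GetString(ms.ToArray());   // inspect/log this value
        }

        byte[] body = Encoding.UTF8.GetBytes(json);
        sendRequest.ContentLength = body.Length;
        using (var stream = sendRequest.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }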

    Read the article

  • Quick question about PayPal IPN Security

    - by Alix Axel
    PayPal IPN sends a POST request with a variable number of fields to the notify URL. In order to confirm that the POST request is legit, we need to resubmit the same request along with an additional cmd=_notify-validate field to PayPal, which then replies SUCCESS or FAILURE. My question is: why do we need to resend the request to PayPal? Wouldn't something like this work?

        if (preg_match('~^(?:.+[.])?paypal[.]com$~i', gethostbyaddr($_SERVER['REMOTE_ADDR'])) > 0)
        {
            // request came from PayPal, it's legit.
        }

    Iff we can trust the server to correctly resolve IPs, I assume we can trust PayPal POST requests, no?
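
    If you do go down the reverse-DNS road, a plain PTR lookup isn't enough on its own, because whoever controls the reverse zone for an IP can make it resolve to anything ending in paypal.com; forward-confirming the name is the usual hardening. A sketch (the address lives in $_SERVER['REMOTE_ADDR']):

        function is_paypal_ip($ip)
        {
            $host = gethostbyaddr($ip);
            if (!preg_match('~(^|\.)paypal\.com$~i', $host)) {
                return false;
            }
            // Forward-confirmed reverse DNS: the name must resolve back to the same IP.
            $addresses = gethostbynamel($host);
            return is_array($addresses) && in_array($ip, $addresses, true);
        }

        // usage: is_paypal_ip($_SERVER['REMOTE_ADDR'])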

    Read the article
