Search Results

Search found 16923 results on 677 pages for 'facebook graph api'.


  • CodePlex Daily Summary for Wednesday, November 06, 2013

    Popular Releases
    - Win_8 (Devel Studio 2/3): Win8 0.8 + Sample (.dvs): 1. Added ability to load icons. 2. Fixed bugs related to obtaining names of shapes. 3. Fixed a memory leak. 4. Fixed some defects. 5. Optimized code.
    - NWTCompiler: NWTCompiler v2.4.0: This version fixes a bug with Pinyin and adds support for the 2013 English NWT.
    - ConEmu - Windows console with tabs: ConEmu 131105 [Alpha]: ConEmu developer build, x86 and x64 versions. Written in C++, no additional packages required. Run "ConEmu.exe" or "ConEmu64.exe". Some useful information can be found at http://superuser.com/questions/tagged/conemu, http://code.google.com/p/conemu-maximus5/wiki/ConEmuFAQ and http://code.google.com/p/conemu-maximus5/wiki/TableOfContents. To use ConEmu in portable mode, just create an empty "ConEmu.xml" file next to "ConEmu.exe".
    - CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.9.0: Implemented Recent Scripts list. Added checking for plugin updates from the About box. Multiple formatting improvements/fixes. Implemented selection of the CLR version when preparing the distribution package. Added a project panel button for showing the plugin shortcuts list. Added a 'What's New?' panel. Fixed an auto-formatting scrolling artifact. Implemented navigation to the "logical" file (vs. the auto-generated one) from the output panel. To avoid the DLLs getting locked by the OS, use the MSI file for installation.
    - Home Access Plus+: v9.7: Updated: JSON.net. Fixed: issue with the Windows 8 app. Added: Windows 8.1 app. Added: Win: self-signed HAP+ install support. Added: Win: delete file support. Added: timeout for the Logon Tracker. Removed: error dialogs on the User Card. Fixed: green line showing over the booking form. Note: a web.config file update is required.
    - WPF Extended DataGrid: WPF Extended DataGrid 2.0.0.10 binaries: Row summaries are now updated whenever the autofilter value is modified.
    - Community Forums NNTP bridge: Community Forums NNTP Bridge V55 (LiveConnect): This is a build which can be used with the new LiveConnect authentication and the MSDN forums. It fixes a bug where authentication stopped working after 1 hour. A log file is stored in "%AppData%\Community\CommunityForumsNNTPServer". If you have any problems, please feel free to send me the file "LogFile.txt" or attach it to an issue.
    - xUnit.net - Unit testing framework for C# and .NET (a successor to NUnit): xUnit.net Visual Studio Runner: A placeholder for downloading Visual Studio runner VSIX files, in case the Gallery is down (or you want to downgrade to an older version).
    - Social Network Importer for NodeXL: SocialNetImporter (v.1.9.1): This new version includes: the "Include me" option is back; fixed the login bug reported lately.
    - VeraCrypt: VeraCrypt version 1.0c: Changes between 1.0b and 1.0c (11 November 2013): set the minimum required version correctly in the volume header (this value must always follow the program version after any major changes). This also solves the hidden volume issue.
    - Captcha MVC: Captcha MVC 2.5: v 2.5: Added support for MVC 5. The DefaultCaptchaManager no longer throws an error if the captcha values were entered incorrectly. Minor changes. v 2.4.1: Fixed issues with deleting incorrect values of the captcha token in the SessionStorageProvider, which could leave the captcha not working with that provider. Minor changes. v 2.4: Changed the IIntelligencePolicy interface, added ICaptchaManager as a parameter for all methods. Improved font size ...
    - Role Based Views in Microsoft Dynamics CRM 2011: Role Based Views in CRM 2011 - 1.0.0.0: Set the default view for a user for a particular entity based on security role. Hide/show views for a user for a particular entity based on his security role. Choose the preferred role for a user for view configuration when the user has more than one security role in the system. Ability to exclude/include a user from view configuration as per business requirements.
    - Duplica: duplica 0.2.498: This is the first stable release.
    - DNN Blog: 06.00.01: This is the first bugfix release of the new v6 blog module. Changes: added some robustness in the v5-v6 scripts to cater for some rare upgrade scenarios; changed the name of the module definition to avoid a clash with Evoq Social; added a sitemap provider.
    - Stock Track: Version 1.2 Stable: Overhaul and re-think of the user interface in normal mode. Added stock history view in normal mode. Allows the user to enter orders in normal mode. Allows advanced users to run database queries within the program. Improved sales statistics feature, able to calculate against a single category. Updated database script file. Now compatible with lower versions of SQL Server.
    - VG-Ripper & PG-Ripper: VG-Ripper 2.9.50: NEW: added support for "ImageHostHQ.com", "ImgMoney.net", "ImgSavy.com" and "PixTreat.com" links. Bug fixes.
    - VidCoder: 1.5.11 Beta: Added Encode Details window; exposes elapsed time, ETA, current and average FPS, running file size, current pass and pass progress. Open it by going to Windows -> Encode Details while an encode is running. The subtitle dialog now disables the "Burn In" checkbox when it is either unavailable or the only option, and disables "Forced Only" when the subtitle type doesn't support the "Forced" flag. Updated HandBrake core to SVN 5872. Fixed crash in the preview window when a source fil...
    - Wsus Package Publisher: Release v1.3.1311.02: Added three new actions in Custom Updates: work with files (copy, delete, rename), work with folders (add, delete, rename) and work with registry keys (add, delete, rename). Fixed a bug where, after resigning an update, the display was not refreshed. Modified the way WPP sorts rows in 'Updates Detail Viewer' and 'Computer List Viewer' so that dates are correctly sorted. Added a tab in the settings form to set proxy settings when WPP needs to go on the Internet. Fixed a bug where 'Manage Catalogs Subsc...
    - uComponents: uComponents v6.0.0: This release of uComponents will compile against and support the new API in Umbraco v6.1.0. What's new in uComponents v6.0.0? New DataTypes: Image Point, XML DropDownList, XPath Templatable List. New features / resolved issues: the following work items have been implemented and/or resolved: 14781, 14805, 14808, 14818, 14854, 14827, 14868, 14859, 14790, 14853, 14790 (DataType Grid), 14788, 14810, 14873, 14833, 14864, 14855 / 14860, 14816, 14823, drag & drop support for rows, Su...
    - SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.1: New features: added option "Limit to current basket subtotal" to the HadSpentAmount discount rule; items in product lists can be labelled as NEW for a configurable period of time; product templates can optionally display a discount sign when discounts were applied; added the ability to set multiple favicons depending on stores and/or themes; plugin management: multiple plugins can now be (un)installed in one go; added a field for the HTML body id to the store entity; (Developer) new property 'Extra...

    New Projects
    - .Net PG: Just a university project.
    - A2D: 1. Cache system 2. Event system 3. IoC 4. SQL dispatcher system 5. Session system 6. Command bus 7. ...
    - Advantage Browser: A web browser made in Visual Basic 2010 for all to add to and learn from, or just use.
    - BarCoder: BarCoder is a C# web app for creating EAN-8 and EAN-16 bar codes in a vector graphic image format.
    - CLIDE .NET: CLIDE .NET, the Command Line IDE for .NET, because code is just code.
    - Devcken Java Library: Devcken's Java library.
    - Double Sides Flipping Control - Windows Phone: The control features the following: a two-sided control which flips across its center in both directions based on horizontal ge...
    - FigTree FHMS (Funeral Home Management System): FigTree FHMS (Funeral Home Management System) is an application outfitted for the daily operations of your funeral home.
    - InChatter: InChatter is an instant messaging communication module for desktop applications, programmed in C#. The server is a WCF program.
    - Mod.Training: Some helpful examples about Orchard-related topics.
    - Plupload MVC4 Demo: This project shows how to implement Plupload with an MVC application.
    - Quality Control Management System: QCMS is a web-based test management system.
    - Quan Ly Nha Thuoc: Pharmacy management project (Quản Lý Tiệm Thuốc Tây). Email: phuoc.nh2953@sinhvien.hoasen.edu.vn
    - Randomchaos DX11 Engine: An open-source C++ DirectX 11 engine.
    - Service Gateway: The service gateway enables composition and agility for web sites and services.
    - Sortable objects: Sortable objects, server and client side.
    - STSADM ExportCrawlLog - SP Foundation 2010: STSADM extension to see/export SharePoint Foundation 2010 MSSearch configuration and crawl logs.
    - TACACS Plus Extended: tacacs+ updates to support subnet-specific configurations and reduced configuration with LDAP access.
    - TKinect: Framework for testing Kinect applications.
    - WebApi Data Wrapper: This project is a collection of wrappers for Web API.
    - WPF ExpressionEditor: Control representing an expression editor. Allows usage of custom expression parsers.

    Read the article

  • Animation Color [on hold]

    - by user2425429
    I'm having problems in my java program for animation. I'm trying to draw a hexagon with a shape similar to that of a trapezoid. Then, I'm making it move to the right for a certain amount of time (DEMO_TIME). Animation and ScreenManager are "API" classes, and AnimationTest1 is a demo. In my test program, it runs with a black screen and white stroke color. I'd like to know why this happened and how to fix it. I'm a beginner, so I apologize for this question being stupid to all you game programmers. Here is the code I have now: import java.awt.DisplayMode; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.Image; import java.awt.Polygon; import java.util.ArrayList; import java.util.List; import java.util.concurrent.Executor; import java.util.concurrent.Executors; import javax.swing.ImageIcon; public class AnimationTest1 { public static void main(String args[]) { AnimationTest1 test = new AnimationTest1(); test.run(); } private static final DisplayMode POSSIBLE_MODES[] = { new DisplayMode(800, 600, 32, 0), new DisplayMode(800, 600, 24, 0), new DisplayMode(800, 600, 16, 0), new DisplayMode(640, 480, 32, 0), new DisplayMode(640, 480, 24, 0), new DisplayMode(640, 480, 16, 0) }; private static final long DEMO_TIME = 4000; private ScreenManager screen; private Image bgImage; private Animation anim; public void loadImages() { // create animation List<Polygon> polygons=new ArrayList(); int[] x=new int[]{20,4,4,20,40,56,56,40}; int[] y=new int[]{20,32,40,44,44,40,32,20}; polygons.add(new Polygon(x,y,8)); anim = new Animation(); //# of frames long startTime = System.currentTimeMillis(); long currTimer = startTime; long elapsedTime = 0; boolean animated = false; Graphics2D g = screen.getGraphics(); int width=200; int height=200; while (currTimer - startTime < DEMO_TIME*2) { //draw the polygons if(!animated){ for(int j=0; j<polygons.size();j++){ for(int pos=0; pos<polygons.get(j).npoints; pos++){ polygons.get(j).xpoints[pos]+=1; } } anim.setNewPolyFrame(polygons , width , height , 64); } else{ // update animation anim.update(elapsedTime); draw(g); g.dispose(); screen.update(); try{ Thread.sleep(20); } catch(InterruptedException ie){} } if(currTimer - startTime == DEMO_TIME) animated=true; elapsedTime = System.currentTimeMillis() - currTimer; currTimer += elapsedTime; } } public void run() { screen = new ScreenManager(); try { DisplayMode displayMode = screen.findFirstCompatibleMode(POSSIBLE_MODES); screen.setFullScreen(displayMode); loadImages(); } finally { screen.restoreScreen(); } } public void draw(Graphics g) { // draw background g.drawImage(bgImage, 0, 0, null); // draw image g.drawImage(anim.getImage(), 0, 0, null); } } ScreenManager: import java.awt.Color; import java.awt.DisplayMode; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.GraphicsConfiguration; import java.awt.GraphicsDevice; import java.awt.GraphicsEnvironment; import java.awt.Toolkit; import java.awt.Window; import java.awt.event.KeyListener; import java.awt.event.MouseListener; import java.awt.image.BufferStrategy; import java.awt.image.BufferedImage; import javax.swing.JFrame; import javax.swing.JPanel; public class ScreenManager extends JPanel { private GraphicsDevice device; /** Creates a new ScreenManager object. */ public ScreenManager() { GraphicsEnvironment environment=GraphicsEnvironment.getLocalGraphicsEnvironment(); device = environment.getDefaultScreenDevice(); setBackground(Color.white); } /** Returns a list of compatible display modes for the default device on the system. 
*/ public DisplayMode[] getCompatibleDisplayModes() { return device.getDisplayModes(); } /** Returns the first compatible mode in a list of modes. Returns null if no modes are compatible. */ public DisplayMode findFirstCompatibleMode( DisplayMode modes[]) { DisplayMode goodModes[] = device.getDisplayModes(); for (int i = 0; i < modes.length; i++) { for (int j = 0; j < goodModes.length; j++) { if (displayModesMatch(modes[i], goodModes[j])) { return modes[i]; } } } return null; } /** Returns the current display mode. */ public DisplayMode getCurrentDisplayMode() { return device.getDisplayMode(); } /** Determines if two display modes "match". Two display modes match if they have the same resolution, bit depth, and refresh rate. The bit depth is ignored if one of the modes has a bit depth of DisplayMode.BIT_DEPTH_MULTI. Likewise, the refresh rate is ignored if one of the modes has a refresh rate of DisplayMode.REFRESH_RATE_UNKNOWN. */ public boolean displayModesMatch(DisplayMode mode1, DisplayMode mode2) { if (mode1.getWidth() != mode2.getWidth() || mode1.getHeight() != mode2.getHeight()) { return false; } if (mode1.getBitDepth() != DisplayMode.BIT_DEPTH_MULTI && mode2.getBitDepth() != DisplayMode.BIT_DEPTH_MULTI && mode1.getBitDepth() != mode2.getBitDepth()) { return false; } if (mode1.getRefreshRate() != DisplayMode.REFRESH_RATE_UNKNOWN && mode2.getRefreshRate() != DisplayMode.REFRESH_RATE_UNKNOWN && mode1.getRefreshRate() != mode2.getRefreshRate()) { return false; } return true; } /** Enters full screen mode and changes the display mode. If the specified display mode is null or not compatible with this device, or if the display mode cannot be changed on this system, the current display mode is used. <p> The display uses a BufferStrategy with 2 buffers. */ public void setFullScreen(DisplayMode displayMode) { JFrame frame = new JFrame(); frame.setUndecorated(true); frame.setIgnoreRepaint(true); frame.setResizable(true); device.setFullScreenWindow(frame); if (displayMode != null && device.isDisplayChangeSupported()) { try { device.setDisplayMode(displayMode); } catch (IllegalArgumentException ex) { } } frame.createBufferStrategy(2); Graphics g=frame.getGraphics(); g.setColor(Color.white); g.drawRect(0, 0, frame.WIDTH, frame.HEIGHT); frame.paintAll(g); g.setColor(Color.black); g.dispose(); } /** Gets the graphics context for the display. The ScreenManager uses double buffering, so applications must call update() to show any graphics drawn. <p> The application must dispose of the graphics object. */ public Graphics2D getGraphics() { Window window = device.getFullScreenWindow(); if (window != null) { BufferStrategy strategy = window.getBufferStrategy(); return (Graphics2D)strategy.getDrawGraphics(); } else { return null; } } /** Updates the display. */ public void update() { Window window = device.getFullScreenWindow(); if (window != null) { BufferStrategy strategy = window.getBufferStrategy(); if (!strategy.contentsLost()) { strategy.show(); } } // Sync the display on some systems. // (on Linux, this fixes event queue problems) Toolkit.getDefaultToolkit().sync(); } /** Returns the window currently used in full screen mode. Returns null if the device is not in full screen mode. */ public Window getFullScreenWindow() { return device.getFullScreenWindow(); } /** Returns the width of the window currently used in full screen mode. Returns 0 if the device is not in full screen mode. 
*/ public int getWidth() { Window window = device.getFullScreenWindow(); if (window != null) { return window.getWidth(); } else { return 0; } } /** Returns the height of the window currently used in full screen mode. Returns 0 if the device is not in full screen mode. */ public int getHeight() { Window window = device.getFullScreenWindow(); if (window != null) { return window.getHeight(); } else { return 0; } } /** Restores the screen's display mode. */ public void restoreScreen() { Window window = device.getFullScreenWindow(); if (window != null) { window.dispose(); } device.setFullScreenWindow(null); } /** Creates an image compatible with the current display. */ public BufferedImage createCompatibleImage(int w, int h, int transparency) { Window window = device.getFullScreenWindow(); if (window != null) { GraphicsConfiguration gc = window.getGraphicsConfiguration(); return gc.createCompatibleImage(w, h, transparency); } return null; } } Animation: import java.awt.Color; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.Image; import java.awt.Polygon; import java.awt.image.BufferedImage; import java.util.ArrayList; import java.util.List; /** The Animation class manages a series of images (frames) and the amount of time to display each frame. */ public class Animation { private ArrayList frames; private int currFrameIndex; private long animTime; private long totalDuration; /** Creates a new, empty Animation. */ public Animation() { frames = new ArrayList(); totalDuration = 0; start(); } /** Adds an image to the animation with the specified duration (time to display the image). */ public synchronized void addFrame(BufferedImage image, long duration){ ScreenManager s = new ScreenManager(); totalDuration += duration; frames.add(new AnimFrame(image, totalDuration)); } /** Starts the animation over from the beginning. */ public synchronized void start() { animTime = 0; currFrameIndex = 0; } /** Updates the animation's current image (frame), if necessary. */ public synchronized void update(long elapsedTime) { if (frames.size() >= 1) { animTime += elapsedTime; /*if (animTime >= totalDuration) { animTime = animTime % totalDuration; currFrameIndex = 0; }*/ while (animTime > getFrame(0).endTime) { frames.remove(0); } } } /** Gets the Animation's current image. Returns null if this animation has no images. */ public synchronized Image getImage() { if (frames.size() > 0&&!(currFrameIndex>=frames.size())) { return getFrame(currFrameIndex).image; } else{ System.out.println("There are no frames!"); System.exit(0); } return null; } private AnimFrame getFrame(int i) { return (AnimFrame)frames.get(i); } private class AnimFrame { Image image; long endTime; public AnimFrame(Image image, long endTime) { this.image = image; this.endTime = endTime; } } public void setNewPolyFrame(List<Polygon> polys,int imagewidth,int imageheight,int time){ BufferedImage image=new BufferedImage(imagewidth, imageheight, 1); Graphics g=image.getGraphics(); for(int i=0;i<polys.size();i++){ g.drawPolygon(polys.get(i)); } addFrame(image,time); g.dispose(); } }
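
    A possible starting point for the color question, not taken from the original post: each frame is drawn into a fresh BufferedImage of type 1 (TYPE_INT_RGB), which starts out all black, and whatever color is currently set on its Graphics is used for the polygon outline. A minimal sketch (the class and method names are made up for illustration) that fills the background and picks an explicit stroke color before drawing:

        import java.awt.Color;
        import java.awt.Graphics2D;
        import java.awt.Polygon;
        import java.awt.image.BufferedImage;
        import java.util.List;

        public class ColoredFrameSketch {
            // Renders one animation frame with explicit background and stroke colors.
            public static BufferedImage renderFrame(List<Polygon> polys, int width, int height) {
                // TYPE_INT_RGB images start out with every pixel black, so paint the background first.
                BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = image.createGraphics();
                g.setColor(Color.WHITE);             // background
                g.fillRect(0, 0, width, height);
                g.setColor(Color.BLUE);              // explicit outline color instead of the default
                for (Polygon p : polys) {
                    g.drawPolygon(p);
                }
                g.dispose();
                return image;
            }
        }

    The same two setColor calls, placed before the drawPolygon loop inside setNewPolyFrame, should have the same effect on the frames the Animation class stores.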

    Read the article

  • Logging Into a site that uses Live.com authentication with C#

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes use of the Windows Live ID API when people log in to their site. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyways, I've been using Wireshark and the Firefox addon Tamper Data to try and figure out what I need to post, and what cookies I need to include with my requests. As far as I know, these are the steps one must follow to log in to this site:
    1. Visit https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917
    2. Receive the cookies MSPRequ and MSPOK.
    3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username, and your password, all to a changing URL similar to: https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (there are a few numbers that change at the end of that URL)
    4. Live.com returns the user a page with more hidden forms to post. The client then posts the values from the form "ANON", the value from the form "ANONExp" and the values from the form "t" to the URL: http://www.bungie.net/Default.aspx?wa=wsignin1.0
    5. After posting that data, the user is returned a variety of cookies, the most important of which is "BNGAuth", which is the login cookie for the site.
    Where I am having trouble is on the fifth step, but that doesn't necessarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t", but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I reviewed the Wireshark log, I noticed something that instantly stood out to me as different between the log when I logged in with Firefox and when my program ran. It could be nothing, but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect of something I'm doing wrong in the HTTPS steps. 
![alt text][1] http://img391.imageshack.us/img391/6049/31394881.gif using System; using System.Collections.Generic; using System.Collections.Specialized; using System.Text; using System.Net; using System.IO; using System.IO.Compression; using System.Security.Cryptography; using System.Security.Cryptography.X509Certificates; using System.Web; namespace SpiderFromScratch { class Program { static void Main(string[] args) { CookieContainer cookies = new CookieContainer(); Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"); HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url); http.Timeout = 30000; http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.Referer = "http://www.bungie.net/"; http.ContentType = "application/x-www-form-urlencoded"; http.CookieContainer = new CookieContainer(); http.Method = WebRequestMethods.Http.Get; HttpWebResponse response = (HttpWebResponse)http.GetResponse(); StreamReader readStream = new StreamReader(response.GetResponseStream()); string HTML = readStream.ReadToEnd(); readStream.Close(); //gets the cookies (they are set in the eighth header) string[] strCookies = response.Headers.GetValues(8); response.Close(); string name, value; Cookie manualCookie; for (int i = 0; i < strCookies.Length; i++) { name = strCookies[i].Substring(0, strCookies[i].IndexOf("=")); value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1, strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1); manualCookie = new Cookie(name, "\"" + value + "\""); Uri manualURL = new Uri("http://login.live.com"); http.CookieContainer.Add(manualURL, manualCookie); } //stores the cookies to be used later cookies = http.CookieContainer; //Get the PPSX value string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX")); PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7); PPSX = PPSX.Substring(0, PPSX.IndexOf("\"")); //Get this random PPFT value string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT")); PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7); PPFT = PPFT.Substring(0, PPFT.IndexOf("\"")); //Get the random URL you POST to string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct=")); POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\"")); //POST with cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTURL); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.CookieContainer = cookies; http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"; http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; Stream ostream = http.GetRequestStream(); //used to convert strings into bytes System.Text.ASCIIEncoding encoding = new 
System.Text.ASCIIEncoding(); //Post information byte[] buffer = encoding.GetBytes("PPSX=" + PPSX +"&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" + "&LoginOptions=2&PPFT=" + PPFT); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); HttpWebResponse response2 = (HttpWebResponse)http.GetResponse(); readStream = new StreamReader(response2.GetResponseStream()); HTML = readStream.ReadToEnd(); response2.Close(); ostream.Dispose(); foreach (Cookie cookie in response2.Cookies) { Console.WriteLine(cookie.Name + ": "); Console.WriteLine(cookie.Value); Console.WriteLine(cookie.Expires); Console.WriteLine(); } //SET POSTURL value string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0"; //Get the ANON value string ANON = HTML.Remove(0, HTML.IndexOf("ANON")); ANON = ANON.Remove(0, ANON.IndexOf("value") + 7); ANON = ANON.Substring(0, ANON.IndexOf("\"")); ANON = HttpUtility.UrlEncode(ANON); //Get the ANONExp value string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp")); ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7); ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\"")); ANONExp = HttpUtility.UrlEncode(ANONExp); //Get the t value string t = HTML.Remove(0, HTML.IndexOf("id=\"t\"")); t = t.Remove(0, t.IndexOf("value") + 7); t = t.Substring(0, t.IndexOf("\"")); t = HttpUtility.UrlEncode(t); //POST the Info and Accept the Bungie Cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTANON); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Encoding", "gzip,deflate"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "115"); http.CookieContainer = new CookieContainer(); http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; http.Expect = null; ostream = http.GetRequestStream(); int test = ANON.Length; int test1 = ANONExp.Length; int test2 = t.Length; buffer = encoding.GetBytes("ANON=" + ANON +"&ANONExp=" + ANONExp + "&t=" + t); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); //Here lies the problem, I am not returned the correct cookies. HttpWebResponse response3 = (HttpWebResponse)http.GetResponse(); GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress); readStream = new StreamReader(gzip); HTML = readStream.ReadToEnd(); //gets both cookies string[] strCookies2 = response3.Headers.GetValues(11); response3.Close(); } } } This has given me problems and I've put many hours into learning about HTTP protocols so any help would be appreciated. If there is an article detailing a similar log in to live.com feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask as this is my first time using Stack Overflow. Cheers, --Josh
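
    One detail worth noting alongside the code above, sketched here rather than taken from the post: when an HttpWebRequest is given a CookieContainer before the request is sent, the Set-Cookie values in the response are parsed into that container automatically and are also exposed through HttpWebResponse.Cookies, so the positional Headers.GetValues(8) lookup can be replaced with a lookup by name. A minimal, untested sketch (the login URL is shortened for illustration; the full query string from the post still applies):

        using System;
        using System.IO;
        using System.Net;

        class CookieCaptureSketch
        {
            static void Main()
            {
                // A shared container collects MSPRequ/MSPOK without manual parsing.
                var cookies = new CookieContainer();
                var url = "https://login.live.com/login.srf?wa=wsignin1.0";

                var request = (HttpWebRequest)WebRequest.Create(url);
                request.CookieContainer = cookies;   // response cookies land in this container
                request.Method = "GET";

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    string html = reader.ReadToEnd();              // page with the PPSX/PPFT hidden fields
                    Console.WriteLine("page length: " + html.Length);

                    // Look the Set-Cookie header up by name instead of by index.
                    Console.WriteLine(response.Headers["Set-Cookie"]);

                    // The same cookies are also available as parsed objects.
                    foreach (Cookie c in response.Cookies)
                        Console.WriteLine("{0} = {1}", c.Name, c.Value);
                }

                // Reusing 'cookies' on the follow-up POST sends MSPRequ/MSPOK back automatically.
            }
        }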

    Read the article

  • Control layout using graphviz twopi

    - by vy32
    I am trying to draw a graph showing search prefixes using twopi. I have a simple input file and am getting this output: (full image) Here is the input file: digraph search { // ordering=out; // color=blue; // rank=same; // overlap=scale; rankdir=LR; root=root; ranksep=1.25; overlap=true; "root"; a [color=none,fontsize=12]; b [color=none,fontsize=12]; c [color=none,fontsize=12]; d [color=none,fontsize=12]; e [color=none,fontsize=12]; f [color=none,fontsize=12]; #g [color=none,fontsize=12]; h [color=none,fontsize=12]; i [color=none,fontsize=12]; j [color=none,fontsize=12]; k [color=none,fontsize=12]; l [color=none,fontsize=12]; m [color=none,fontsize=12]; n [color=none,fontsize=12]; o [color=none,fontsize=12]; p [color=none,fontsize=12]; q [color=none,fontsize=12]; r [color=none,fontsize=12]; s [color=none,fontsize=12]; t [color=none,fontsize=12]; u [color=none,fontsize=12]; v [color=none,fontsize=12]; w [color=none,fontsize=12]; x [color=none,fontsize=12]; y [color=none,fontsize=12]; #ga [color=none,fontsize=12]; gb [color=none,fontsize=12]; gc [color=none,fontsize=12]; gd [color=none,fontsize=12]; ge [color=none,fontsize=12]; gf [color=none,fontsize=12]; gg [color=none,fontsize=12]; gh [color=none,fontsize=12]; gi [color=none,fontsize=12]; gj [color=none,fontsize=12]; gk [color=none,fontsize=12]; gl [color=none,fontsize=12]; gm [color=none,fontsize=12]; gn [color=none,fontsize=12]; go [color=none,fontsize=12]; gp [color=none,fontsize=12]; gq [color=none,fontsize=12]; gr [color=none,fontsize=12]; gs [color=none,fontsize=12]; gt [color=none,fontsize=12]; gu [color=none,fontsize=12]; gv [color=none,fontsize=12]; gw [color=none,fontsize=12]; gx [color=none,fontsize=12]; gy [color=none,fontsize=12]; gaa [color=none,fontsize=12]; gab [color=none,fontsize=12]; gac [color=none,fontsize=12]; gad [color=none,fontsize=12]; gae [color=none,fontsize=12]; gaf [color=none,fontsize=12]; gag [color=none,fontsize=12]; gah [color=none,fontsize=12]; gai [color=none,fontsize=12]; gaj [color=none,fontsize=12]; gak [color=none,fontsize=12]; gal [color=none,fontsize=12]; gam [color=none,fontsize=12]; gan [color=none,fontsize=12]; gao [color=none,fontsize=12]; gap [color=none,fontsize=12]; gaq [color=none,fontsize=12]; #gaz [color=none,fontsize=12]; gas [color=none,fontsize=12]; gat [color=none,fontsize=12]; gau [color=none,fontsize=12]; gav [color=none,fontsize=12]; gaw [color=none,fontsize=12]; gax [color=none,fontsize=12]; gay [color=none,fontsize=12]; gaza [color=none,fontsize=12]; gazb [color=none,fontsize=12]; gazc [color=none,fontsize=12]; gazd [color=none,fontsize=12]; gaze [color=none,fontsize=12]; #gazf [color=none,fontsize=12]; gazg [color=none,fontsize=12]; gazh [color=none,fontsize=12]; gazi [color=none,fontsize=12]; gazj [color=none,fontsize=12]; gazk [color=none,fontsize=12]; gazl [color=none,fontsize=12]; gazm [color=none,fontsize=12]; gazn [color=none,fontsize=12]; gazo [color=none,fontsize=12]; gazp [color=none,fontsize=12]; gazq [color=none,fontsize=12]; gazr [color=none,fontsize=12]; gazs [color=none,fontsize=12]; gazt [color=none,fontsize=12]; gazu [color=none,fontsize=12]; gazv [color=none,fontsize=12]; gazw [color=none,fontsize=12]; gazx [color=none,fontsize=12]; gazy [color=none,fontsize=12]; root -> a [minlen=2]; root -> b [minlen=2]; root -> c [minlen=2]; root -> d [minlen=2]; root -> e [minlen=2]; root -> f [minlen=2]; root -> g [minlen=2]; root -> h [minlen=2]; root -> i [minlen=2]; root -> j [minlen=2]; root -> k [minlen=2]; root -> l [minlen=2]; root -> m [minlen=2]; root -> n 
[minlen=2]; root -> o [minlen=2]; root -> p [minlen=2]; root -> q [minlen=2]; root -> r [minlen=2]; root -> s [minlen=20]; root -> t [minlen=2]; root -> u [minlen=2]; root -> v [minlen=2]; root -> w [minlen=2]; root -> x [minlen=2]; root -> y [minlen=2]; root -> 0 [minlen=2]; root -> 1 [minlen=2]; root -> 2 [minlen=2]; root -> 3 [minlen=2]; root -> 4 [minlen=2]; root -> 5 [minlen=2]; root -> 6 [minlen=2]; root -> 7 [minlen=2]; root -> 8 [minlen=2]; root -> 9 [minlen=2]; root -> "." [minlen=2]; g -> ga ; g -> gb ; g -> gc ; g -> gd ; g -> ge ; g -> gf ; g -> gg ; g -> gh ; g -> gi ; g -> gj ; g -> gk ; g -> gl ; g -> gm ; g -> gn ; g -> go ; g -> gp ; g -> gq ; g -> gr ; g -> gs ; g -> gt ; g -> gu ; g -> gv ; g -> gw ; g -> gx ; g -> gy ; ga -> gaa ; ga -> gab ; ga -> gac ; ga -> gad ; ga -> gae ; ga -> gaf ; ga -> gag ; ga -> gah ; ga -> gai ; ga -> gaj ; ga -> gak ; ga -> gal ; ga -> gam ; ga -> gan ; ga -> gao ; ga -> gap ; ga -> gaq ; ga -> gaz ; ga -> gas ; ga -> gat ; ga -> gau ; ga -> gav ; ga -> gaw ; ga -> gax ; ga -> gay ; gaz -> gaza ; gaz -> gazb ; gaz -> gazc ; gaz -> gazd ; gaz -> gaze ; gaz -> gazf ; gaz -> gazg ; gaz -> gazh ; gaz -> gazi ; gaz -> gazj ; gaz -> gazk ; gaz -> gazl ; gaz -> gazm ; gaz -> gazn ; gaz -> gazo ; gaz -> gazp ; gaz -> gazq ; gaz -> gazr ; gaz -> gazs ; gaz -> gazt ; gaz -> gazu ; gaz -> gazv ; gaz -> gazw ; gaz -> gazx ; gaz -> gazy ; gazo -> "Blue Tuesday" ; "Blue Tuesday" [ fontsize=10]; // Layout engines: circo dot fdp neato nop nop1 nop2 osage patchwork sfdp twopi } This output is generated with: twopi -os1.png -Tpng s1.dot I'm posting here because the printout is pretty dreadful. All of the nodes hung of "gaz" are overlapping; I've tried specifying nodesep and it is simply ignored. I would like to see the lines from root to the single letters further apart, but again, I can't control that. This seems to be a bug in twopi. The documentation says it should clearly follow these directives, but it doesn't seem to. My questions: Is there any way to make twopi behave? Failing that, is there a better layout engine to use? Thanks.
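
    A hedged variant of the graph preamble, not a confirmed fix: per the Graphviz documentation, minlen is a dot-only edge attribute and nodesep is likewise aimed at dot, so twopi is expected to ignore both; in twopi the radial separation of the concentric rings is controlled by ranksep, and node overlaps can be removed by setting overlap to something other than true. A trimmed sketch of those settings (only a few of the original edges are repeated):

        digraph search {
            layout=twopi;      /* same engine as running twopi explicitly */
            root=root;
            ranksep=3;         /* radial distance between rings (was 1.25) */
            overlap=false;     /* ask the layout to pull overlapping nodes apart */
            node [color=none, fontsize=12];

            root -> g;
            g -> ga;
            ga -> gaz;
            gaz -> gaza;
            gaz -> gazb;
            gaz -> gazc;
            /* ... remaining nodes and edges exactly as in the original file ... */
        }

    The rest of the a-y, g*, ga* and gaz* nodes and edges from the post would follow unchanged.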

    Read the article

  • Why do I get "unsupported architecture" errors trying to install a Python library in OSX?

    - by Emma518
    I am trying to install a Python library in the Presto package, source http://www.cv.nrao.edu/~sransom/presto/ Using 'gmake fftfit' I get the following error: cd fftfit_src ; f2py-2.7 -c fftfit.pyf *.f running build running config_cc unifing config_cc, config, build_clib, build_ext, build commands --compiler options running config_fc unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options running build_src build_src building extension "fftfit" sources creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7 f2py options: [] f2py: fftfit.pyf Reading fortran codes... Reading file 'fftfit.pyf' (format:free) Post-processing... Block: fftfit Block: cprof Block: fftfit Post-processing (stage 2)... Building modules... Building module "fftfit"... Constructing wrapper function "cprof"... c,amp,pha = cprof(y,[nmax,nh]) Constructing wrapper function "fftfit"... shift,eshift,snr,esnr,b,errb,ngood = fftfit(prof,s,phi,[nmax]) Wrote C/API module "fftfit" to file "/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64- 2.7/fftfitmodule.c" adding '/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7/fortranobject.c' to sources. adding '/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7' to include_dirs. copying /Library/Python/2.7/site-packages/numpy-1.8.2-py2.7-macosx-10.9- intel.egg/numpy/f2py/src/fortranobject.c -> /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7 copying /Library/Python/2.7/site-packages/numpy-1.8.2-py2.7-macosx-10.9-intel.egg/numpy/f2py/src/fortranobject.h -> /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7 build_src: building npy-pkg config files running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext customize Gnu95FCompiler Found executable /usr/local/bin/gfortran customize Gnu95FCompiler customize Gnu95FCompiler using build_ext building 'fftfit' extension compiling C sources C compiler: /usr/bin/clang -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch ppc -arch i386 -arch x86_64 -g -O2 creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h00 00gp/T creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h00 00gp/T/tmp9MmLz8 creating /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h00 00gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7 compile options: '-I/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9- x86_64-2.7 -I/Library/Python/2.7/site-packages/numpy-1.8.2-py2.7-macosx-10.9- intel.egg/numpy/core/include - I/opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c' clang: /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64- 2.7/fftfitmodule.c In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from 
/opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:19: In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/ 5.1/include/limits.h:38: In file included from /usr/include/limits.h:63: /usr/include/sys/cdefs.h:658:2: error: Unsupported architecture #error Unsupported architecture ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:19: In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/ 5.1/include/limits.h:38: In file included from /usr/include/limits.h:64: /usr/include/machine/limits.h:8:2: error: architecture not supported #error architecture not supported ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:67: In file included from /usr/include/_types.h:27: In file included from /usr/include/sys/_types.h:33: /usr/include/machine/_types.h:34:2: error: architecture not supported #error architecture not supported ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:67: In file included from /usr/include/_types.h:27: /usr/include/sys/_types.h:94:9: error: unknown type name '__int64_t' typedef __int64_t __darwin_blkcnt_t; /* total blocks */ ^ /usr/include/sys/_types.h:95:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_blksize_t; /* preferred block size */ ^ /usr/include/sys/_types.h:96:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_dev_t; /* dev_t */ ^ /usr/include/sys/_types.h:99:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_gid_t; /* [???] process and group IDs */ ^ /usr/include/sys/_types.h:100:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_id_t; /* [XSI] pid_t, uid_t, or gid_t*/ ^ /usr/include/sys/_types.h:101:9: error: unknown type name '__uint64_t' typedef __uint64_t __darwin_ino64_t; /* [???] Used for 64 bit inodes */ ^ /usr/include/sys/_types.h:107:9: error: unknown type name '__darwin_natural_t' typedef __darwin_natural_t __darwin_mach_port_name_t; /* Used by mach */ ^ /usr/include/sys/_types.h:109:9: error: unknown type name '__uint16_t' typedef __uint16_t __darwin_mode_t; /* [???] Some file attributes */ ^ /usr/include/sys/_types.h:110:9: error: unknown type name '__int64_t' typedef __int64_t __darwin_off_t; /* [???] Used for file sizes */ ^ /usr/include/sys/_types.h:111:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_pid_t; /* [???] process and group IDs */ ^ /usr/include/sys/_types.h:131:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_sigset_t; /* [???] signal set */ ^ /usr/include/sys/_types.h:132:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_suseconds_t; /* [???] microseconds */ ^ /usr/include/sys/_types.h:133:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_uid_t; /* [???] 
user IDs */ ^ /usr/include/sys/_types.h:134:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_useconds_t; /* [???] microseconds */ ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:71: /usr/include/sys/_types/_va_list.h:31:9: error: unknown type name '__darwin_va_list'; did you mean '__builtin_va_list'? typedef __darwin_va_list va_list; ^ note: '__builtin_va_list' declared here In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:72: /usr/include/sys/_types/_size_t.h:30:9: error: unknown type name '__darwin_size_t'; did you mean '__darwin_ino_t'? typedef __darwin_size_t size_t; ^ /usr/include/sys/_types.h:103:26: note: '__darwin_ino_t' declared here typedef __darwin_ino64_t __darwin_ino_t; /* [???] Used for inodes */ ^ fatal error: too many errors emitted, stopping now [-ferror-limit=] 20 errors generated. In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:19: In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/5.1/include/limits.h:38: In file included from /usr/include/limits.h:63: /usr/include/sys/cdefs.h:658:2: error: Unsupported architecture #error Unsupported architecture ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:19: In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/ 5.1/include/limits.h:38: In file included from /usr/include/limits.h:64: /usr/include/machine/limits.h:8:2: error: architecture not supported #error architecture not supported ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:67: In file included from /usr/include/_types.h:27: In file included from /usr/include/sys/_types.h:33: /usr/include/machine/_types.h:34:2: error: architecture not supported #error architecture not supported ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:67: In file included from /usr/include/_types.h:27: /usr/include/sys/_types.h:94:9: error: unknown type name '__int64_t' typedef __int64_t __darwin_blkcnt_t; /* total blocks */ ^ /usr/include/sys/_types.h:95:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_blksize_t; /* preferred block size */ ^ /usr/include/sys/_types.h:96:9: error: unknown type name '__int32_t' typedef 
__int32_t __darwin_dev_t; /* dev_t */ ^ /usr/include/sys/_types.h:99:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_gid_t; /* [???] process and group IDs */ ^ /usr/include/sys/_types.h:100:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_id_t; /* [XSI] pid_t, uid_t, or gid_t*/ ^ /usr/include/sys/_types.h:101:9: error: unknown type name '__uint64_t' typedef __uint64_t __darwin_ino64_t; /* [???] Used for 64 bit inodes */ ^ /usr/include/sys/_types.h:107:9: error: unknown type name '__darwin_natural_t' typedef __darwin_natural_t __darwin_mach_port_name_t; /* Used by mach */ ^ /usr/include/sys/_types.h:109:9: error: unknown type name '__uint16_t' typedef __uint16_t __darwin_mode_t; /* [???] Some file attributes */ ^ /usr/include/sys/_types.h:110:9: error: unknown type name '__int64_t' typedef __int64_t __darwin_off_t; /* [???] Used for file sizes */ ^ /usr/include/sys/_types.h:111:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_pid_t; /* [???] process and group IDs */ ^ /usr/include/sys/_types.h:131:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_sigset_t; /* [???] signal set */ ^ /usr/include/sys/_types.h:132:9: error: unknown type name '__int32_t' typedef __int32_t __darwin_suseconds_t; /* [???] microseconds */ ^ /usr/include/sys/_types.h:133:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_uid_t; /* [???] user IDs */ ^ /usr/include/sys/_types.h:134:9: error: unknown type name '__uint32_t' typedef __uint32_t __darwin_useconds_t; /* [???] microseconds */ ^ In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:71: /usr/include/sys/_types/_va_list.h:31:9: error: unknown type name '__darwin_va_list'; did you mean '__builtin_va_list'? typedef __darwin_va_list va_list; ^ note: '__builtin_va_list' declared here In file included from /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7/fftfitmodule.c:16: In file included from /opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7/Python.h:33: In file included from /usr/include/stdio.h:72: /usr/include/sys/_types/_size_t.h:30:9: error: unknown type name '__darwin_size_t'; did you mean '__darwin_ino_t'? typedef __darwin_size_t size_t; ^ /usr/include/sys/_types.h:103:26: note: '__darwin_ino_t' declared here typedef __darwin_ino64_t __darwin_ino_t; /* [???] Used for inodes */ ^ fatal error: too many errors emitted, stopping now [-ferror-limit=] 20 errors generated. error: Command "/usr/bin/clang -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -arch ppc -arch i386 -arch x86_64 -g -O2 -I/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx- 10.9-x86_64-2.7 -I/Library/Python/2.7/site-packages/numpy-1.8.2-py2.7-macosx-10.9- intel.egg/numpy/core/include - I/opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7/fftfitmodule.c -o /var/folders/sx/j_l_qvys4bv00_38pfvy3m8h0000gp/T/tmp9MmLz8/var/folders/sx/j_l_qvys4bv00_38pfvy3m8h00 00gp/T/tmp9MmLz8/src.macosx-10.9-x86_64-2.7/fftfitmodule.o" failed with exit status 1 Makefile:5: recipe for target 'fftfit' failed gmake: *** [fftfit] Error 1 How can I solve this architecture problem?
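
    One observation, offered as a guess rather than a verified fix: the failing clang invocation carries "-arch ppc", and the 10.9 SDK headers no longer support PowerPC, which is exactly where the "Unsupported architecture" errors are raised. On OS X, distutils-based builds generally let the ARCHFLAGS environment variable override those -arch flags, so restricting the build to the native architecture may get past this, assuming numpy's f2py honors it the same way:

        # Hypothetical workaround, not from the original post: rebuild for x86_64 only.
        cd fftfit_src
        ARCHFLAGS="-arch x86_64" f2py-2.7 -c fftfit.pyf *.f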

    Read the article

  • BES Express - configure MDS to push messages from 3rd party web application

    - by Max Gontar
    Hi! I have developed IIS web service to send PAP messages using Blackberry Push API over MDS. And there is an application installed on device, configured to receive push messages on appropriate port. Everything works well on MDS simulator. But it's not working well in real environment: I have installed BES Express and register several devices. I can browse MDS url with appropriate port, so url is correct. Also port enabled for reliable pushes is used in push message and in device application. Here is MDS simulator log: <2011-01-12 14:00:03.456 EET>:[272]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = PapServlet: request from 0:0:0:0:0:0:0:1 564 bytes...> <2011-01-12 14:00:03.476 EET>:[273]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Mapping PAP request to push request for pushID:pushID:asdas> <2011-01-12 14:00:03.479 EET>:[274]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = PushServlet: POST request from [UNKNOWN @ 0:0:0:0:0:0:0:1] to [PAPDEST=WAPPUSH%3D2100000A%253A100%2FTYPE%3DUSER%40rim.net&PORT=100&REQUESTURI=/] : -1 bytes...> <2011-01-12 14:00:03.480 EET>:[275]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = submitting push message with id:pushID:asdas> <2011-01-12 14:00:03.482 EET>:[276]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Executing push submit command for pushID:pushID:asdas> <2011-01-12 14:00:03.483 EET>:[278]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Pushing message to: 2100000a> <2011-01-12 14:00:03.484 EET>:[279]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Number of active push connections:1> <2011-01-12 14:00:03.489 EET>:[280]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = added server-initiated connection = -872546301, push id = pushID:asdas> <2011-01-12 14:00:03.491 EET>:[281]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Available threads in DefaultJobPool = 9 running JobRunner: DefaultJobRunner-7> <2011-01-12 14:00:03.494 EET>:[282]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = ReceivedFromServer, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION => <2011-01-12 14:00:03.494 EET>:[282]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = ReceivedFromServer, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Transmission Line Section]:> <2011-01-12 14:00:03.494 EET>:[282]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = ReceivedFromServer, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = POST / HTTP/1.1> <2011-01-12 14:00:03.494 EET>:[282]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = ReceivedFromServer, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Headers Section]: 8 headers> <2011-01-12 14:00:03.494 EET>:[282]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = ReceivedFromServer, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Parameters Section]: 3 parameters> <2011-01-12 14:00:03.499 EET>:[283]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = SentToDevice, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION => <2011-01-12 14:00:03.499 EET>:[283]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = SentToDevice, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Transmission Line Section]:> <2011-01-12 14:00:03.499 EET>:[283]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = SentToDevice, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = POST / HTTP/1.1> <2011-01-12 14:00:03.499 EET>:[283]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = 
SentToDevice, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Headers Section]: 9 headers> <2011-01-12 14:00:03.499 EET>:[283]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, HANDLER = HTTP, EVENT = SentToDevice, DEVICEPIN = 2100000a, CONNECTIONID = -872546301, HTTPTRANSMISSION = [Parameters Section]: 3 parameters> <2011-01-12 14:00:03.501 EET>:[284]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Finished JobRunner: DefaultJobRunner-7, available threads in DefaultJobPool = 10, time spent = 8ms> <2011-01-12 14:00:03.521 EET>:[287]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, EVENT = CreatedSendingQueue, DEVICEPIN = 2100000a> <2011-01-12 14:00:03.526 EET>:[290]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, EVENT = Sending, TAG = 1288699908, DEVICEPIN = 2100000a, VERSION = 16, CONNECTIONID = -872546301, SEQUENCE = 0, TYPE = NOTIFY-REQUEST, CONNECTIONHANDLER = http, PROTOCOL = TCP, PARAMETERS = [MGONTAR/10.10.0.35:100], SIZE = 339> <2011-01-12 14:00:03.531 EET>:[291]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Number of active push connections:0> <2011-01-12 14:00:03.591 EET>:[292]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, EVENT = Notification, TAG = 1288699908, STATE = DELIVERED> <2011-01-12 14:00:03.600 EET>:[296]:<MDS-CS_MDS>:<DEBUG>:<LAYER = SCM, EVENT = Device connections: AVG latency (msecs)79> <2011-01-12 14:00:03.600 EET>:[297]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, Removed push connection:-872546301> <2011-01-12 14:00:07.015 EET>:[298]:<MDS-CS_MDS>:<DEBUG>:<LAYER = IPPP, EVENT = RemovedSendingQueue, DEVICEPIN = 2100000a> And here is real MDS log: <2011-01-12 11:35:02.763 GMT>:[3932]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, PapServlet: request from 192.168.1.241 583 bytes...> <2011-01-12 11:35:02.897 GMT>:[3933]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Mapping PAP request to push request for pushID:pushID:sdfsdfwerwer> <2011-01-12 11:35:02.909 GMT>:[3934]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, PushServlet: POST request from [UNKNOWN @ 192.168.1.241] to [PAPDEST=WAPPUSH%3D22D7F6BD%253A7874%2FTYPE%3DUSER%40rim.net&PORT=7874&REQUESTURI=/]> <2011-01-12 11:35:02.909 GMT>:[3934]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<push id: pushID:sdfsdfwerwer> <2011-01-12 11:35:02.910 GMT>:[3935]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, submitting push message with id:pushID:sdfsdfwerwer> <2011-01-12 11:35:02.910 GMT>:[3936]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Executing push submit command for pushID:pushID:sdfsdfwerwer> <2011-01-12 11:35:02.911 GMT>:[3937]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Pushing message to: 22d7f6bd> <2011-01-12 11:35:02.912 GMT>:[3938]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Number of active push connections:1> <2011-01-12 11:35:02.931 GMT>:[3939]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, added server-initiated connection = -1848311806, push id = pushID:sdfsdfwerwer> <2011-01-12 11:35:03.240 GMT>:[3940]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = IPPP, EVENT = CreatedSendingQueue, DEVICEPIN = 22d7f6bd, USERID = u3> <2011-01-12 11:35:03.241 GMT>:[3941]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = IPPP, EVENT = Sending, TAG = 536543251, DEVICEPIN = 22d7f6bd, USERID = u3, VERSION = 16, CONNECTIONID = -1848311806, SEQUENCE = 0, TYPE = NOTIFY-REQUEST, CONNECTIONHANDLER = http, PROTOCOL = TCP, PARAMETERS = [LDN-Server1/192.168.1.240:7874], SIZE = 383> <2011-01-12 11:35:03.241 GMT>:[3942]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Number of active push connections:0> <2011-01-12 11:35:03.253 
GMT>:[3943]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SRP, SRPID = S27700165[LDN-SERVER1:3200], EVENT = Sending, VERSION = 1, COMMAND = SEND, TAG = 536543251, SIZE = 570> <2011-01-12 11:35:03.838 GMT>:[3944]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SRP, SRPID = S27700165[LDN-SERVER1:3200], EVENT = Receiving, VERSION = 1, COMMAND = STATUS, TAG = 536543251, SIZE = 10, STATE = DELIVERED> <2011-01-12 11:35:04.104 GMT>:[3945]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = IPPP, EVENT = Notification, TAG = 536543251, STATE = DELIVERED> <2011-01-12 11:35:04.121 GMT>:[3946]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Device connections: AVG latency (msecs)893> <2011-01-12 11:35:04.135 GMT>:[3947]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<INFO >:<LAYER = IPPP, DEVICEPIN = 22d7f6bd, DOMAINNAME = LDN-Server1/192.168.1.240, CONNECTION_TYPE = PUSH_CONN, ConnectionId = -1848311806, DURATION(ms) = 1151, MFH_KBytes = 0, MTH_KBytes = 0.374, MFH_PACKET_COUNT = 0, MTH_PACKET_COUNT = 1> <2011-01-12 11:35:04.144 GMT>:[3948]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = IPPP, Removed push connection:-1848311806> <2011-01-12 11:35:09.264 GMT>:[3949]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = IPPP, EVENT = RemovedSendingQueue, DEVICEPIN = 22d7f6bd, USERID = u3> <2011-01-12 11:35:58.187 GMT>:[3950]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SRP, SRPID = S27700165[LDN-SERVER1:3200], EVENT = Sending, VERSION = 1, COMMAND = INFO, SIZE = 46> <2011-01-12 11:35:58.187 GMT>:[3951]:<MDS-CS_LDN-SERVER1_MDS-CS_1>:<DEBUG>:<LAYER = SCM, Sent health to S27700165[LDN-SERVER1:3200] Health=[0x 0000 0007 0000 0000],Mask=[0x 0000 0007 0000 0000],Load=[60]> As you can see, logs not really differs, message is marked as delivered. But my app on device not really gets this message (as it works in mds simulator) Please advice me, what may be wrong? Is there some certificate to install or security settings I should configure to make this push message came to device application? Thank you! same question on bbforums

    Read the article

  • Using a mounted NTFS share with nginx

    - by Hoff
    I have set up a local testing VM with Ubuntu Server 12.04 LTS and the LEMP stack. It's kind of an unconventional setup because instead of having all my PHP scripts on the local machine, I've mounted an NTFS share as the document root because I do my development on Windows. I had everything working perfectly up until this morning, now I keep getting a dreaded 'File not found.' error. I am almost certain this must be somehow permission related, because if I copy my site over to /var/www, nginx and php-fpm have no problems serving my PHP scripts. What I can't figure out is why all of a sudden (after a reboot of the server), no PHP files will be served but instead just the 'File not found.' error. Static files work fine, so I think it's PHP that is causing the headache. Both nginx and php-fpm are configured to run as the user www-data: root@ubuntu-server:~# ps aux | grep 'nginx\|php-fpm' root 1095 0.0 0.0 5816 792 ? Ss 11:11 0:00 nginx: master process /opt/nginx/sbin/nginx -c /etc/nginx/nginx.conf www-data 1096 0.0 0.1 6016 1172 ? S 11:11 0:00 nginx: worker process www-data 1098 0.0 0.1 6016 1172 ? S 11:11 0:00 nginx: worker process root 1130 0.0 0.4 175560 4212 ? Ss 11:11 0:00 php-fpm: master process (/etc/php5/php-fpm.conf) www-data 1131 0.0 0.3 175560 3216 ? S 11:11 0:00 php-fpm: pool www www-data 1132 0.0 0.3 175560 3216 ? S 11:11 0:00 php-fpm: pool www www-data 1133 0.0 0.3 175560 3216 ? S 11:11 0:00 php-fpm: pool www root 1686 0.0 0.0 4368 816 pts/1 S+ 11:11 0:00 grep --color=auto nginx\|php-fpm I have mounted the NTFS share at /mnt/webfiles by editing /etc/fstab and adding the following line: //192.168.0.199/c$/Websites/ /mnt/webfiles cifs username=Jordan,password=mypasswordhere,gid=33,uid=33 0 0 Where gid 33 is the www-data group and uid 33 is the user www-data. If I list the contents of one of my sites you can in fact see that they belong to the user www-data: root@ubuntu-server:~# ls -l /mnt/webfiles/nTv5-2.0 total 8 drwxr-xr-x 0 www-data www-data 0 Jun 6 19:12 app drwxr-xr-x 0 www-data www-data 0 Aug 22 19:00 assets -rwxr-xr-x 0 www-data www-data 1150 Jan 4 2012 favicon.ico -rwxr-xr-x 0 www-data www-data 1412 Dec 28 2011 index.php drwxr-xr-x 0 www-data www-data 0 Jun 3 16:44 lib drwxr-xr-x 0 www-data www-data 0 Jan 3 2012 plugins drwxr-xr-x 0 www-data www-data 0 Jun 3 16:45 vendors If I switch to the www-data user, I have no problem creating a new file on the share: root@ubuntu-server:~# su www-data $ > /mnt/webfiles/test.txt $ ls -l /mnt/webfiles | grep test\.txt -rwxr-xr-x 0 www-data www-data 0 Sep 8 11:19 test.txt There should be no problem reading or writing to the share with php-fpm running as the user www-data. When I examine the error log of nginx, it's filled with a bunch of lines that look like the following: 2012/09/08 11:22:36 [error] 1096#0: *1 FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream, client: 192.168.0.199, server: , request: "GET / HTTP/1.1", upstream: "fastcgi://unix:/var/run/php5-fpm.sock:", host: "192.168.0.123" 2012/09/08 11:22:39 [error] 1096#0: *1 FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream, client: 192.168.0.199, server: , request: "GET /apc.php HTTP/1.1", upstream: "fastcgi://unix:/var/run/php5-fpm.sock:", host: "192.168.0.123" It's bizarre that this was working previously and now all of sudden PHP is complaining that it can't "find" the scripts on the share. Does anybody know why this is happening? 
EDIT I tried editing php-fpm.conf and changing chdir to the following: chdir = /mnt/webfiles When I try and restart the php-fpm service, I get the error: Starting php-fpm [08-Sep-2012 14:20:55] ERROR: [pool www] the chdir path '/mnt/webfiles' does not exist or is not a directory This is a total load of bullshit because this directory DOES exist and is mounted! Any ls commands to list that directory work perfectly. Why the hell can't PHP-FPM see this directory?! Here are my configuration files for reference: nginx.conf user www-data; worker_processes 2; error_log /var/log/nginx/nginx.log info; pid /var/run/nginx.pid; events { worker_connections 1024; multi_accept on; } http { include fastcgi.conf; include mime.types; default_type application/octet-stream; set_real_ip_from 127.0.0.1; real_ip_header X-Forwarded-For; ## Proxy proxy_redirect off; proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; client_max_body_size 32m; client_body_buffer_size 128k; proxy_connect_timeout 90; proxy_send_timeout 90; proxy_read_timeout 90; proxy_buffers 32 4k; ## Compression gzip on; gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript; gzip_disable "MSIE [1-6]\.(?!.*SV1)"; ### TCP options tcp_nodelay on; tcp_nopush on; keepalive_timeout 65; sendfile on; include /etc/nginx/sites-enabled/*; } my site config server { listen 80; access_log /var/log/nginx/$host.access.log; error_log /var/log/nginx/error.log; root /mnt/webfiles/nTv5-2.0/app/webroot; index index.php; ## Block bad bots if ($http_user_agent ~* (HTTrack|HTMLParser|libcurl|discobot|Exabot|Casper|kmccrew|plaNETWORK|RPT-HTTPClient)) { return 444; } ## Block certain Referers (case insensitive) if ($http_referer ~* (sex|vigra|viagra) ) { return 444; } ## Deny dot files: location ~ /\. { deny all; } ## Favicon Not Found location = /favicon.ico { access_log off; log_not_found off; } ## Robots.txt Not Found location = /robots.txt { access_log off; log_not_found off; } if (-f $document_root/maintenance.html) { rewrite ^(.*)$ /maintenance.html last; } location ~* \.(?:ico|css|js|gif|jpe?g|png)$ { # Some basic cache-control for static files to be sent to the browser expires max; add_header Pragma public; add_header Cache-Control "max-age=2678400, public, must-revalidate"; } location / { try_files $uri $uri/ index.php; if (-f $request_filename) { break; } rewrite ^(.+)$ /index.php?url=$1 last; } location ~ \.php$ { include /etc/nginx/fastcgi.conf; fastcgi_pass unix:/var/run/php5-fpm.sock; } } php-fpm.conf ;;;;;;;;;;;;;;;;;;;;; ; FPM Configuration ; ;;;;;;;;;;;;;;;;;;;;; ; All relative paths in this configuration file are relative to PHP's install ; prefix (/opt/php5). This prefix can be dynamicaly changed by using the ; '-p' argument from the command line. ; Include one or more files. If glob(3) exists, it is used to include a bunch of ; files from a glob(3) pattern. This directive can be used everywhere in the ; file. ; Relative path can also be used. 
They will be prefixed by: ; - the global prefix if it's been set (-p arguement) ; - /opt/php5 otherwise ;include=etc/fpm.d/*.conf ;;;;;;;;;;;;;;;;;; ; Global Options ; ;;;;;;;;;;;;;;;;;; [global] ; Pid file ; Note: the default prefix is /opt/php5/var ; Default Value: none pid = /var/run/php-fpm.pid ; Error log file ; Note: the default prefix is /opt/php5/var ; Default Value: log/php-fpm.log error_log = /var/log/php5-fpm/php-fpm.log ; Log level ; Possible Values: alert, error, warning, notice, debug ; Default Value: notice ;log_level = notice ; If this number of child processes exit with SIGSEGV or SIGBUS within the time ; interval set by emergency_restart_interval then FPM will restart. A value ; of '0' means 'Off'. ; Default Value: 0 ;emergency_restart_threshold = 0 ; Interval of time used by emergency_restart_interval to determine when ; a graceful restart will be initiated. This can be useful to work around ; accidental corruptions in an accelerator's shared memory. ; Available Units: s(econds), m(inutes), h(ours), or d(ays) ; Default Unit: seconds ; Default Value: 0 ;emergency_restart_interval = 0 ; Time limit for child processes to wait for a reaction on signals from master. ; Available units: s(econds), m(inutes), h(ours), or d(ays) ; Default Unit: seconds ; Default Value: 0 ;process_control_timeout = 0 ; Send FPM to background. Set to 'no' to keep FPM in foreground for debugging. ; Default Value: yes ;daemonize = yes ;;;;;;;;;;;;;;;;;;;; ; Pool Definitions ; ;;;;;;;;;;;;;;;;;;;; ; Multiple pools of child processes may be started with different listening ; ports and different management options. The name of the pool will be ; used in logs and stats. There is no limitation on the number of pools which ; FPM can handle. Your system will tell you anyway :) ; Start a new pool named 'www'. ; the variable $pool can we used in any directive and will be replaced by the ; pool name ('www' here) [www] ; Per pool prefix ; It only applies on the following directives: ; - 'slowlog' ; - 'listen' (unixsocket) ; - 'chroot' ; - 'chdir' ; - 'php_values' ; - 'php_admin_values' ; When not set, the global prefix (or /opt/php5) applies instead. ; Note: This directive can also be relative to the global prefix. ; Default Value: none ;prefix = /path/to/pools/$pool ; The address on which to accept FastCGI requests. ; Valid syntaxes are: ; 'ip.add.re.ss:port' - to listen on a TCP socket to a specific address on ; a specific port; ; 'port' - to listen on a TCP socket to all addresses on a ; specific port; ; '/path/to/unix/socket' - to listen on a unix socket. ; Note: This value is mandatory. ;listen = 127.0.0.1:9000 listen = /var/run/php5-fpm.sock ; Set listen(2) backlog. A value of '-1' means unlimited. ; Default Value: 128 (-1 on FreeBSD and OpenBSD) ;listen.backlog = -1 ; List of ipv4 addresses of FastCGI clients which are allowed to connect. ; Equivalent to the FCGI_WEB_SERVER_ADDRS environment variable in the original ; PHP FCGI (5.2.2+). Makes sense only with a tcp listening socket. Each address ; must be separated by a comma. If this value is left blank, connections will be ; accepted from any ip address. ; Default Value: any ;listen.allowed_clients = 127.0.0.1 ; Set permissions for unix socket, if one is used. In Linux, read/write ; permissions must be set in order to allow connections from a web server. Many ; BSD-derived systems allow connections regardless of permissions. 
; Default Values: user and group are set as the running user ; mode is set to 0666 ;listen.owner = www-data ;listen.group = www-data ;listen.mode = 0666 ; Unix user/group of processes ; Note: The user is mandatory. If the group is not set, the default user's group ; will be used. user = www-data group = www-data ; Choose how the process manager will control the number of child processes. ; Possible Values: ; static - a fixed number (pm.max_children) of child processes; ; dynamic - the number of child processes are set dynamically based on the ; following directives: ; pm.max_children - the maximum number of children that can ; be alive at the same time. ; pm.start_servers - the number of children created on startup. ; pm.min_spare_servers - the minimum number of children in 'idle' ; state (waiting to process). If the number ; of 'idle' processes is less than this ; number then some children will be created. ; pm.max_spare_servers - the maximum number of children in 'idle' ; state (waiting to process). If the number ; of 'idle' processes is greater than this ; number then some children will be killed. ; Note: This value is mandatory. pm = dynamic ; The number of child processes to be created when pm is set to 'static' and the ; maximum number of child processes to be created when pm is set to 'dynamic'. ; This value sets the limit on the number of simultaneous requests that will be ; served. Equivalent to the ApacheMaxClients directive with mpm_prefork. ; Equivalent to the PHP_FCGI_CHILDREN environment variable in the original PHP ; CGI. ; Note: Used when pm is set to either 'static' or 'dynamic' ; Note: This value is mandatory. pm.max_children = 50 ; The number of child processes created on startup. ; Note: Used only when pm is set to 'dynamic' ; Default Value: min_spare_servers + (max_spare_servers - min_spare_servers) / 2 pm.start_servers = 20 ; The desired minimum number of idle server processes. ; Note: Used only when pm is set to 'dynamic' ; Note: Mandatory when pm is set to 'dynamic' pm.min_spare_servers = 5 ; The desired maximum number of idle server processes. ; Note: Used only when pm is set to 'dynamic' ; Note: Mandatory when pm is set to 'dynamic' pm.max_spare_servers = 35 ; The number of requests each child process should execute before respawning. ; This can be useful to work around memory leaks in 3rd party libraries. For ; endless request processing specify '0'. Equivalent to PHP_FCGI_MAX_REQUESTS. ; Default Value: 0 pm.max_requests = 500 ; The URI to view the FPM status page. If this value is not set, no URI will be ; recognized as a status page. By default, the status page shows the following ; information: ; accepted conn - the number of request accepted by the pool; ; pool - the name of the pool; ; process manager - static or dynamic; ; idle processes - the number of idle processes; ; active processes - the number of active processes; ; total processes - the number of idle + active processes. ; max children reached - number of times, the process limit has been reached, ; when pm tries to start more children (works only for ; pm 'dynamic') ; The values of 'idle processes', 'active processes' and 'total processes' are ; updated each second. The value of 'accepted conn' is updated in real time. ; Example output: ; accepted conn: 12073 ; pool: www ; process manager: static ; idle processes: 35 ; active processes: 65 ; total processes: 100 ; max children reached: 1 ; By default the status page output is formatted as text/plain. 
Passing either ; 'html' or 'json' as a query string will return the corresponding output ; syntax. Example: ; http://www.foo.bar/status ; http://www.foo.bar/status?json ; http://www.foo.bar/status?html ; Note: The value must start with a leading slash (/). The value can be ; anything, but it may not be a good idea to use the .php extension or it ; may conflict with a real PHP file. ; Default Value: not set pm.status_path = /status ; The ping URI to call the monitoring page of FPM. If this value is not set, no ; URI will be recognized as a ping page. This could be used to test from outside ; that FPM is alive and responding, or to ; - create a graph of FPM availability (rrd or such); ; - remove a server from a group if it is not responding (load balancing); ; - trigger alerts for the operating team (24/7). ; Note: The value must start with a leading slash (/). The value can be ; anything, but it may not be a good idea to use the .php extension or it ; may conflict with a real PHP file. ; Default Value: not set ping.path = /ping ; This directive may be used to customize the response of a ping request. The ; response is formatted as text/plain with a 200 response code. ; Default Value: pong ping.response = pong ; The timeout for serving a single request after which the worker process will ; be killed. This option should be used when the 'max_execution_time' ini option ; does not stop script execution for some reason. A value of '0' means 'off'. ; Available units: s(econds)(default), m(inutes), h(ours), or d(ays) ; Default Value: 0 ;request_terminate_timeout = 0 ; The timeout for serving a single request after which a PHP backtrace will be ; dumped to the 'slowlog' file. A value of '0s' means 'off'. ; Available units: s(econds)(default), m(inutes), h(ours), or d(ays) ; Default Value: 0 ;request_slowlog_timeout = 0 ; The log file for slow requests ; Default Value: not set ; Note: slowlog is mandatory if request_slowlog_timeout is set ;slowlog = log/$pool.log.slow ; Set open file descriptor rlimit. ; Default Value: system defined value ;rlimit_files = 1024 ; Set max core size rlimit. ; Possible Values: 'unlimited' or an integer greater or equal to 0 ; Default Value: system defined value ;rlimit_core = 0 ; Chroot to this directory at the start. This value must be defined as an ; absolute path. When this value is not set, chroot is not used. ; Note: you can prefix with '$prefix' to chroot to the pool prefix or one ; of its subdirectories. If the pool prefix is not set, the global prefix ; will be used instead. ; Note: chrooting is a great security feature and should be used whenever ; possible. However, all PHP paths will be relative to the chroot ; (error_log, sessions.save_path, ...). ; Default Value: not set ;chroot = ; Chdir to this directory at the start. ; Note: relative path can be used. ; Default Value: current directory or / when chroot ;chdir = /var/www ; Redirect worker stdout and stderr into main error log. If not set, stdout and ; stderr will be redirected to /dev/null according to FastCGI specs. ; Note: on highloaded environement, this can cause some delay in the page ; process time (several ms). ; Default Value: no ;catch_workers_output = yes ; Pass environment variables like LD_LIBRARY_PATH. All $VARIABLEs are taken from ; the current environment. ; Default Value: clean env ;env[HOSTNAME] = $HOSTNAME ;env[PATH] = /usr/local/bin:/usr/bin:/bin ;env[TMP] = /tmp ;env[TMPDIR] = /tmp ;env[TEMP] = /tmp ; Additional php.ini defines, specific to this pool of workers. 
These settings ; overwrite the values previously defined in the php.ini. The directives are the ; same as the PHP SAPI: ; php_value/php_flag - you can set classic ini defines which can ; be overwritten from PHP call 'ini_set'. ; php_admin_value/php_admin_flag - these directives won't be overwritten by ; PHP call 'ini_set' ; For php_*flag, valid values are on, off, 1, 0, true, false, yes or no. ; Defining 'extension' will load the corresponding shared extension from ; extension_dir. Defining 'disable_functions' or 'disable_classes' will not ; overwrite previously defined php.ini values, but will append the new value ; instead. ; Note: path INI options can be relative and will be expanded with the prefix ; (pool, global or /opt/php5) ; Default Value: nothing is defined by default except the values in php.ini and ; specified at startup with the -d argument ;php_admin_value[sendmail_path] = /usr/sbin/sendmail -t -i -f [email protected] ;php_flag[display_errors] = off ;php_admin_value[error_log] = /var/log/fpm-php.www.log ;php_admin_flag[log_errors] = on ;php_admin_value[memory_limit] = 32M php_admin_value[sendmail_path] = /usr/sbin/sendmail -t -i
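    Since "Primary script unknown" generally means php-fpm itself could not open the SCRIPT_FILENAME it was handed, and the later chdir error suggests the pool cannot see /mnt/webfiles at all, one way to narrow this down is to test the mount from a process that has already dropped to uid/gid 33, rather than from a root shell that su's afterwards. The following is only a diagnostic sketch under that assumption, with the path and ids taken from the question:

        import os

        # Path taken from the question's nginx root plus index.php.
        SCRIPT = "/mnt/webfiles/nTv5-2.0/app/webroot/index.php"

        # Drop to the www-data group and user (gid/uid 33) the way the php-fpm
        # workers run, then try to stat and read the script a FastCGI request
        # would execute. Run this as root so the setgid/setuid calls succeed.
        os.setgid(33)
        os.setuid(33)

        try:
            info = os.stat(SCRIPT)
            open(SCRIPT).read(1)
            print("uid/gid 33 can see the script (size %d)" % info.st_size)
        except (OSError, IOError) as err:
            print("uid/gid 33 cannot see the script: %s" % err)

    If the check passes, the problem is more likely the SCRIPT_FILENAME nginx passes or the order in which the CIFS share and php-fpm come up at boot; if it fails, it points at the mount itself.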

    Read the article

  • Maven Error - Expected START_TAG or END_TAG not TEXT

    - by onepotato
    I am setting up a spring mvc web application + hibernate jpa + maven from scratch using Eclipse Indigo. I am stuck in this error when doing a Maven build. [ERROR] BUILD ERROR [INFO] ------------------------------------------------------------------------ [INFO] Error installing artifact's metadata: Error installing metadata: Error updating group repository metadata expected START_TAG or END_TAG not TEXT (position: TEXT seen ...<extension>war</... @13:25) [INFO] ------------------------------------------------------------------------ I tried googling but can't find a solution that works for me. I even search the whole project for the text <extension>war</ and mysteriously, there is no text like this in my project. However, in the tomcat web.xml there are a lot of <extension> tag, but I doubt that it has something to do in this error because I never touched that web.xml Here is my pom.xml <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>com.mycompany.applicationname</groupId> <artifactId>Application MVC</artifactId> <packaging>war</packaging> <version>0.0.1-SNAPSHOT</version> <name>Maven Application Webapp</name> <url>http://maven.apache.org</url> <properties> <spring.version>3.0.3.RELEASE</spring.version> </properties> <dependencies> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-core</artifactId> <version>${spring.version}</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-web</artifactId> <version>${spring.version}</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-webmvc</artifactId> <version>${spring.version}</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-webmvc</artifactId> <version>${spring.version}</version> </dependency> <dependency> <groupId>org.hibernate.javax.persistence</groupId> <artifactId>hibernate-jpa-2.0-api</artifactId> <version>1.0.0.Final</version> </dependency> </dependencies> <build> <finalName>ApplicationName</finalName> </build> </project> As Funtik has suggested, I did a build with -X. Here is the stacktrace. [INFO] ------------------------------------------------------------------------ [ERROR] BUILD ERROR [INFO] ------------------------------------------------------------------------ [INFO] Error installing artifact's metadata: Error installing metadata: Error updating group repository metadata expected START_TAG or END_TAG not TEXT (position: TEXT seen ...<extension>war</... 
@13:25) [INFO] ------------------------------------------------------------------------ [DEBUG] Trace org.apache.maven.lifecycle.LifecycleExecutionException: Error installing artifact's metadata: Error installing metadata: Error updating group repository metadata at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:583) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:499) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:478) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:330) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:291) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:142) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:336) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:129) at org.apache.maven.cli.MavenCli.main(MavenCli.java:287) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:592) at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315) at org.codehaus.classworlds.Launcher.launch(Launcher.java:255) at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430) at org.codehaus.classworlds.Launcher.main(Launcher.java:375) Caused by: org.apache.maven.plugin.MojoExecutionException: Error installing artifact's metadata: Error installing metadata: Error updating group repository metadata at org.apache.maven.plugin.install.InstallMojo.execute(InstallMojo.java:143) at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:451) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:558) ... 16 more Caused by: org.apache.maven.artifact.installer.ArtifactInstallationException: Error installing artifact's metadata: Error installing metadata: Error updating group repository metadata at org.apache.maven.artifact.installer.DefaultArtifactInstaller.install(DefaultArtifactInstaller.java:91) at org.apache.maven.plugin.install.InstallMojo.execute(InstallMojo.java:105) ... 18 more Caused by: org.apache.maven.artifact.repository.metadata.RepositoryMetadataInstallationException: Error installing metadata: Error updating group repository metadata at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.install(DefaultRepositoryMetadataManager.java:463) at org.apache.maven.artifact.installer.DefaultArtifactInstaller.install(DefaultArtifactInstaller.java:79) ... 19 more Caused by: org.apache.maven.artifact.repository.metadata.RepositoryMetadataStoreException: Error updating group repository metadata at org.apache.maven.artifact.repository.metadata.AbstractRepositoryMetadata.storeInLocalRepository(AbstractRepositoryMetadata.java:76) at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.install(DefaultRepositoryMetadataManager.java:459) ... 20 more Caused by: org.codehaus.plexus.util.xml.pull.XmlPullParserException: expected START_TAG or END_TAG not TEXT (position: TEXT seen ...<extension>war</... 
@13:25) at org.codehaus.plexus.util.xml.pull.MXParser.nextTag(MXParser.java:1083) at org.apache.maven.artifact.repository.metadata.io.xpp3.MetadataXpp3Reader.parseVersioning(MetadataXpp3Reader.java:513) at org.apache.maven.artifact.repository.metadata.io.xpp3.MetadataXpp3Reader.parseMetadata(MetadataXpp3Reader.java:352) at org.apache.maven.artifact.repository.metadata.io.xpp3.MetadataXpp3Reader.read(MetadataXpp3Reader.java:866) at org.apache.maven.artifact.repository.metadata.AbstractRepositoryMetadata.updateRepositoryMetadata(AbstractRepositoryMetadata.java:98) at org.apache.maven.artifact.repository.metadata.AbstractRepositoryMetadata.storeInLocalRepository(AbstractRepositoryMetadata.java:68) ... 21 more [INFO] ------------------------------------------------------------------------ [INFO] Total time: 2 seconds [INFO] Finished at: Thu Jun 27 17:36:23 SGT 2013 [INFO] Final Memory: 9M/16M [INFO] ------------------------------------------------------------------------ web.xml <?xml version="1.0" encoding="UTF-8"?> <web-app id="WebApp_ID" version="2.4" xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-app_2_4.xsd"> <display-name>Adjustment Tool</display-name> <servlet> <servlet-name>mvc-dispatcher</servlet-name> <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class> <init-param> <param-name>contextConfigLocation</param-name> <param-value>/WEB-INF/spring-mvc.xml</param-value> </init-param> <load-on-startup>1</load-on-startup> </servlet> <servlet-mapping> <servlet-name>mvc-dispatcher</servlet-name> <url-pattern>/</url-pattern> </servlet-mapping> <listener> <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class> </listener> </web-app> Any ideas?
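    The parser is not failing on the project's own files: the stack trace runs through the install plugin's RepositoryMetadataManager while it updates group repository metadata, i.e. it is re-reading an existing maven-metadata-local.xml cached in the local repository (~/.m2/repository), and that cached file is where the stray '<extension>war</...' text lives. A common remedy, offered here as an assumption rather than a verified fix, is to delete the corrupted metadata for that groupId and rebuild. A small Python sketch that removes those cached files:

        import os

        # Local-repository directory for the question's groupId
        # (com.mycompany.applicationname); adjust if the real groupId differs.
        repo = os.path.expanduser("~/.m2/repository/com/mycompany/applicationname")

        # Remove every cached maven-metadata*.xml under that group so the next
        # "mvn install" regenerates them instead of re-parsing the corrupt one.
        for dirpath, _dirnames, filenames in os.walk(repo):
            for name in filenames:
                if name.startswith("maven-metadata") and name.endswith(".xml"):
                    path = os.path.join(dirpath, name)
                    print("removing " + path)
                    os.remove(path)

    It may also be worth double-checking the artifactId, since "Application MVC" contains a space that Maven tooling does not normally expect, although the trace above points at the cached metadata rather than the POM.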

    Read the article

  • ClickOnce manifest problem

    - by TWith2Sugars
    We are currently deploying a WPF 4 app via click once and there is a scenario when the installation fails. If the user does not have .Net 4.0 Full install and attempts to install our app the framework installs fine but the app fails to install. If we re-run the installation again the app installs fine. Here is a copy of the log: PLATFORM VERSION INFO Windows : 6.1.7600.0 (Win32NT) Common Language Runtime : 2.0.50727.4927 System.Deployment.dll : 2.0.50727.4927 (NetFXspW7.050727-4900) mscorwks.dll : 2.0.50727.4927 (NetFXspW7.050727-4900) dfdll.dll : 2.0.50727.4927 (NetFXspW7.050727-4900) dfshim.dll : 4.0.31106.0 (Main.031106-0000) SOURCES Deployment url : [URL REMOVED] Server : Apache/2.0.54 Application url : [URL REMOVED] Server : Apache/2.0.54 IDENTITIES Deployment Identity : Graphicly.App.application, Version=0.3.2.0, Culture=neutral, PublicKeyToken=c982228345371fbc, processorArchitecture=msil Application Identity : Graphicly.App.exe, Version=0.3.2.0, Culture=neutral, PublicKeyToken=c982228345371fbc, processorArchitecture=msil, type=win32 APPLICATION SUMMARY * Installable application. ERROR SUMMARY Below is a summary of the errors, details of these errors are listed later in the log. * Dependency Graphicly.WCFClient.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.WCFClient.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Microsoft.Surface.Presentation.Design.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Microsoft.Surface.Presentation.Design.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency GalaSoft.MvvmLight.WPF4.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file GalaSoft.MvvmLight.WPF4.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.Infrastructure.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.Infrastructure.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.AutoUpdater.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.AutoUpdater.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency System.Windows.Interactivity.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file System.Windows.Interactivity.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Microsoft.Surface.Presentation.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Microsoft.Surface.Presentation.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.Fonts.dll cannot be processed for patching. 
Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.Fonts.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.Reader.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.Reader.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Microsoft.Surface.Presentation.Generic.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Microsoft.Surface.Presentation.Generic.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.Controls.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.Controls.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.SocialNetwork.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.SocialNetwork.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.Archive.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.Archive.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency Graphicly.App.exe cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file Graphicly.App.exe: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Dependency GalaSoft.MvvmLight.Extras.WPF4.dll cannot be processed for patching. Following failure messages were detected: + Exception occurred loading manifest from file GalaSoft.MvvmLight.Extras.WPF4.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. * Activation of [URL REMOVED] resulted in exception. Following failure messages were detected: + Exception occurred loading manifest from file GalaSoft.MvvmLight.Extras.WPF4.dll: the manifest may not be valid or the file could not be opened. + Cannot load internal manifest from component file. COMPONENT STORE TRANSACTION FAILURE SUMMARY No transaction error was detected. WARNINGS * The file named Microsoft.Windows.Design.Extensibility.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Ionic.Zip.Reduced.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Newtonsoft.Json.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Microsoft.WindowsAzure.StorageClient.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Dimebrain.TweetSharp.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Microsoft.Windows.Design.Interaction.dll does not have a hash specified in the manifest. Hash validation will be ignored. 
* The file named HtmlAgilityPack.dll does not have a hash specified in the manifest. Hash validation will be ignored. * The file named Facebook.dll does not have a hash specified in the manifest. Hash validation will be ignored. OPERATION PROGRESS STATUS * [20/05/2010 09:17:33] : Activation of [URL REMOVED] has started. * [20/05/2010 09:17:38] : Processing of deployment manifest has successfully completed. * [20/05/2010 09:17:38] : Installation of the application has started. * [20/05/2010 09:17:39] : Processing of application manifest has successfully completed. * [20/05/2010 09:17:40] : Request of trust and detection of platform is complete. ERROR DETAILS Following errors were detected during this operation. * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.WCFClient.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Microsoft.Surface.Presentation.Design.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file GalaSoft.MvvmLight.WPF4.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.Infrastructure.dll: the manifest may not be valid or the file could not be opened. 
- Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.AutoUpdater.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file System.Windows.Interactivity.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Microsoft.Surface.Presentation.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.Fonts.dll: the manifest may not be valid or the file could not be opened. 
- Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.Reader.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:40] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Microsoft.Surface.Presentation.Generic.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.Controls.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.SocialNetwork.dll: the manifest may not be valid or the file could not be opened. 
- Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.Archive.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file Graphicly.App.exe: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file GalaSoft.MvvmLight.Extras.WPF4.dll: the manifest may not be valid or the file could not be opened. - Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.FileDownloader.AddFilesInHashtable(Hashtable hashtable, AssemblyManifest applicationManifest, String applicationFolder) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: * [20/05/2010 09:17:41] System.Deployment.Application.InvalidDeploymentException (ManifestLoad) - Exception occurred loading manifest from file GalaSoft.MvvmLight.Extras.WPF4.dll: the manifest may not be valid or the file could not be opened. 
- Source: System.Deployment - Stack trace: at System.Deployment.Application.Manifest.AssemblyManifest.ManifestLoadExceptionHelper(Exception exception, String filePath) at System.Deployment.Application.Manifest.AssemblyManifest.LoadFromInternalManifestFile(String filePath) at System.Deployment.Application.DownloadManager.ProcessDownloadedFile(Object sender, DownloadEventArgs e) at System.Deployment.Application.FileDownloader.DownloadModifiedEventHandler.Invoke(Object sender, DownloadEventArgs e) at System.Deployment.Application.FileDownloader.PatchSingleFile(DownloadQueueItem item, Hashtable dependencyTable) at System.Deployment.Application.FileDownloader.PatchFiles(SubscriptionState subState) at System.Deployment.Application.FileDownloader.Download(SubscriptionState subState) at System.Deployment.Application.DownloadManager.DownloadDependencies(SubscriptionState subState, AssemblyManifest deployManifest, AssemblyManifest appManifest, Uri sourceUriBase, String targetDirectory, String group, IDownloadNotification notification, DownloadOptions options) at System.Deployment.Application.ApplicationActivator.DownloadApplication(SubscriptionState subState, ActivationDescription actDesc, Int64 transactionId, TempDirectory& downloadTemp) at System.Deployment.Application.ApplicationActivator.InstallApplication(SubscriptionState& subState, ActivationDescription actDesc) at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl) at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state) --- Inner Exception --- System.Deployment.Application.DeploymentException (InvalidManifest) - Cannot load internal manifest from component file. - Source: - Stack trace: COMPONENT STORE TRANSACTION DETAILS No transaction information is available. I'm baffled. Any ideas what this could be? Cheers Tony

    Read the article

  • Building applications with WPF, MVVM and Prism (aka CAG)

    - by skjagini
    In this article I am going to walk through an application built with WPF and Prism (aka composite application guidance, CAG) that simulates engaging a taxi (cab). The rules are simple; the app has 3 screens: a login screen to authenticate the user, an information screen, and a screen to engage the cab, roam around, and calculate the total fare.
    Metered Rate of Fare: The meter is required to be engaged when a cab is occupied by anyone. $3.00 upon entry. $0.35 for each additional unit. The unit fare is: one-fifth of a mile, when the cab is traveling at 6 miles an hour or more; or 60 seconds when not in motion or traveling at less than 12 miles per hour. Night surcharge of $.50 after 8:00 PM & before 6:00 AM. Peak hour weekday surcharge of $1.00 Monday - Friday after 4:00 PM & before 8:00 PM. New York State tax surcharge of $.50 per ride.
    Example: Friday (2010-10-08), 5:30 pm, start at Lexington Ave & E 57th St, end at Irving Pl & E 15th St. Start = $3.00. Travels 2 miles at less than 6 mph for 15 minutes = $3.50. Travels at more than 12 mph for 5 minutes = $1.75. Peak hour weekday surcharge = $1.00 (ride started at 5:30 pm). New York State tax surcharge = $0.50. (A short code sketch of this calculation appears at the end of this introduction.)
    Before we dive into the app, I would like to give a brief description of the framework. If you want to jump straight to the source code, scroll all the way to the end of the post.
    MVVM The MVVM pattern is in no way tied to the use of Prism in your application; it should be considered whenever you are using WPF, with or without Prism. If you are not familiar with MVVM, your typical UI work involves adding some controls such as text boxes and a button, double-clicking the button to generate an event handler, calling a method from the business layer, and updating the user interface; that works most of the time for small-scale applications. The problem with this approach is that business logic ends up wrapped inside UI-specific code, which is hard to unit test and mock, and MVVM solves exactly that problem. MVVM stands for Model (M) - View (V) - ViewModel (VM); based on how the three parties interact it could just as well have been called VVMM (View - ViewModel - Model), but MVVM echoes MVC (Model-View-Controller), hence the name. WPF allows you to create user interfaces using XAML, and MVVM takes it to the next level by allowing complete separation of user interface and business logic. In WPF each view has a DataContext property which, when set to an instance of a class (which happens to be your view model), provides the data the view is interested in; i.e., the view interacts with the view model through DataContext. Sujith, if the view and view model interact directly with each other, how is MVVM helping me with separation of concerns? Well, the catch is that DataContext is of type Object; because the view only sees an object, it does not know the exact type of the view model, which keeps views and view models loosely coupled. View models aggregate data from models (data access layer, services, etc.) and make it available to views through properties, methods, etc.; i.e., view models interact with models.
    PRISM Prism is provided by the Microsoft Patterns and Practices team; the source code can be downloaded from CodePlex, and samples and documentation are on MSDN. The name composite implies composing a user interface from different modules (views) without direct dependencies on each other, again allowing loosely coupled development.
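As promised above, here is a short, self-contained sketch of the worked fare example. It is illustrative only: the class, constant, and variable names are made up for this post and are not taken from the TaxiCommon code in the attached solution.

    using System;

    // Illustrative sketch of the worked example above (not the application's fare classes).
    class FareExampleSketch
    {
        const decimal EntryFee = 3.00m;              // $3.00 upon entry
        const decimal UnitFare = 0.35m;              // $0.35 per unit (1/5 mile or 60 seconds)
        const decimal PeakWeekdaySurcharge = 1.00m;  // weekday 4:00 PM - 8:00 PM
        const decimal StateTaxSurcharge = 0.50m;     // New York State tax surcharge

        static void Main()
        {
            decimal distanceUnits = 2.0m / 0.2m;                 // 2 miles in 1/5-mile units = 10
            decimal distanceFare = distanceUnits * UnitFare;     // $3.50
            decimal timeFare = 5 * UnitFare;                     // 5 one-minute units = $1.75

            decimal total = EntryFee + distanceFare + timeFare
                            + PeakWeekdaySurcharge               // ride started Friday at 5:30 PM
                            + StateTaxSurcharge;

            Console.WriteLine("Total fare: $" + total);          // Total fare: $9.75
        }
    }

With the fare rules out of the way, back to Prism and how it composes the application from modules.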
Well Sujith, I can already do that with user controls, so why should I learn another framework? That's correct, you can decouple using user controls, but you still have to manage some amount of coupling yourself: how do you communicate between the controls, how do you subscribe/unsubscribe, how do you load/unload views dynamically? Prism is not a replacement for user controls; it provides the following features, which greatly help in designing composite applications: Dependency Injection (DI) / Inversion of Control (IoC), Modules, Regions, Event Aggregator, and Commands. Simply put, MVVM helps you build a single view, and Prism helps you build an application out of those views. There are other open source alternatives to Prism, like MVVM Light and Cinch; take a look at them as well. Let's dig into the source code.
1. Solution The solution is made of the following projects. Framework: holds the common functionality for building applications using WPF and Prism. TaxiClient: start-up project, bootstrapping and app styling. TaxiCommon: helps with the business logic. TaxiModules: holds the meat of the application with views and view models. TaxiTests: to test the application.
2. DI / IoC Dependency Injection (DI), as the name implies, refers to injecting dependencies, and Inversion of Control (IoC) means the calling code has no direct control over the dependencies, the opposite of the normal way of programming where dependencies are passed in by the caller, i.e., inversion; aside from some differences in terminology the concept is the same in both cases. The idea behind the DI/IoC pattern is to reduce the amount of direct coupling between the different components of the application; the more direct the dependencies, the more tightly coupled the application, resulting in code that is hard to modify, unit test, and mock.
Initializing Dependency Injection through BootStrapper TaxiClient is the start-up project of the solution, and App (App.xaml) is the class that gets called when you run the application. From the App's OnStartup method we invoke the BootStrapper.   namespace TaxiClient { /// <summary> /// Interaction logic for App.xaml /// </summary> public partial class App : Application { protected override void OnStartup(StartupEventArgs e) { base.OnStartup(e);   (new BootStrapper()).Run(); } } } BootStrapper is your contact point for initializing the application, including dependency injection, creating the Shell, and initializing other frameworks. We are going to use Unity for DI; there are lots of open source DI frameworks like Spring.Net, StructureMap, etc. with different feature sets, and you can choose one based on your preferences. Note that Prism comes with built-in support for Unity (we derive from UnityBootstrapper in our case); for any other DI framework you have to extend Prism appropriately.   namespace TaxiClient { public class BootStrapper: UnityBootstrapper { protected override IModuleCatalog CreateModuleCatalog() { return new ConfigurationModuleCatalog(); } protected override DependencyObject CreateShell() { Framework.FrameworkBootStrapper.Run(Container, Application.Current.Dispatcher);   Shell shell = new Shell(); shell.ResizeMode = ResizeMode.NoResize; shell.Show();   return shell; } } } Let's take a look at FrameworkBootStrapper to see how to register with the Unity container.
namespace Framework { public class FrameworkBootStrapper { public static void Run(IUnityContainer container, Dispatcher dispatcher) { UIDispatcher uiDispatcher = new UIDispatcher(dispatcher); container.RegisterInstance<IDispatcherService>(uiDispatcher);   container.RegisterType<IInjectSingleViewService, InjectSingleViewService>( new ContainerControlledLifetimeManager());   . . . } } } In the above code we register two components with the Unity container, and you will notice that we follow two different approaches, RegisterInstance and RegisterType. With RegisterInstance we register an existing instance, and that same instance will be returned for every request made for IDispatcherService. With RegisterType we ask the Unity container to create an instance for us when required, i.e., when I request an instance of IInjectSingleViewService, Unity will create and return an instance of the InjectSingleViewService class; RegisterType also lets us configure the lifetime of the instance being created. With ContainerControlledLifetimeManager, the Unity container caches the instance and reuses it for any subsequent requests, without creating a new one. Let's take a look at FareViewModel.cs and its constructor. The constructor takes one parameter, IEventAggregator, and if you search your solution for all references to IEventAggregator, you will not find a single location where an instance of EventAggregator is passed directly to the constructor. The code still receives an instance at runtime because Prism, when used with the Unity container, is configured to return an instance of EventAggregator whenever IEventAggregator is requested; in this particular case it is called constructor injection. public class FareViewModel:ObservableBase, IDataErrorInfo { ... private IEventAggregator _eventAggregator;   public FareViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator; InitializePropertyNames(); InitializeModel(); PropertyChanged += OnPropertyChanged; } ... 3. Shell Shells are very similar in operation to Master Pages in ASP.NET or MDI in Windows Forms. Shells contain regions which display the views; you can have as many regions as you wish in a given view, and you can also nest regions, i.e., one region can load a view which in itself contains other regions. We have to create a shell at the start of the application, and we do it by overriding the CreateShell method of the BootStrapper. In the following Shell.xaml you will notice that we have two content controls with region names 'MenuRegion' and 'MainRegion'. The idea here is that you can inject any user control into the regions dynamically, e.g., a menu user control for MenuRegion, and based on the user action you can load the appropriate view into MainRegion.
<Window x:Class="TaxiClient.Shell" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:Regions="clr-namespace:Microsoft.Practices.Prism.Regions;assembly=Microsoft.Practices.Prism" Title="Taxi" Height="370" Width="800"> <Grid Margin="2"> <ContentControl Regions:RegionManager.RegionName="MenuRegion" HorizontalAlignment="Stretch" VerticalAlignment="Stretch" HorizontalContentAlignment="Stretch" VerticalContentAlignment="Stretch" />   <ContentControl Grid.Row="1" Regions:RegionManager.RegionName="MainRegion" HorizontalAlignment="Stretch" VerticalAlignment="Stretch" HorizontalContentAlignment="Stretch" VerticalContentAlignment="Stretch" /> <!--<Border Grid.ColumnSpan="2" BorderThickness="2" CornerRadius="3" BorderBrush="LightBlue" />-->   </Grid> </Window> 4. Modules Prism provides the ability to build composite applications and modules play an important role in it. For example if you are building a Mortgage Loan Processor application with 3 components, i.e. customer’s credit history,  existing mortgages, new home/loan information; and consider that the customer’s credit history component involves gathering data about his/her address, background information, job details etc. The idea here using Prism modules is to separate the implementation of these 3 components into their own visual studio projects allowing to build components with no dependency on each other and independently. If we need to add another component to the application, the component can be developed by in house team or some other team in the organization by starting with a new Visual Studio project and adding to the solution at the run time with very little knowledge about the application. Prism modules are defined by implementing the IModule interface and each visual studio project to be considered as a module should implement the IModule interface.  From the BootStrapper.cs you shall observe that we are overriding the method by returning a ConfiguratingModuleCatalog which returns the modules that are registered for the application using the app.config file  and you can also add module using code. Lets take a look into configuration file.   <?xml version="1.0"?> <configuration> <configSections> <section name="modules" type="Microsoft.Practices.Prism.Modularity.ModulesConfigurationSection, Microsoft.Practices.Prism"/> </configSections> <modules> <module assemblyFile="TaxiModules.dll" moduleType="TaxiModules.ModuleInitializer, TaxiModules" moduleName="TaxiModules"/> </modules> </configuration> Here we are adding TaxiModules project to our solution and TaxiModules.ModuleInitializer implements IModule interface   5. Module Mapper With Prism modules you can dynamically add or remove modules from the regions, apart from that Prism also provides API to control adding/removing the views from a region within the same module. Taxi Information Screen: Engage the Taxi Screen: The sample application has two screens, ‘Taxi Information’ and ‘Engage the Taxi’ and they both reside in same module, TaxiModules. ‘Engage the Taxi’ is again made of two user controls, FareView on the left and TotalView on the right. We have created a Shell with two regions, MenuRegion and MainRegion with menu loaded into MenuRegion. We can create a wrapper user control called EngageTheTaxi made of FareView and TotalView and load either TaxiInfo or EngageTheTaxi into MainRegion based on the user action. 
Though it would work, it tightly binds the user controls, and for every combination of user controls we would need to create a dummy wrapper control to contain them. Instead we can apply the principles we learned so far from the Shell and its regions and introduce another template (LeftAndRightRegionView.xaml) made of two regions, Region1 (left) and Region2 (right), and load FareView and TotalView dynamically. To help with loading the views dynamically I have introduced a helper interface, IInjectSingleViewService, an idea suggested by Mike Taulty, whose blog is a must-read for .NET developers. using System; using System.Collections.Generic; using System.ComponentModel;   namespace Framework.PresentationUtility.Navigation {   public interface IInjectSingleViewService : INotifyPropertyChanged { IEnumerable<CommandViewDefinition> Commands { get; } IEnumerable<ModuleViewDefinition> Modules { get; }   void RegisterViewForRegion(string commandName, string viewName, string regionName, Type viewType); void ClearViewFromRegion(string viewName, string regionName); void RegisterModule(string moduleName, IList<ModuleMapper> moduleMappers); } } The interface declares three methods to work with views. RegisterViewForRegion: registers a view with a particular region. You can register multiple views and their regions under one command; when that command is invoked, all the views registered under it are loaded into their regions. ClearViewFromRegion: unloads a specific view from a region. RegisterModule: the idea is that when a command is invoked you can load the UI with a set of controls in their default positions and then, based on user interaction, load different controls into different regions on the fly. This is supported by ModuleViewDefinition and ModuleMapper, shown below. namespace Framework.PresentationUtility.Navigation { public class ModuleViewDefinition { public string ModuleName { get; set; } public IList<ModuleMapper> ModuleMappers; public ICommand Command { get; set; } }   public class ModuleMapper { public string ViewName { get; set; } public string RegionName { get; set; } public Type ViewType { get; set; } } } 6. Event Aggregator The Prism event aggregator enables messaging between components as in the Observer pattern: the notifier notifies the observer, which receives the notifications it is interested in. In the classic Observer pattern the observer has to unsubscribe from notifications when it is no longer interested, which allows the notifier to remove the observer's reference from its local cache. Though .NET has managed garbage collection, it cannot collect instances that are still referenced by an active instance, so a forgotten subscription becomes a memory leak: the observers stay in memory as long as the notifier stays in memory. Developers have to be very careful to unsubscribe when necessary, and it often gets overlooked. To overcome these problems the Prism event aggregator caches the reference (the observer in this case) as a weak reference and releases it once the instance goes out of scope. Using the event aggregator is very simple: declare a typed event by inheriting from the generic CompositePresentationEvent. using Microsoft.Practices.Prism.Events; using TaxiCommon.BAO;   namespace TaxiCommon.CompositeEvents { public class TaxiOnMoveEvent:CompositePresentationEvent<TaxiOnMove> { } }   TaxiOnMove.cs includes the properties we want to exchange between the parties, FareView and TotalView.
using System;   namespace TaxiCommon.BAO { public class TaxiOnMove { public TimeSpan MinutesAtTweleveMPH { get; set; } public double MilesAtSixMPH { get; set; } } }   Lets take a look into FareViewodel (Notifier) and how it raises the event.  Here we are raising the event by getting the event through GetEvent<..>() and publishing it with the payload private void OnAddMinutes(object obj) { TaxiOnMove payload = new TaxiOnMove(); if(MilesAtSixMPH != null) payload.MilesAtSixMPH = MilesAtSixMPH.Value; if(MinutesAtTweleveMPH != null) payload.MinutesAtTweleveMPH = new TimeSpan(0,0,MinutesAtTweleveMPH.Value,0);   _eventAggregator.GetEvent<TaxiOnMoveEvent>().Publish(payload); ResetMinutesAndMiles(); } And TotalViewModel(Observer) subscribes to notifications by getting the event through GetEvent<..>() namespace TaxiModules.ViewModels { public class TotalViewModel:ObservableBase { .... private IEventAggregator _eventAggregator;   public TotalViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator; ... }   private void SubscribeToEvents() { _eventAggregator.GetEvent<TaxiStartedEvent>() .Subscribe(OnTaxiStarted, ThreadOption.UIThread,false,(filter) => true); _eventAggregator.GetEvent<TaxiOnMoveEvent>() .Subscribe(OnTaxiMove, ThreadOption.UIThread, false, (filter) => true); _eventAggregator.GetEvent<TaxiResetEvent>() .Subscribe(OnTaxiReset, ThreadOption.UIThread, false, (filter) => true); }   ... private void OnTaxiMove(TaxiOnMove taxiOnMove) { OnMoveFare fare = new OnMoveFare(taxiOnMove); Fares.Add(fare); SetTotalFare(new []{fare}); }   .... 7. MVVM through example In this section we are going to look into MVVM implementation through example.  I have all the modules declared in a single project, TaxiModules, again it is not necessary to have them into one project. Once the user logs into the application, will be greeted with the ‘Engage the Taxi’ screen which is made of two user controls, FareView.xaml and TotalView.Xaml. As you can see from the solution explorer, each of them have their own code behind files and  ViewModel classes, FareViewMode.cs, TotalViewModel.cs Lets take a look in to the FareView and how it interacts with FareViewModel using MVVM implementation. FareView.xaml acts as a view and FareViewMode.cs is it’s view model. The FareView code behind class   namespace TaxiModules.Views { /// <summary> /// Interaction logic for FareView.xaml /// </summary> public partial class FareView : UserControl { public FareView(FareViewModel viewModel) { InitializeComponent(); this.Loaded += (s, e) => { this.DataContext = viewModel; }; } } } The FareView is bound to FareViewModel through the data context  and you shall observe that DataContext is of type Object, i.e. the FareView doesn’t really know the type of ViewModel (FareViewModel). This helps separation of View and ViewModel as View and ViewModel are independent of each other, you can bind FareView to FareViewModel2 as well and the application compiles just fine. Lets take a look into FareView xaml file  <UserControl x:Class="TaxiModules.Views.FareView" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:Toolkit="clr-namespace:Microsoft.Windows.Controls;assembly=WPFToolkit" xmlns:Commands="clr-namespace:Microsoft.Practices.Prism.Commands;assembly=Microsoft.Practices.Prism"> <Grid Margin="10" > ....   
<Border Style="{DynamicResource innerBorder}" Grid.Row="0" Grid.Column="0" Grid.RowSpan="11" Grid.ColumnSpan="2" Panel.ZIndex="1"/>   <Label Grid.Row="0" Content="Engage the Taxi" Style="{DynamicResource innerHeader}"/> <Label Grid.Row="1" Content="Select the State"/> <ComboBox Grid.Row="1" Grid.Column="1" ItemsSource="{Binding States}" Height="auto"> <ComboBox.ItemTemplate> <DataTemplate> <TextBlock Text="{Binding Name}"/> </DataTemplate> </ComboBox.ItemTemplate> <ComboBox.SelectedItem> <Binding Path="SelectedState" Mode="TwoWay"/> </ComboBox.SelectedItem> </ComboBox> <Label Grid.Row="2" Content="Select the Date of Entry"/> <Toolkit:DatePicker Grid.Row="2" Grid.Column="1" SelectedDate="{Binding DateOfEntry, ValidatesOnDataErrors=true}" /> <Label Grid.Row="3" Content="Enter time 24hr format"/> <TextBox Grid.Row="3" Grid.Column="1" Text="{Binding TimeOfEntry, TargetNullValue=''}"/> <Button Grid.Row="4" Grid.Column="1" Content="Start the Meter" Commands:Click.Command="{Binding StartMeterCommand}" />   <Label Grid.Row="5" Content="Run the Taxi" Style="{DynamicResource innerHeader}"/> <Label Grid.Row="6" Content="Number of Miles &lt;@6mph"/> <TextBox Grid.Row="6" Grid.Column="1" Text="{Binding MilesAtSixMPH, TargetNullValue='', ValidatesOnDataErrors=true}"/> <Label Grid.Row="7" Content="Number of Minutes @12mph"/> <TextBox Grid.Row="7" Grid.Column="1" Text="{Binding MinutesAtTweleveMPH, TargetNullValue=''}"/> <Button Grid.Row="8" Grid.Column="1" Content="Add Minutes and Miles " Commands:Click.Command="{Binding AddMinutesCommand}"/> <Label Grid.Row="9" Content="Other Operations" Style="{DynamicResource innerHeader}"/> <Button Grid.Row="10" Grid.Column="1" Content="Reset the Meter" Commands:Click.Command="{Binding ResetCommand}"/>   </Grid> </UserControl> The highlighted code from the above code shows data binding, for example ComboBox which displays list of states has it’s ItemsSource bound to States property, with DataTemplate bound to Name and SelectedItem  to SelectedState. You might be wondering what are all these properties and how it is able to bind to them.  The answer lies in data context, i.e., when you bound a control, WPF looks for data context on the root object (Grid in this case) and if it can’t find data context it will look into root’s root, i.e. FareView UserControl and it is bound to FareViewModel.  Each of those properties have be declared on the ViewModel for the View to bind correctly. To put simply, View is bound to ViewModel through data context of type object and every control that is bound on the View actually binds to the public property on the ViewModel. Lets look into the ViewModel code (the following code is not an exact copy of FareViewMode.cs, pasted relevant code for this section)   namespace TaxiModules.ViewModels { public class FareViewModel:ObservableBase, IDataErrorInfo { public List<USState> States { get { return USStates.StateList; } }   public USState SelectedState { get { return _selectedState; } set { _selectedState = value; RaisePropertyChanged(_selectedStatePropertyName); } }   public DateTime? DateOfEntry { get { return _dateOfEntry; } set { _dateOfEntry = value; RaisePropertyChanged(_dateOfEntryPropertyName); } }   public TimeSpan? TimeOfEntry { get { return _timeOfEntry; } set { _timeOfEntry = value; RaisePropertyChanged(_timeOfEntryPropertyName); } }   public double? MilesAtSixMPH { get { return _milesAtSixMPH; } set { _milesAtSixMPH = value; RaisePropertyChanged(_distanceAtSixMPHPropertyName); } }   public int? 
MinutesAtTweleveMPH { get { return _minutesAtTweleveMPH; } set { _minutesAtTweleveMPH = value; RaisePropertyChanged(_minutesAtTweleveMPHPropertyName); } }   public ICommand StartMeterCommand { get { if(_startMeterCommand == null) { _startMeterCommand = new DelegateCommand<object>(OnStartMeter, CanStartMeter); } return _startMeterCommand; } }   public ICommand AddMinutesCommand { get { if(_addMinutesCommand == null) { _addMinutesCommand = new DelegateCommand<object>(OnAddMinutes, CanAddMinutes); } return _addMinutesCommand; } }   public ICommand ResetCommand { get { if(_resetCommand == null) { _resetCommand = new DelegateCommand<object>(OnResetCommand); } return _resetCommand; } }   } private void OnStartMeter(object obj) { _eventAggregator.GetEvent<TaxiStartedEvent>().Publish( new TaxiStarted() { EngagedOn = DateOfEntry.Value.Date + TimeOfEntry.Value, EngagedState = SelectedState.Value });   _isMeterStarted = true; OnPropertyChanged(this,null); } And views communicate user actions like button clicks, tree view item selections, etc using commands. When user clicks on ‘Start the Meter’ button it invokes the method StartMeterCommand, which calls the method OnStartMeter which publishes the event to TotalViewModel using event aggregator  and TaxiStartedEvent. namespace TaxiModules.ViewModels { public class TotalViewModel:ObservableBase { ... private IEventAggregator _eventAggregator;   public TotalViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator;   InitializePropertyNames(); InitializeModel(); SubscribeToEvents(); }   public decimal? TotalFare { get { return _totalFare; } set { _totalFare = value; RaisePropertyChanged(_totalFarePropertyName); } } .... private void SubscribeToEvents() { _eventAggregator.GetEvent<TaxiStartedEvent>().Subscribe(OnTaxiStarted, ThreadOption.UIThread,false,(filter) => true); _eventAggregator.GetEvent<TaxiOnMoveEvent>().Subscribe(OnTaxiMove, ThreadOption.UIThread, false, (filter) => true); _eventAggregator.GetEvent<TaxiResetEvent>().Subscribe(OnTaxiReset, ThreadOption.UIThread, false, (filter) => true); }   private void OnTaxiStarted(TaxiStarted taxiStarted) { Fares.Add(new EntryFare()); Fares.Add(new StateTaxFare(taxiStarted)); Fares.Add(new NightSurchargeFare(taxiStarted)); Fares.Add(new PeakHourWeekdayFare(taxiStarted));   SetTotalFare(Fares); }   private void SetTotalFare(IEnumerable<IFare> fares) { TotalFare = (_totalFare ?? 0) + TaxiFareHelper.GetTotalFare(fares); } ....   } }   TotalViewModel subscribes to events, TaxiStartedEvent and rest. When TaxiStartedEvent gets invoked it calls the OnTaxiStarted method which sets the total fare which includes entry fee, state tax, nightly surcharge, peak hour weekday fare.   Note that TotalViewModel derives from ObservableBase which implements the method RaisePropertyChanged which we are invoking in Set of TotalFare property, i.e, once we update the TotalFare property it raises an the event that  allows the TotalFare text box to fetch the new value through the data context. ViewModel is communicating with View through data context and it has no knowledge about View, helping in loose coupling of ViewModel and View.   I have attached the source code (.Net 4.0, Prism 4.0, VS 2010) , download and play with it and don’t forget to leave your comments.  
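One last note: the view models above derive from ObservableBase, which is not shown in this excerpt. The following is a minimal sketch of what such a base class could look like; the namespace and exact shape are assumptions for illustration rather than the actual Framework code, but they match how RaisePropertyChanged is used in the samples.

    using System.ComponentModel;

    namespace Framework.PresentationUtility
    {
        // Hypothetical sketch of the ObservableBase referenced by the view models:
        // a plain INotifyPropertyChanged base class so that property setters can call
        // RaisePropertyChanged and let bound controls (for example the TotalFare text box)
        // pick up new values through the data context.
        public abstract class ObservableBase : INotifyPropertyChanged
        {
            public event PropertyChangedEventHandler PropertyChanged;

            protected void RaisePropertyChanged(string propertyName)
            {
                PropertyChangedEventHandler handler = PropertyChanged;
                if (handler != null)
                {
                    handler(this, new PropertyChangedEventArgs(propertyName));
                }
            }
        }
    }

The view models shown earlier (FareViewModel, TotalViewModel) should work against a base class of this shape, since they only rely on RaisePropertyChanged and the public PropertyChanged event.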

    Read the article

  • yui compressor maven plugin doesn't compress the js files

    - by hanumant
    I am using yui compressor to compress the js files in my web app. I have configured the plugin as indicated on yui maven plugin site yui compressor maven plugin. This is the pom plugin conf <plugin> <groupId>net.sf.alchim</groupId> <artifactId>yuicompressor-maven-plugin</artifactId> <version>0.7.1</version> <executions> <execution> <phase>compile</phase> <goals> <goal>jslint</goal> <goal>compress</goal> </goals> </execution> </executions> <configuration> <failOnWarning>true</failOnWarning> <nosuffix>true</nosuffix> <force>true</force> <aggregations> <aggregation> <!-- remove files after aggregation (default: false) --> <removeIncluded>false</removeIncluded> <!-- insert new line after each concatenation (default: false) --> <insertNewLine>false</insertNewLine> <output>${project.basedir}/${webcontent.dir}/js/compressedAll.js</output> <!-- files to include, path relative to output's directory or absolute path--> <!--inputDir>base directory for non absolute includes, default to parent dir of output</inputDir--> <includes> <include>**/autocomplete.js</include> <include>**/calendar.js</include> <include>**/dialogs.js</include> <include>**/download.js</include> <include>**/folding.js</include> <include>**/jquery-1.4.2.min.js</include> <include>**/jquery.bgiframe.min.js</include> <include>**/jquery.loadmask.js</include> <include>**/jquery.printelement-1.1.js</include> <include>**/jquery.tablesorter.mod.js</include> <include>**/jquery.tablesorter.pager.js</include> <include>**/jquery.dialogs.plugin.js</include> <include>**/jquery.ui.autocomplete.js</include> <include>**/jquery.validate.js</include> <include>**/jquery-ui-1.8.custom.min.js</include> <include>**/languageDropdown.js</include> <include>**/messages.js</include> <include>**/print.js</include> <include>**/tables.js</include> <include>**/tabs.js</include> <include>**/uwTooltip.js</include> </includes> <!-- files to exclude, path relative to output's directory--> </aggregation> </aggregations> </configuration> <dependencies> <dependency> <groupId>rhino</groupId> <artifactId>js</artifactId> <scope>compile</scope> <version>1.6R5</version> </dependency> <dependency> <groupId>org.apache.maven</groupId> <artifactId>maven-plugin-api</artifactId> <version>2.0.7</version> <scope>provided</scope> </dependency> <dependency> <groupId>org.apache.maven</groupId> <artifactId>maven-project</artifactId> <version>2.0.7</version> <scope>provided</scope> </dependency><dependency> <groupId>net.sf.retrotranslator</groupId> <artifactId>retrotranslator-runtime</artifactId> <version>1.2.9</version> <scope>runtime</scope> </dependency> </dependencies> </plugin> And here is the log at compress time These will use the artifact files already in the core ClassRealm instead, to allow them to be included in PluginDescriptor.getArtifacts(). 
[DEBUG] Configuring mojo 'net.sf.alchim:yuicompressor-maven-plugin:0.7.1:jslint' [DEBUG] (f) failOnWarning = true [DEBUG] (f) jswarn = true [DEBUG] (f) outputDirectory = C:\test\target\classes [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\src, PatternSet [includes: {}, excludes: {**/*.class, **/*.java, site/*}]}}] [DEBUG] (f) sourceDirectory = C:\test\src\..\js [DEBUG] (f) warSourceDirectory = C:\test\src\main\webapp [DEBUG] (f) webappDirectory = C:\test\target\test2-19-SNAPSHOT [DEBUG] -- end configuration -- [INFO] [yuicompressor:jslint {execution: default}] [INFO] nb warnings: 0, nb errors: 0 [DEBUG] Configuring mojo 'net.sf.alchim:yuicompressor-maven-plugin:0.7.1:compress' -- [DEBUG] (f) removeIncluded = false [DEBUG] (f) insertNewLine = false [DEBUG] (f) output = C:\test\WebContent\js\compressedAll.js [DEBUG] (f) includes = [**/autocomplete.js, **/calendar.js, **/dialogs.js, **/download.js, **/folding.js, **/jquery-1.4.2.min.js, **/jquery.bgifram e.min.js, **/jquery.loadmask.js, **/jquery.printelement-1.1.js, **/jquery.tablesorter.mod.js, **/jquery.tablesorter.pager.js, **/jquery.dialogs.p lugin.js, **/jquery.ui.autocomplete.js, **/jquery.validate.js, **/jquery-ui-1.8.custom.min.js, **/languageDropdown.js, **/messages.js, **/print.js, * */tables.js, **/tabs.js, **/uwTooltip.js] [DEBUG] (f) aggregations = [net.sf.alchim.mojo.yuicompressor.Aggregation@65646564] [DEBUG] (f) disableOptimizations = false [DEBUG] (f) encoding = Cp1252 [DEBUG] (f) failOnWarning = true [DEBUG] (f) force = true [DEBUG] (f) gzip = false [DEBUG] (f) jswarn = true [DEBUG] (f) linebreakpos = 0 [DEBUG] (f) nomunge = false [DEBUG] (f) nosuffix = true [DEBUG] (f) outputDirectory = C:\test\target\classes [DEBUG] (f) preserveAllSemiColons = false [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\src, PatternSet [includes: {}, excludes: {**/*.class, **/*.java, site/*}]}}] [DEBUG] (f) sourceDirectory = C:\test\src\..\js [DEBUG] (f) statistics = true [DEBUG] (f) suffix = -min [DEBUG] (f) warSourceDirectory = C:\test\src\main\webapp [DEBUG] (f) webappDirectory = C:\test\target\test2-19-SNAPSHOT [DEBUG] -- end configuration -- [INFO] [yuicompressor:compress {execution: default}] [INFO] generate aggregation : C:\test\WebContent\js\compressedAll.js [INFO] compressedAll.js (407505b) [INFO] nb warnings: 0, nb errors: 0 [DEBUG] Configuring mojo 'org.apache.maven.plugins:maven-resources-plugin:2.2:testResources' -- [DEBUG] (f) filters = [] [DEBUG] (f) outputDirectory = C:\test\target\test-classes [DEBUG] (f) project = MavenProject: com.test.test1:test2:19-SNAPSHOT @ C:\test\pom.xml [DEBUG] (f) resources = [Resource {targetPath: null, filtering: false, FileSet {directory: C:\test\test , PatternSet [includes: {}, excludes: {**/*.class, **/*.java}]}}] [DEBUG] -- end configuration -- The problem is the files are getting aggregated into one file but without compressing. The link above uses version 1.1 and the plugin version I am using is 0.7.1. Is this going to make any diff. Can someone tell what is wrong here. PS: I have obfuscated some text in log to follow the compliance in my firm. So you may find it mismatching somewhere.

    Read the article

  • How and where to implement basic authentication in Kibana 3

    - by Jabb
    I have put my elasticsearch server behind a Apache reverse proxy that provides basic authentication. Authenticating to Apache directly from the browser works fine. However, when I use Kibana 3 to access the server, I receive authentication errors. Obviously because no auth headers are sent along with Kibana's Ajax calls. I added the below to elastic-angular-client.js in the Kibana vendor directory to implement authentication quick and dirty. But for some reason it does not work. $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); What is the best approach and place to implement basic authentication in Kibana? /*! elastic.js - v1.1.1 - 2013-05-24 * https://github.com/fullscale/elastic.js * Copyright (c) 2013 FullScale Labs, LLC; Licensed MIT */ /*jshint browser:true */ /*global angular:true */ 'use strict'; /* Angular.js service wrapping the elastic.js API. This module can simply be injected into your angular controllers. */ angular.module('elasticjs.service', []) .factory('ejsResource', ['$http', function ($http) { return function (config) { var // use existing ejs object if it exists ejs = window.ejs || {}, /* results are returned as a promise */ promiseThen = function (httpPromise, successcb, errorcb) { return httpPromise.then(function (response) { (successcb || angular.noop)(response.data); return response.data; }, function (response) { (errorcb || angular.noop)(response.data); return response.data; }); }; // check if we have a config object // if not, we have the server url so // we convert it to a config object if (config !== Object(config)) { config = {server: config}; } // set url to empty string if it was not specified if (config.server == null) { config.server = ''; } /* implement the elastic.js client interface for angular */ ejs.client = { server: function (s) { if (s == null) { return config.server; } config.server = s; return this; }, post: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); console.log($http.defaults.headers); path = config.server + path; var reqConfig = {url: path, data: data, method: 'POST'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, get: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; // no body on get request, data will be request params var reqConfig = {url: path, params: data, method: 'GET'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, put: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; var reqConfig = {url: path, data: data, method: 'PUT'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, del: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; var reqConfig = {url: path, data: data, method: 'DELETE'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, head: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; // no body on HEAD request, data will be request params var reqConfig = {url: path, params: data, method: 'HEAD'}; return 
$http(angular.extend(reqConfig, config)) .then(function (response) { (successcb || angular.noop)(response.headers()); return response.headers(); }, function (response) { (errorcb || angular.noop)(undefined); return undefined; }); } }; return ejs; }; }]); UPDATE 1: I implemented Matts suggestion. However, the server returns a weird response. It seems that the authorization header is not working. Could it have to do with the fact, that I am running Kibana on port 81 and elasticsearch on 8181? OPTIONS /solar_vendor/_search HTTP/1.1 Host: 46.252.46.173:8181 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3 Accept-Encoding: gzip, deflate Origin: http://46.252.46.173:81 Access-Control-Request-Method: POST Access-Control-Request-Headers: authorization,content-type Connection: keep-alive Pragma: no-cache Cache-Control: no-cache This is the response HTTP/1.1 401 Authorization Required Date: Fri, 08 Nov 2013 23:47:02 GMT WWW-Authenticate: Basic realm="Username/Password" Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 346 Connection: close Content-Type: text/html; charset=iso-8859-1 UPDATE 2: Updated all instances with the modified headers in these Kibana files root@localhost:/var/www/kibana# grep -r 'ejsResource(' . ./src/app/controllers/dash.js: $scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/querySrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/filterSrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/dashboard.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); And modified my vhost conf for the reverse proxy like this <VirtualHost *:8181> ProxyRequests Off ProxyPass / http://127.0.0.1:9200/ ProxyPassReverse / https://127.0.0.1:9200/ <Location /> Order deny,allow Allow from all AuthType Basic AuthName “Username/Password” AuthUserFile /var/www/cake2.2.4/.htpasswd Require valid-user Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT" Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization" Header always set Access-Control-Allow-Credentials "true" Header always set Cache-Control "max-age=0" Header always set Access-Control-Allow-Origin * </Location> ErrorLog ${APACHE_LOG_DIR}/error.log </VirtualHost> Apache sends back the new response headers but the request header still seems to be wrong somewhere. Authentication just doesn't work. 
Request Headers OPTIONS /solar_vendor/_search HTTP/1.1 Host: 46.252.26.173:8181 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3 Accept-Encoding: gzip, deflate Origin: http://46.252.26.173:81 Access-Control-Request-Method: POST Access-Control-Request-Headers: authorization,content-type Connection: keep-alive Pragma: no-cache Cache-Control: no-cache Response Headers HTTP/1.1 401 Authorization Required Date: Sat, 09 Nov 2013 08:48:48 GMT Access-Control-Allow-Methods: GET, POST, DELETE, OPTIONS, PUT Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization Access-Control-Allow-Credentials: true Cache-Control: max-age=0 Access-Control-Allow-Origin: * WWW-Authenticate: Basic realm="Username/Password" Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 346 Connection: close Content-Type: text/html; charset=iso-8859-1 SOLUTION: After doing some more research, I found out that this is definitely a configuration issue with regard to CORS. There are quite a few posts available regarding that topic but it appears that in order to solve my problem, it would be necessary to to make some very granular configurations on apache and also make sure that the right stuff is sent from the browser. So I reconsidered the strategy and found a much simpler solution. Just modify the vhost reverse proxy config to move the elastisearch server AND kibana on the same http port. This also adds even better security to Kibana. This is what I did: <VirtualHost *:8181> ProxyRequests Off ProxyPass /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/ ProxyPassReverse /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/ ProxyPass / http://127.0.0.1:9200/ ProxyPassReverse / https://127.0.0.1:9200/ <Location /> Order deny,allow Allow from all AuthType Basic AuthName “Username/Password” AuthUserFile /var/www/.htpasswd Require valid-user </Location> ErrorLog ${APACHE_LOG_DIR}/error.log </VirtualHost>

    Read the article

  • 26 Days: Countdown to Oracle OpenWorld 2012

    - by Michael Snow
    Welcome to our countdown to Oracle OpenWorld! Oracle OpenWorld 2012 is just around the corner. In less than 26 days, San Francisco will be invaded by an expected 50,000 people from all over the world. Here on the Oracle WebCenter team, we've all been working to help make the experience a great one for all our WebCenter customers. For a sneak peek, we'll be spending this week giving you a teaser of what to look forward to if you are joining us in San Francisco from September 30th through October 4th. We have Oracle WebCenter sessions covering all topics imaginable. Take a look and use the tools we provide to build out your schedule in advance and reserve your seats in your favorite sessions. That gives you plenty of time to plan for your week with us in San Francisco. If, unfortunately, your boss denied your request to attend - there are still some ways that you can join in the experience virtually On-Demand. This year we are expanding even more up north of Market Street and will be taking over Union Square as well. Check out this map of San Francisco to get a sense of how much of a footprint Oracle OpenWorld has grown to this year. With so much to see and so many sessions to learn from, it's no wonder that people get excited. Add to that a good mix of fun and all of the possible WebCenter sessions you could attend, and you won't want to sleep at all to take full advantage of such an opportunity. We'll also have our annual WebCenter Customer Appreciation reception - stay tuned this week for more info on registration to make sure you'll be able to join us. If you've been following the America's Cup at all and believe in EXTREME PERFORMANCE, you'll definitely want to take a look at this video from last year's OpenWorld Keynote. Important OpenWorld Links: Attendee / Presenters Toolkit Oracle Schedule Builder WebCenter Sessions (listed in the catalog under Fusion Middleware as "Portals, Sites, Content, and Collaboration") Oracle Music Festival - AMAZING Line up!! Oracle Customer Appreciation Night - LOOK HERE!! Oracle OpenWorld LIVE On-Demand Here are all the WebCenter sessions broken down by day for your viewing pleasure. Monday, October 1st CON8885 - Simplify CRM Engagement with Contextual Collaboration Are your sales teams disconnected and disengaged? Do you want a tool for easily connecting expertise across your organization and providing visibility into the complete sales process? Do you want a way to enhance and retain organization knowledge? Oracle Social Network is the answer. Attend this session to learn how to make CRM easy, effective, and efficient for use across virtual sales teams.
Also learn how Oracle Social Network can drive sales force collaboration with natural conversations throughout the sales cycle, promote sales team productivity through purposeful social networking without the noise, and build cross-team knowledge by integrating conversations with CRM and other business applications. CON8268 - Oracle WebCenter Strategy: Engaging Your Customers. Empowering Your Business Oracle WebCenter is a user engagement platform for social business, connecting people and information. Attend this session to learn about the Oracle WebCenter strategy, and understand where Oracle is taking the platform to help companies engage customers, empower employees, and enable partners. Business success starts with ensuring that everyone is engaged with the right people and the right information and can access what they need through the channel of their choice—Web, mobile, or social. Are you giving customers, employees, and partners the best-possible experience? Come learn how you can! ¶ HOL10208 - Add Social Capabilities to Your Enterprise Applications Oracle Social Network enables you to add real-time collaboration capabilities into your enterprise applications, so that conversations can happen directly within your business systems. In this hands-on lab, you will try out the Oracle Social Network product to collaborate with other attendees, using real-time conversations with document sharing capabilities. Next you will embed social capabilities into a sample Web-based enterprise application, using embedded UI components. Experts will also write simple REST-based integrations, using the Oracle Social Network API to programmatically create social interactions. ¶ CON8893 - Improve Employee Productivity with Intuitive and Social Work Environments Social technologies have already transformed the ways customers, employees, partners, and suppliers communicate and stay informed. Forward-thinking organizations today need technologies and infrastructures to help them advance to the next level and integrate social activities with business applications to deliver a user experience that simplifies business processes and enterprise application engagement. Attend this session to hear from an innovative Oracle Social Network customer and learn how you can improve productivity with intuitive and social work environments and empower your employees with innovative social tools to enable contextual access to content and dynamic personalization of solutions. ¶ CON8270 - Oracle WebCenter Content Strategy and Vision Oracle WebCenter provides a strategic content infrastructure for managing documents, images, e-mails, and rich media files. With a single repository, organizations can address any content use case, such as accounts payable, HR onboarding, document management, compliance, records management, digital asset management, or Website management. In this session, learn about future plans for how Oracle WebCenter will address new use cases as well as new integrations with Oracle Fusion Middleware and Oracle Applications, leveraging your investments by making your users more productive and error-free. ¶ CON8269 - Oracle WebCenter Sites Strategy and Vision Oracle’s Web experience management solution, Oracle WebCenter Sites, enables organizations to use the online channel to drive customer acquisition and brand loyalty. It helps marketers and business users easily create and manage contextually relevant, social, interactive online experiences across multiple channels on a global scale. 
In this session, learn about future plans for how Oracle WebCenter Sites will provide you with the tools, capabilities, and integrations you need in order to continue to address your customers’ evolving requirements for engaging online experiences and keep moving your business forward. ¶ CON8896 - Living with SharePoint SharePoint is a popular platform, but it’s not always the best fit for Oracle customers. In this session, you’ll discover the technical and nontechnical limitations and pitfalls of SharePoint and learn about Oracle alternatives for collaboration, portals, enterprise and Web content management, social computing, and application integration. The presentation shows you how to integrate with SharePoint when business or IT requirements dictate and covers cloud-based (Office 365) and on-premises versions of SharePoint. Presented by a former Microsoft director of SharePoint product management and backed by independent customer research, this session will prepare you to answer the question “Why don’t we just use SharePoint for that?’ the next time it comes up in your organization. ¶ CON7843 - Content-Enabling Enterprise Processes with Oracle WebCenter Organizations today continually strive to automate business processes, reduce costs, and improve efficiency. Many business processes are content-intensive and unstructured, requiring ad hoc collaboration, and distributed in nature, requiring many approvals and generating huge volumes of paper. In this session, learn how Oracle and SYSTIME have partnered to help a customer content-enable its enterprise with Oracle WebCenter Content and Oracle WebCenter Imaging 11g and integrate them with Oracle Applications. ¶ CON6114 - Tape Robotics’ Newest Superhero: Now Fueled by Oracle Software For small, midsize, and rapidly growing businesses that want the most energy-efficient, scalable storage infrastructure to meet their rapidly growing data demands, Oracle’s most recent addition to its award-winning tape portfolio leverages several pieces of Oracle software. With Oracle Linux, Oracle WebLogic, and Oracle Fusion Middleware tools, the library achieves a higher level of usability than previous products while offering customers a familiar interface for management, plus ease of use. This session examines the competitive advantages of the tape library and how Oracle software raises customer satisfaction. Learn how the combination of Oracle engineered systems, Oracle Secure Backup, and Oracle’s StorageTek tape libraries provide end-to-end coverage of your data. ¶ CON9437 - Mobile Access Management With more than five billion mobile devices on the planet and an increasing number of users using their own devices to access corporate data and applications, securely extending identity management to mobile devices has become a hot topic. This session focuses on how to extend your existing identity management infrastructure and policies to securely and seamlessly enable mobile user access. CON7815 - Customer Experience Online in Cloud: Oracle WebCenter Sites, Oracle ATG Apps, Oracle Exalogic Oracle WebCenter Sites and Oracle’s ATG product line together can provide a compelling marketing and e-commerce experience. When you couple them with the extreme performance of Oracle Exalogic, you’ll see unmatched scalability that provides you with a true cloud-based solution. In this session, you’ll learn how running Oracle WebCenter Sites and ATG applications on Oracle Exalogic delivers both a private and a public cloud experience. 
Find out what it takes to get these systems working together and delivering engaging Web experiences. Even if you aren’t considering Oracle Exalogic today, the rich Web experience of Oracle WebCenter, paired with the depth of the ATG product line, can provide your business full support, from merchandising through sale completion. ¶ CON8271 - Oracle WebCenter Portal Strategy and Vision To innovate and keep a competitive edge, organizations need to leverage the power of agile and responsive Web applications. Oracle WebCenter Portal enables you to do just that, by delivering intuitive user experiences for enterprise applications to drive innovation with composite applications and mashups. Attend this session to learn firsthand from customers how Oracle WebCenter Portal extends the value of existing enterprise applications, business processes, and content; delivers a superior business user experience; and maximizes limited IT resources. ¶ CON8880 - The Connected Customer Experience Begins with the Online Channel There’s a lot of talk these days about how to connect the customer journey across various touchpoints—from Websites and e-commerce to call centers and in-store—to provide experiences that are more relevant and engaging and ultimately gain competitive edge. Doing it all at once isn’t a realistic objective, so where do you start? Come to this session, and hear about three steps you can take that can help you begin your journey toward delivering the connected customer experience. You’ll hear how Oracle now has an integrated digital marketing platform for your corporate Website, your e-commerce site, your self-service portal, and your marketing and loyalty campaigns, and you’ll learn what you can do today to begin executing on your customer experience initiatives. ¶ GEN11451 - General Session: Building Mobile Applications with Oracle Cloud With the prevalence of smart mobile devices, companies are facing an increased demand to provide access to data and applications from new channels. However, developing applications for mobile devices poses some unique challenges. Come to this session to learn how Oracle addresses these challenges, offering a simpler way to develop and deploy cross-device mobile applications. See how Oracle Cloud enables you to access applications, data, and services from mobile channels in an easier way.  CON8272 - Oracle Social Network Strategy and Vision One key way of increasing employee productivity is by bringing people, processes, and information together—providing new social capabilities to enable business users to quickly correspond and collaborate on business activities. Oracle WebCenter provides a user engagement platform with social and collaborative technologies to empower business users to focus on their key business processes, applications, and content in the context of their role and process. Attend this session to hear how the latest social capabilities in Oracle Social Network are enabling organizations to transform themselves into social businesses.  --- Tuesday, October 2nd HOL10194 - Enterprise Content Management Simplified: Oracle WebCenter Content’s Next-Generation UI Regardless of the nature of your business, unstructured content underpins many of its daily functions. Whether you are working with traditional presentations, spreadsheets, or text documents—or even with digital assets such as images and multimedia files—your content needs to be accessible and manageable in convenient and intuitive ways to make working with the content easier. 
Additionally, you need the ability to easily share documents with coworkers to facilitate a collaborative working environment. Come to this session to see how Oracle WebCenter Content’s next-generation user interface helps modern knowledge workers easily manage personal and enterprise documents in a collaborative environment.¶ CON8877 - Develop a Mobile Strategy with Oracle WebCenter: Engage Customers, Employees, and Partners Mobile technology has gone from nice-to-have to a cornerstone of user engagement. Mobile access enables users to have information available at their fingertips, enabling them to take action the moment they make a decision, interact in the moment of convenience, and take advantage of new service offerings in their preferred channels. All your employees have your mobile applications in their pocket; now what are you going to do? It is a critical step for companies to think through what their employees, customers, and partners really need on their devices. Attend this session to see how Oracle WebCenter enables you to better engage your customers, employees, and partners by providing a unified experience across multiple channels. ¶ CON9447 - Enabling Access for Hundreds of Millions of Users How do you grow your business by identifying, authenticating, authorizing, and federating users on the Web, leveraging social identity and the open source OAuth protocol? How do you scale your access management solution to support hundreds of millions of users? With social identity support out of the box, Oracle’s access management solution is also benchmarked for 250-million-user deployment according to real-world customer scenarios. In this session, you will learn about the social identity capability and the 250-million-user benchmark testing of Oracle Access Manager and Oracle Adaptive Access Manager running on Oracle Exalogic and Oracle Exadata. ¶ HOL10207 - Build an Intranet Portal with Oracle WebCenter In this hands-on lab, you’ll work with Oracle WebCenter Portal and Oracle WebCenter Content to build out an enterprise portal that maximizes the productivity of teams and individual contributors. Using browser-based tools, you’ll manage site resources such as page styles, templates, and navigation. You’ll edit content stored in Oracle WebCenter Content directly from your portal. You’ll also experience the latest features that promote collaboration, social networking, and personal productivity. ¶ CON2906 - Get Proactive: Best Practices for Maintaining Oracle Fusion Middleware You chose Oracle Fusion Middleware products to help your organization deliver superior business results. Now learn how to take full advantage of your software with all the great tools, resources, and product updates you’re entitled to through Oracle Support. In this session, Oracle product experts provide proven best practices to help you work more efficiently, plan and prepare for upgrades and patching more effectively, and manage risk. Topics include configuration management tools, remote diagnostics, My Oracle Support Community, and My Oracle Support Lifecycle Advisors. New users and Oracle Fusion Middleware experts alike are guaranteed to leave with fresh ideas and practical, easy-to-implement next steps. ¶ CON8878 - Oracle WebCenter’s Cloud Strategy: From Social and Platform Services to Mashups Cloud computing represents a paradigm shift in how we build applications, automate processes, collaborate, and share and in how we secure our enterprise. 
Additionally, as you adopt cloud-based services in your organization, it’s likely that you will still have many critical on-premises applications running. With these mixed environments, multiple user interfaces, different security, and multiple datasources and content sources, how do you start evolving your strategy to account for these challenges? Oracle WebCenter offers a complete array of technologies enabling you to solve these challenges and prepare you for the cloud. Attend this session to learn how you can use Oracle WebCenter in the cloud as well as create on-premises and cloud application mash-ups. ¶ CON8901 - Optimize Enterprise Business Processes with Oracle WebCenter and Oracle BPM Do you have business processes that span multiple applications? Are you grappling with how to have visibility across these business processes; how to manage content that is associated with these processes; and, most importantly, how to model and optimize these business processes? Attend this session to hear how Oracle WebCenter and Oracle Business Process Management provide a unique set of integrated solutions to provide a composite application dashboard across these business processes and offer a solution for content-centric business processes. ¶ CON8883 - Deliver Engaging Interfaces to Oracle Applications with Oracle WebCenter Critical business processes live within enterprise applications, and application users need to manage and execute these processes as effectively as possible. Oracle provides a comprehensive user engagement platform to increase user productivity and optimize overall processes within Oracle Applications—Oracle E-Business Suite and Oracle’s Siebel, PeopleSoft, and JD Edwards product families—and third-party applications. Attend this session to learn how you can integrate these applications with Oracle WebCenter to deliver composite application dashboards to your end users—whether they are your customers, partners, or employees—for enhanced usability and Web 2.0–enabled enterprise portals.¶ Wednesday, October 3rd CON8895 - Future-Ready Intranets: How Aramark Re-engineered the Application Landscape There are essential techniques and technologies you can use to deliver employee portals that garner higher productivity, improve business efficiency, and increase user engagement. Attend this session to learn how you can leverage Oracle WebCenter Portal as a user engagement platform for bringing together business process management, enterprise content management, and business intelligence into a highly relevant and integrated experience. Hear how Aramark has leveraged Oracle WebCenter Portal and Oracle WebCenter Content to deliver a unified workspace providing simpler navigation and processing, consolidation of tools, easy access to information, integrated search, and single sign-on. ¶ CON8886 - Content Consolidation: Save Money, Increase Efficiency, and Eliminate Silos Organizations are looking for ways to save money and be more efficient. With content in many different places, it’s difficult to know where to look for a document and whether the document is the most current version. With Oracle WebCenter, content can be consolidated into one best-of-breed repository that is secure, scalable, and integrated with your business processes and applications. Users can find the content they need, where they need it, and ensure that it is the right content. 
This session covers content challenges that affect your business; content consolidation that can lead to savings in storage and administration costs and can lower risks; and how companies are realizing savings. ¶ CON8911 - Improve Online Experiences for Customers and Partners with Self-Service Portals Are you able to provide your customers and partners an easy-to-use online self-service experience? Are you processing high-volume transactions and struggling with call center bottlenecks or back-end systems that won’t integrate, causing order delays and customer frustration? Are you looking to target content such as product and service offerings to your end users? This session shares approaches to providing targeted delivery as well as strategies and best practices for transforming your business by providing an intuitive user experience for your customers and partners. ¶ CON6156 - Top 10 Ways to Integrate Oracle WebCenter Content This session covers 10 common ways to integrate Oracle WebCenter Content with other enterprise applications and middleware. It discusses out-of-the-box modules that provide expanded features in Oracle WebCenter Content—such as enterprise search, SOA, and BPEL—as well as developer tools you can use to create custom integrations. The presentation also gives guidance on which integration option may work best in your environment. ¶ HOL10207 - Build an Intranet Portal with Oracle WebCenter In this hands-on lab, you’ll work with Oracle WebCenter Portal and Oracle WebCenter Content to build out an enterprise portal that maximizes the productivity of teams and individual contributors. Using browser-based tools, you’ll manage site resources such as page styles, templates, and navigation. You’ll edit content stored in Oracle WebCenter Content directly from your portal. You’ll also experience the latest features that promote collaboration, social networking, and personal productivity. ¶ CON7817 - Migration to Oracle WebCenter Imaging 11g Customers today continually strive to automate business processes, reduce costs, and improve efficiency. The accounts payable process—which is often distributed in nature, requires many approvals, and generates huge volumes of paper invoices—is automated by many customers. In this session, learn how Oracle and SYSTIME have partnered to help a customer migrate its existing Oracle Imaging and Process Management Release 7.6 to the latest Oracle WebCenter Imaging 11g and integrate it with Oracle’s JD Edwards family of products. ¶ CON8910 - How to Engage Customers Across Web, Mobile, and Social Channels Whether on desktops at the office, on tablets at home, or on mobile phones when on the go, today’s customers are always connected. To engage today’s customers, you need to make the online customer experience connected and consistent across a host of devices and multiple channels, including Web, mobile, and social networks. Managing this multichannel environment can result in lots of headaches without the right tools. Attend this session to learn how Oracle WebCenter Sites solves the challenge of multichannel customer engagement. ¶ HOL10206 - Oracle WebCenter Sites 11g: Transforming the Content Contributor Experience Oracle WebCenter Sites 11g makes it easy for marketers and business users to contribute to and manage Websites with the new visual, contextual, and intuitive Web authoring interface. In this hands-on lab, you will create and manage content for a sports-themed Website, using many of the new and enhanced features of the 11g release. 
¶ CON8900 - Building Next-Generation Portals: An Interactive Customer Panel Discussion Social and collaborative technologies have changed how people interact, learn, and collaborate, and providing a modern, social Web presence is imperative to remain competitive in today’s market. Can your business benefit from a more collaborative and interactive portal environment for employees, customers, and partners? Attend this session to hear from Oracle WebCenter Portal customers as they share their strategies and best practices for providing users with a modern experience that adapts to their needs and includes personalized access to content in context. The panel also addresses how customers have benefited from creating next-generation portals by migrating from older portal technologies to Oracle WebCenter Portal. ¶ CON9625 - Taking Control of Oracle WebCenter Security Organizations are increasingly looking to extend their Oracle WebCenter portal for social business, to serve external users and provide seamless access to the right information. In particular, many organizations are extending Oracle WebCenter in a business-to-business scenario requiring secure identification and authorization of business partners and their users. This session focuses on how customers are leveraging, securing, and providing access control to Oracle WebCenter portal and mobile solutions. You will learn best practices and hear real-world examples of how to provide flexible and granular access control for Oracle WebCenter deployments, using Oracle Platform Security Services and Oracle Access Management Suite product offerings. ¶ CON8891 - Extending Social into Enterprise Applications and Business Processes Oracle Social Network is an extensible social platform that enables contextual collaboration within enterprise applications and business processes, providing relevant data from across various enterprise systems in one place. Attend this session to see how an Oracle Social Network customer is integrating multiple applications—such as CRM, HCM, and business processes—into Oracle Social Network and Oracle WebCenter to enable individuals and teams to solve complex cross-organizational business problems more effectively by utilizing the social enterprise. ¶ Thursday, October 4th CON8899 - Becoming a Social Business: Stories from the Front Lines of Change What does it really mean to be a social business? How can you change your organization to embrace social approaches? What pitfalls do you need to avoid? In this lively panel discussion, customer and industry thought leaders in social business explore these topics and more as they share their stories of the good, the bad, and the ugly that can happen when embracing social methods and technologies to improve business success. Using moderated questions and open Q&A from the audience, the panel discusses vital topics such as the critical factors for success, the major issues to avoid, how to gain senior executive support for social efforts, how to handle undesired behavior, and how to measure business impact. It takes a thought-provoking look at becoming a social business from the inside. ¶ CON6851 - Oracle WebCenter and Oracle Business Intelligence Enterprise Edition to Create Vendor Portals Large manufacturers of grocery items routinely find themselves depending on the inventory management expertise of their wholesalers and distributors. 
Inventory costs can be managed more efficiently by the manufacturers if they have better insight into the inventory levels of items carried by their distributors. This creates a unique opportunity for distributors and wholesalers to leverage this knowledge into a revenue-generating subscription service. Oracle Business Intelligence Enterprise Edition and Oracle WebCenter Portal play a key part in enabling creation of business-managed business intelligence portals for vendors. This session discusses one customer that implemented this by leveraging Oracle WebCenter and Oracle Business Intelligence Enterprise Edition. ¶ CON8879 - Provide a Personalized and Consistent Customer Experience in Your Websites and Portals Your customers engage with your company online in different ways throughout their journey—from prospecting by acquiring information on your corporate Website to transacting through self-service applications on your customer portal—and then the cycle begins again when they look for new products and services. Ensuring that the customer experience is consistent and personalized across online properties—from branding and content to interactions and transactions—can be a daunting task. Oracle WebCenter enables you to speak and interact with your customers with one voice across your Websites and portals by providing an integrated platform for delivery of self-service and engagement that unifies and personalizes the online experience. Learn more in this session. ¶ CON8898 - Land Mines, Potholes, and Dirt Roads: Navigating the Way to ECM Nirvana Ten years ago, people were predicting that by this time in history, we’d be some kind of utopian paperless society. As we all know, we’re not there yet, but are we getting closer? What is keeping companies from driving down the road to enterprise content management bliss? Most people understand that using ECM as a central platform enables organizations to expedite document-centric processes, but most business processes in organizations are still heavily paper-based. Many of these processes could be automated and improved with an ECM platform infrastructure. In this panel discussion, you’ll hear from Oracle WebCenter customers that have already solved some of these challenges as they share their strategies for success and roads to avoid along your journey. ¶ CON8908 - Oracle WebCenter Portal: Creating and Using Content Presenter Templates Oracle WebCenter Portal applications use task flows to display and integrate content stored in the Oracle WebCenter Content server. Among the most flexible task flows is Content Presenter, which renders various types of content on an Oracle WebCenter Portal page. Although Oracle WebCenter Portal comes with a set of predefined Content Presenter templates, developers can create their own templates for specific rendering needs. This session shows the lifecycle of developing Content Presenter task flows, including how to create, package, import, modify at runtime, and use such templates. In addition to simple examples with Oracle Application Development Framework (Oracle ADF) UI elements to render the content, it shows how to use other UI technologies, CSS files, and JavaScript libraries. 
¶ CON8897 - Using Web Experience Management to Drive Online Marketing Success Every year, the online channel becomes more imperative for driving organizational top-line revenue, but for many companies, mastering how to best market their products and services in a fast-evolving online world with high customer expectations for personalized experiences can be a complex proposition. Come to this panel discussion, and hear directly from online marketers how they are succeeding today by using Web experience management to drive marketing success, using capabilities such as targeting and optimization, user-generated content, mobile site publishing, and site visitor personalization to deliver engaging online experiences. ¶ CON8892 - Oracle’s Journey to Social Business Social business is a revolution, one that is causing rapidly accelerating change in how companies and customers engage with one another and how employees work together. Oracle’s goal in becoming a social business is to create a socially connected organization in which working collaboratively across geographical locations, lines of business, and management chains is second nature, enabling innovative solutions to business challenges. We can achieve this by connecting the right people, finding the right content, communicating with the right people, collaborating at the right time, and building the right communities in the right context—all ready in the CLOUD. Attend this session to see how Oracle is transforming itself into a social business. ¶  ------------ If you've read all the way to the end here - we are REALLY looking forward to seeing you in San Francisco.

    Read the article

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx
Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure, or we can store our VHD files in blob storage and mount them as hard drives in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunk read and write, which is the more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock, and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is a very powerful way to upload large files, as we can upload the blocks in parallel and provide a pause-resume feature. There are many documents, articles and blog posts describing how to upload a block blob. Most of them focus on the server side, which means once you have received a big file, stream or binaries, how to upload them into blob storage in blocks through the .NET SDK.  But the problem is, how can we upload these large files from the client side, for example a browser? This question came to me when I was working with a Chinese customer to help them build a network disk product on top of Azure. The end users upload their files from the web portal, and then the files will be stored in blob storage from the Web Role. My goal is to find the best way to transfer the file from the client (the end user’s machine) to the server (the Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed, and save them as blocks into Windows Azure Blob Storage.   Traditional Upload, Works with Limitations The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button. 1: @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" })) 2: { 3: <input type="file" name="file" /> 4: <input type="submit" value="upload" /> 5: } And then in the backend controller, we retrieve the whole content of this file and upload it into blob storage through the .NET SDK. We can split the file into blocks, upload them in parallel and commit. This code has been well blogged in the community. 
1: [HttpPost] 2: public ActionResult About(HttpPostedFileBase file) 3: { 4: var container = _client.GetContainerReference("test"); 5: container.CreateIfNotExists(); 6: var blob = container.GetBlockBlobReference(file.FileName); 7: var blockDataList = new Dictionary<string, byte[]>(); 8: using (var stream = file.InputStream) 9: { 10: var blockSizeInKB = 1024; 11: var offset = 0; 12: var index = 0; 13: while (offset < stream.Length) 14: { 15: var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset); 16: var blockData = new byte[readLength]; 17: offset += stream.Read(blockData, 0, readLength); 18: blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData); 19:  20: index++; 21: } 22: } 23:  24: Parallel.ForEach(blockDataList, (bi) => 25: { 26: blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null); 27: }); 28: blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray()); 29:  30: return RedirectToAction("About"); 31: } This works perfectly if we select an image, a piece of music or a small video to upload. But if I select a large file, let’s say a 6GB HD movie, after uploading for a few minutes the page will be shown as below and the upload will be terminated. In ASP.NET there is a limit on the request length, and the maximum request length is defined in the web.config file (a sketch of the relevant settings appears a little further below); it is a number less than about 4GB. So if we want to upload a really big file, we cannot simply implement it this way. Also, in Windows Azure, the cloud service network load balancer will terminate the connection if it exceeds the timeout period. From my tests the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitations mentioned above, the simple HTML file upload cannot provide a rich upload experience such as chunked upload, pause and pause-resume. So we need to find a better way to upload large files from the client to the server.   Upload in Chunks through HTML5 and JavaScript In order to break the limitations mentioned above we will try to upload the large file in chunks. This gives us some benefits, such as - No request size limitation: since we upload in chunks, we can define the request size for each chunk regardless of how big the entire file is. - No timeout problem: the size of the chunks is controlled by us, which means we should be able to make sure the request for each chunk upload will not exceed the timeout period of either ASP.NET or the Windows Azure load balancer. Uploading a big file in chunks was a big challenge until we got HTML5. There are some new features and improvements introduced in HTML5 and we will use them to implement our solution.   In HTML5, the File interface has been improved with a new method called “slice”. It can be used to read part of the file by specifying the start byte index and the end byte index. For example, if the entire file was 1024 bytes, file.slice(512, 768) will read the part of this file from the 512th byte to the 768th byte and return a new object of an interface called "Blob”, which you can treat as an array of bytes. In fact, a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about Blob please refer here. File and Blob are very useful for implementing the chunk upload. 
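A quick side note on the ASP.NET request-size ceiling mentioned a little earlier: in a classic ASP.NET/IIS deployment it is governed by web.config settings along the lines of the sketch below. The values here are purely illustrative, and, as this post argues, raising them does not remove the Windows Azure load balancer timeout, which is why chunked upload is still the better approach.

<configuration>
  <system.web>
    <!-- maxRequestLength is expressed in kilobytes, executionTimeout in seconds -->
    <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is expressed in bytes (IIS 7 and later) -->
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>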
We will use File interface to represent the file the user selected from the browser and then use File.slice to read the file in chunks in the size we wanted. For example, if we wanted to upload a 10MB file with 512KB chunks, then we can read it in 512KB blobs by using File.slice in a loop.   Assuming we have a web page as below. User can select a file, an input box to specify the block size in KB and a button to start upload. 1: <div> 2: <input type="file" id="upload_files" name="files[]" /><br /> 3: Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br /> 4: <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" /> 5: </div> Then we can have the JavaScript function to upload the file in chunks when user clicked the button. 1: <script type="text/javascript"> 1: 2: $(function () { 3: $("#upload_button_blob").click(function () { 4: }); 5: });</script> Firstly we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke the File, Blob and FormData from the “window” object. If any of them is “undefined” the condition result will be “false” which means your browser doesn’t support these premium feature and it’s time for you to get your browser updated. FormData is another new feature we are going to use in the future. It could generate a temporary form for us. We will use this interface to create a form with chunk and associated metadata when invoked the service through ajax. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: if (window.File && window.Blob && window.FormData) { 4: alert("Your brwoser is awesome, let's rock!"); 5: } 6: else { 7: alert("Oh man plz update to a modern browser before try is cool stuff out."); 8: return; 9: } 10: }); Each browser supports these interfaces by their own implementation and currently the Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we worked on the files the user selected one by one since in HTML5, user can select multiple files in one file input box. 1: var files = $("#upload_files")[0].files; 2: for (var i = 0; i < files.length; i++) { 3: var file = files[i]; 4: var fileSize = file.size; 5: var fileName = file.name; 6: } Next, we calculated the start index and end index for each chunks based on the size the user specified from the browser. We put them into an array with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks since we need to specify the target blob name and the block index. At the same time we will store the list of all indexes into another variant which will be used to commit blocks into blob in Azure Storage once all chunks had been uploaded successfully. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 
4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10:  11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: var blockSizeInKB = $("#block_size").val(); 14: var blockSize = blockSizeInKB * 1024; 15: var blocks = []; 16: var offset = 0; 17: var index = 0; 18: var list = ""; 19: while (offset < fileSize) { 20: var start = offset; 21: var end = Math.min(offset + blockSize, fileSize); 22:  23: blocks.push({ 24: name: fileName, 25: index: index, 26: start: start, 27: end: end 28: }); 29: list += index + ","; 30:  31: offset = end; 32: index++; 33: } 34: } 35: }); Now we have all chunks’ information ready. The next step should be upload them one by one to the server side, and at the server side when received a chunk it will upload as a block into Blob Storage, and finally commit them with the index list through BlockBlobClient.PutBlockList. But since all these invokes are ajax calling, which means not synchronized call. So we need to introduce a new JavaScript library to help us coordinate the asynchronize operation, which named “async.js”. You can download this JavaScript library here, and you can find the document here. I will not explain this library too much in this post. We will put all procedures we want to execute as a function array, and pass into the proper function defined in async.js to let it help us to control the execution sequence, in series or in parallel. Hence we will define an array and put the function for chunk upload into this array. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4:  5: // start to upload each files in chunks 6: var files = $("#upload_files")[0].files; 7: for (var i = 0; i < files.length; i++) { 8: var file = files[i]; 9: var fileSize = file.size; 10: var fileName = file.name; 11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: ... ... 14:  15: // define the function array and push all chunk upload operation into this array 16: blocks.forEach(function (block) { 17: putBlocks.push(function (callback) { 18: }); 19: }); 20: } 21: }); 22: }); As you can see, I used File.slice method to read each chunks based on the start and end byte index we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new feature in HTML5 named FormData. Then post this form to the backend server through jQuery.ajax. This is the key part of our solution. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 
13: // define the function array and push all chunk upload operation into this array 14: blocks.forEach(function (block) { 15: putBlocks.push(function (callback) { 16: // load blob based on the start and end index for each chunks 17: var blob = file.slice(block.start, block.end); 18: // put the file name, index and blob into a temporary from 19: var fd = new FormData(); 20: fd.append("name", block.name); 21: fd.append("index", block.index); 22: fd.append("file", blob); 23: // post the form to backend service (asp.net mvc controller action) 24: $.ajax({ 25: url: "/Home/UploadInFormData", 26: data: fd, 27: processData: false, 28: contentType: "multipart/form-data", 29: type: "POST", 30: success: function (result) { 31: if (!result.success) { 32: alert(result.error); 33: } 34: callback(null, block.index); 35: } 36: }); 37: }); 38: }); 39: } 40: }); Then we will invoke these functions one by one by using the async.js. And once all functions had been executed successfully I invoked another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.series(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); That’s all in the client side. The outline of our logic would be - Calculate the start and end byte index for each chunks based on the block size. - Defined the functions of reading the chunk form file and upload the content to the backend service through ajax. - Execute the functions defined in previous step with “async.js”. - Commit the chunks by invoking the backend service in Windows Azure Storage finally.   Save Chunks as Blocks into Blob Storage In above we finished the client size JavaScript code. It uploaded the file in chunks to the backend service which we are going to implement in this step. We will use ASP.NET MVC as our backend service, and it will receive the chunks, upload into Windows Azure Bob Storage in blocks, then finally commit as one blob. As in the client side we uploaded chunks by invoking the ajax call to the URL "/Home/UploadInFormData", I created a new action under the Index controller and it only accepts HTTP POST request. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: } 8: catch (Exception e) 9: { 10: error = e.ToString(); 11: } 12:  13: return new JsonResult() 14: { 15: Data = new 16: { 17: success = string.IsNullOrWhiteSpace(error), 18: error = error 19: } 20: }; 21: } Then I retrieved the file name, index and the chunk content from the Request.Form object, which was passed from our client side. 
And then, used the Windows Azure SDK to create a blob container (in this case we will use the container named “test”.) and create a blob reference with the blob name (same as the file name). Then uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with so that finally we can put all blocks as one blob by specifying their block ID list. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var index = int.Parse(Request.Form["index"]); 9: var file = Request.Files[0]; 10: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 11:  12: var container = _client.GetContainerReference("test"); 13: container.CreateIfNotExists(); 14: var blob = container.GetBlockBlobReference(name); 15: blob.PutBlock(id, file.InputStream, null); 16: } 17: catch (Exception e) 18: { 19: error = e.ToString(); 20: } 21:  22: return new JsonResult() 23: { 24: Data = new 25: { 26: success = string.IsNullOrWhiteSpace(error), 27: error = error 28: } 29: }; 30: } Next, I created another action to commit the blocks into blob once all chunks had been uploaded. Similarly, I retrieved the blob name from the Request.Form. I also retrieved the chunks ID list, which is the block ID list from the Request.Form in a string format, split them as a list, then invoked the BlockBlob.PutBlockList method. After that our blob will be shown in the container and ready to be download. 1: [HttpPost] 2: public JsonResult Commit() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var list = Request.Form["list"]; 9: var ids = list 10: .Split(',') 11: .Where(id => !string.IsNullOrWhiteSpace(id)) 12: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 13: .ToArray(); 14:  15: var container = _client.GetContainerReference("test"); 16: container.CreateIfNotExists(); 17: var blob = container.GetBlockBlobReference(name); 18: blob.PutBlockList(ids); 19: } 20: catch (Exception e) 21: { 22: error = e.ToString(); 23: } 24:  25: return new JsonResult() 26: { 27: Data = new 28: { 29: success = string.IsNullOrWhiteSpace(error), 30: error = error 31: } 32: }; 33: } Now we finished all code we need. The whole process of uploading would be like this below. Below is the full client side JavaScript code. 
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
1: public class HomeController : Controller 2: { 3: private CloudStorageAccount _account; 4: private CloudBlobClient _client; 5:  6: public HomeController() 7: : base() 8: { 9: _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString")); 10: _client = _account.CreateCloudBlobClient(); 11: } 12:  13: public ActionResult Index() 14: { 15: ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application."; 16:  17: return View(); 18: } 19:  20: [HttpPost] 21: public JsonResult UploadInFormData() 22: { 23: var error = string.Empty; 24: try 25: { 26: var name = Request.Form["name"]; 27: var index = int.Parse(Request.Form["index"]); 28: var file = Request.Files[0]; 29: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 30:  31: var container = _client.GetContainerReference("test"); 32: container.CreateIfNotExists(); 33: var blob = container.GetBlockBlobReference(name); 34: blob.PutBlock(id, file.InputStream, null); 35: } 36: catch (Exception e) 37: { 38: error = e.ToString(); 39: } 40:  41: return new JsonResult() 42: { 43: Data = new 44: { 45: success = string.IsNullOrWhiteSpace(error), 46: error = error 47: } 48: }; 49: } 50:  51: [HttpPost] 52: public JsonResult Commit() 53: { 54: var error = string.Empty; 55: try 56: { 57: var name = Request.Form["name"]; 58: var list = Request.Form["list"]; 59: var ids = list 60: .Split(',') 61: .Where(id => !string.IsNullOrWhiteSpace(id)) 62: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 63: .ToArray(); 64:  65: var container = _client.GetContainerReference("test"); 66: container.CreateIfNotExists(); 67: var blob = container.GetBlockBlobReference(name); 68: blob.PutBlockList(ids); 69: } 70: catch (Exception e) 71: { 72: error = e.ToString(); 73: } 74:  75: return new JsonResult() 76: { 77: Data = new 78: { 79: success = string.IsNullOrWhiteSpace(error), 80: error = error 81: } 82: }; 83: } 84: } And if we selected a file from the browser we will see our application will upload chunks in the size we specified to the server through ajax call in background, and then commit all chunks in one blob. Then we can find the blob in our Windows Azure Blob Storage.   Optimized by Parallel Upload In previous example we just uploaded our file in chunks. This solved the problem that ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout. But it might introduce the performance problem since we uploaded chunks in sequence. In order to improve the upload performance we could modify our client side code a bit to make the upload operation invoked in parallel. The good news is that, “async.js” library provides the parallel execution function. If you remembered the code we invoke the service to upload chunks, it utilized “async.series” which means all functions will be executed in sequence. Now we will change this code to “async.parallel”. This will invoke all functions in parallel. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 
15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallel(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); In this way all chunks will be uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file is not very large and the chunk size is not very small. But for large files this might introduce another problem: too many ajax calls are sent to the server at the same time. So the best solution should be to upload the chunks in parallel with a maximum concurrency limit. The code below sets the concurrency limit to 4, which means at most 4 ajax calls can be in flight at the same time. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallelLimit(putBlocks, 4, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: });   Summary In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leveraged three new features introduced in HTML5, which are - File.slice: read part of the file by specifying the start and end byte index. - Blob: a file-like interface which contains part of the file content. - FormData: a temporary form element through which we can pass the chunk along with some metadata to the backend service. Then we discussed the performance considerations of chunk uploading. Sequential upload cannot provide the maximum upload speed, but unlimited parallel upload might crash the browser and the server if there are too many chunks. So we finally came up with the solution of uploading chunks in parallel with a concurrency limit. We also demonstrated how to utilize the “async.js” JavaScript library to help us control the asynchronous calls and the parallelism limit.   Regarding the chunk size and the parallelism limit there is no “best” value. You need to test various combinations and find out the best one for your particular scenario. It depends on the local bandwidth, the client machine cores, and the server side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test results. The client machine was Windows 8 with IE 10 and 4 cores. I was using the Microsoft corporate network. The web site was hosted in the Windows Azure China North data center (in Beijing) with one small web role (1.7GB 1 core CPU, 1.75GB memory with 100Mbps bandwidth). 
The test cases were:
- Chunk size: 512KB, 1MB, 2MB, 4MB.
- Upload mode: sequential, parallel (unlimited), parallel with a limit (4 threads, 8 threads).
- Chunk format: base64 string, binaries.
- Target file: 100MB.
- Each case was tested 3 times.
Below is the test result chart. Some thoughts, but not guidance or best practice:
- Parallel gets better performance than sequential.
- No significant performance difference between parallel with 4 threads and with 8 threads.
- Transferring binaries provides better performance than base64 strings.
- In all cases, a chunk size of 1MB - 2MB gets better performance.
Hope this helps, Shaun
All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • How to print PCL Directly to the printer? Vb.net VS 2010

    - by Justin
    Good day, I've been trying to upgrade an old Print Program that prints raw pcl directly to the printer. The print program takes a PCL Mask and a PCL Batch Spool File (just pcl with page turn commands, I think) merges them and sends them off to the printer. I'm able to send to the Printer a file stream of PCL but I get mixed results and i do not understand why printing is so difficult under .net. I mean yes there is the PrintDocument class, but to print PCL... Let's just say I'm ready to detach my printer from the network and burn it a live. Here is my class PrintDirect (rather a hybrid class) Imports System Imports System.Text Imports System.Runtime.InteropServices <StructLayout(LayoutKind.Sequential)> _ Public Structure DOCINFO <MarshalAs(UnmanagedType.LPWStr)> _ Public pDocName As String <MarshalAs(UnmanagedType.LPWStr)> _ Public pOutputFile As String <MarshalAs(UnmanagedType.LPWStr)> _ Public pDataType As String End Structure 'DOCINFO Public Class PrintDirect <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=False, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function OpenPrinter(ByVal pPrinterName As String, ByRef phPrinter As IntPtr, ByVal pDefault As Integer) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=False, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function StartDocPrinter(ByVal hPrinter As IntPtr, ByVal Level As Integer, ByRef pDocInfo As DOCINFO) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=True, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function StartPagePrinter(ByVal hPrinter As IntPtr) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Ansi, ExactSpelling:=True, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function WritePrinter(ByVal hPrinter As IntPtr, ByVal data As String, ByVal buf As Integer, ByRef pcWritten As Integer) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=True, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function EndPagePrinter(ByVal hPrinter As IntPtr) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=True, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function EndDocPrinter(ByVal hPrinter As IntPtr) As Long End Function <DllImport("winspool.drv", CharSet:=CharSet.Unicode, ExactSpelling:=True, CallingConvention:=CallingConvention.StdCall)> _ Public Shared Function ClosePrinter(ByVal hPrinter As IntPtr) As Long End Function 'Helps uses to work with the printer Public Class PrintJob ''' <summary> ''' The address of the printer to print to. ''' </summary> ''' <remarks></remarks> Public PrinterName As String = "" Dim lhPrinter As New System.IntPtr() ''' <summary> ''' The object deriving from the Win32 API, Winspool.drv. ''' Use this object to overide the settings defined in the PrintJob Class. ''' </summary> ''' <remarks>Only use this when absolutly nessacary.</remarks> Public OverideDocInfo As New DOCINFO() ''' <summary> ''' The PCL Control Character or the ascii escape character. \x1b ''' </summary> ''' <remarks>ChrW(27)</remarks> Public Const PCL_Control_Character As Char = ChrW(27) ''' <summary> ''' Opens a connection to a printer, if false the connection could not be established. 
''' </summary> ''' <returns></returns> ''' <remarks></remarks> Public Function OpenPrint() As Boolean Dim rtn_val As Boolean = False PrintDirect.OpenPrinter(PrinterName, lhPrinter, 0) If Not lhPrinter = 0 Then rtn_val = True End If Return rtn_val End Function ''' <summary> ''' The name of the Print Document. ''' </summary> ''' <value></value> ''' <returns></returns> ''' <remarks></remarks> Public Property DocumentName As String Get Return OverideDocInfo.pDocName End Get Set(ByVal value As String) OverideDocInfo.pDocName = value End Set End Property ''' <summary> ''' The type of Data that will be sent to the printer. ''' </summary> ''' <value>Send the print data type, usually "RAW" or "LPR". To overide use the OverideDocInfo Object</value> ''' <returns>The DataType as it was provided, if it is not part of the enumeration it is set to "UNKNOWN".</returns> ''' <remarks></remarks> Public Property DocumentDataType As PrintDataType Get Select Case OverideDocInfo.pDataType.ToUpper Case "RAW" Return PrintDataType.RAW Case "LPR" Return PrintDataType.LPR Case Else Return PrintDataType.UNKOWN End Select End Get Set(ByVal value As PrintDataType) OverideDocInfo.pDataType = [Enum].GetName(GetType(PrintDataType), value) End Set End Property Public Property DocumentOutputFile As String Get Return OverideDocInfo.pOutputFile End Get Set(ByVal value As String) OverideDocInfo.pOutputFile = value End Set End Property Enum PrintDataType UNKOWN = 0 RAW = 1 LPR = 2 End Enum ''' <summary> ''' Initializes the printing matrix ''' </summary> ''' <param name="PrintLevel">I have no idea what the hack this is...</param> ''' <remarks></remarks> Public Sub OpenDocument(Optional ByVal PrintLevel As Integer = 1) PrintDirect.StartDocPrinter(lhPrinter, PrintLevel, OverideDocInfo) End Sub ''' <summary> ''' Starts the next page. ''' </summary> ''' <remarks></remarks> Public Sub StartPage() PrintDirect.StartPagePrinter(lhPrinter) End Sub ''' <summary> ''' Writes a string to the printer, can be used to write out a section of the document at a time. ''' </summary> ''' <param name="data">The String to Send out to the Printer.</param> ''' <returns>The pcWritten as an integer, 0 may mean the writter did not write out anything.</returns> ''' <remarks>Warning the buffer is automatically created by the lenegth of the string.</remarks> Public Function WriteToPrinter(ByVal data As String) As Integer Dim pcWritten As Integer = 0 PrintDirect.WritePrinter(lhPrinter, data, data.Length - 1, pcWritten) Return pcWritten End Function Public Sub EndPage() PrintDirect.EndPagePrinter(lhPrinter) End Sub Public Sub CloseDocument() PrintDirect.EndDocPrinter(lhPrinter) End Sub Public Sub ClosePrint() PrintDirect.ClosePrinter(lhPrinter) End Sub ''' <summary> ''' Opens a connection to a printer and starts a new document. ''' </summary> ''' <returns>If false the connection could not be established. </returns> ''' <remarks></remarks> Public Function Open() As Boolean Dim rtn_val As Boolean = False rtn_val = OpenPrint() If rtn_val Then OpenDocument() End If Return rtn_val End Function ''' <summary> ''' Closes the document and closes the connection to the printer. ''' </summary> ''' <remarks></remarks> Public Sub Close() CloseDocument() ClosePrint() End Sub End Class End Class 'PrintDirect Here is how I print my file. I'm simple printing the PCL Masks, to show proof of concept. But I can't even do that. I can effectively create PCL and send it the printer without reading the file and it works fine... 
Plus I get mixed results with different stream reader text encoding as well. Dim pJob As New PrintDirect.PrintJob pJob.DocumentName = " test doc" pJob.DocumentDataType = PrintDirect.PrintJob.PrintDataType.RAW pJob.PrinterName = sPrinter.GetDevicePath '//This is where you'd stick your device name, sPrinter stands for Selected Printer from a dropdown (combobox). 'pJob.DocumentOutputFile = "C:\temp\test-spool.txt" If Not pJob.OpenPrint() Then MsgBox("Unable to connect to " & pJob.PrinterName, MsgBoxStyle.OkOnly, "Print Error") Exit Sub End If pJob.OpenDocument() pJob.StartPage() Dim sr As New IO.StreamReader(Me.txtFile.Text, Text.Encoding.ASCII) '//I've got best results with ASCII before, but only mized. Dim line As String = sr.ReadLine 'Fix for fly code on first run 'If line = 27 Then line = PrintDirect.PrintJob.PCL_Control_Character Do While (Not line Is Nothing) pJob.WriteToPrinter(line) line = sr.ReadLine Loop 'pJob.WriteToPrinter(ControlChars.FormFeed) pJob.EndPage() pJob.CloseDocument() sr.Close() sr.Dispose() sr = Nothing pJob.Close() I was able to get to print the mask, in the morning... now I'm getting strange printer characters scattered through 5 pages... I'm totally clueless. I must have changed something or the printer is at fault. You can access my test Check Mask, here http://kscserver.com/so/chk_mask.zip .
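For reference, the raw-printing pattern usually shown for the desktop CLR reads the spool file as bytes and hands WritePrinter the buffer together with its full length. The code above marshals a CLR String with CharSet.Ansi, writes data.Length - 1 bytes per call, and feeds it lines from StreamReader.ReadLine (which strips the line endings), all of which are plausible reasons for PCL escape sequences arriving at the printer slightly altered. Below is a minimal, byte-oriented sketch under the assumption of the standard winspool.drv entry points; the names and error handling are illustrative, not a drop-in replacement for the class above.

Imports System
Imports System.IO
Imports System.Runtime.InteropServices

Module RawPclPrinter
    ' DOC_INFO_1 equivalent; strings are marshaled as ANSI pointers.
    <StructLayout(LayoutKind.Sequential, CharSet:=CharSet.Ansi)> _
    Private Structure DOCINFOA
        <MarshalAs(UnmanagedType.LPStr)> Public pDocName As String
        <MarshalAs(UnmanagedType.LPStr)> Public pOutputFile As String
        <MarshalAs(UnmanagedType.LPStr)> Public pDataType As String
    End Structure

    <DllImport("winspool.drv", EntryPoint:="OpenPrinterA", SetLastError:=True, CharSet:=CharSet.Ansi)> _
    Private Function OpenPrinter(ByVal printerName As String, ByRef hPrinter As IntPtr, ByVal pDefault As IntPtr) As Boolean
    End Function

    <DllImport("winspool.drv", EntryPoint:="StartDocPrinterA", SetLastError:=True, CharSet:=CharSet.Ansi)> _
    Private Function StartDocPrinter(ByVal hPrinter As IntPtr, ByVal level As Integer, ByRef docInfo As DOCINFOA) As Boolean
    End Function

    <DllImport("winspool.drv", SetLastError:=True)> _
    Private Function StartPagePrinter(ByVal hPrinter As IntPtr) As Boolean
    End Function

    <DllImport("winspool.drv", SetLastError:=True)> _
    Private Function WritePrinter(ByVal hPrinter As IntPtr, ByVal bytes() As Byte, ByVal count As Integer, ByRef written As Integer) As Boolean
    End Function

    <DllImport("winspool.drv", SetLastError:=True)> _
    Private Function EndPagePrinter(ByVal hPrinter As IntPtr) As Boolean
    End Function

    <DllImport("winspool.drv", SetLastError:=True)> _
    Private Function EndDocPrinter(ByVal hPrinter As IntPtr) As Boolean
    End Function

    <DllImport("winspool.drv", SetLastError:=True)> _
    Private Function ClosePrinter(ByVal hPrinter As IntPtr) As Boolean
    End Function

    Public Sub SendFile(ByVal printerName As String, ByVal pclPath As String)
        ' Read the spool file as raw bytes so PCL escape sequences and line endings survive untouched.
        Dim data() As Byte = File.ReadAllBytes(pclPath)
        Dim hPrinter As IntPtr
        If Not OpenPrinter(printerName, hPrinter, IntPtr.Zero) Then
            Throw New InvalidOperationException("OpenPrinter failed, error " & Marshal.GetLastWin32Error().ToString())
        End If
        Try
            Dim di As New DOCINFOA With {.pDocName = "PCL spool", .pDataType = "RAW"}
            StartDocPrinter(hPrinter, 1, di)                     ' level 1 = DOC_INFO_1
            StartPagePrinter(hPrinter)
            Dim written As Integer
            WritePrinter(hPrinter, data, data.Length, written)   ' full length, no string re-encoding
            EndPagePrinter(hPrinter)
            EndDocPrinter(hPrinter)
        Finally
            ClosePrinter(hPrinter)
        End Try
    End Sub
End Module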

    Read the article

  • Spring & hibernate configuration (using maven): java.lang.ClassNotFoundException: org.hibernate.cfg.

    - by Marcos Carceles
    Hi, I am trying to include spring and hibernate in an application running on a Weblogic 10.3 server. When I run the application in the server, while accessing an TestServlet to check my configuration I get the following exception: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'mySessionFactory' defined in class path resource [spring-config/HorizonModelPeopleConnectionsSpringContext.xml]: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.springframework.orm.hibernate3.LocalSessionFactoryBean]: Constructor threw exception; nested exception is java.lang.NoClassDefFoundError: org.hibernate.cfg.Configuration at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:448) at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:251) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:156) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:248) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:160) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:284) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:352) at org.springframework.context.support.ClassPathXmlApplicationContext.(ClassPathXmlApplicationContext.java:91) at org.springframework.context.support.ClassPathXmlApplicationContext.(ClassPathXmlApplicationContext.java:75) at org.springframework.context.support.ClassPathXmlApplicationContext.(ClassPathXmlApplicationContext.java:65) at view.com.horizon.test.SpringHibernateServlet.doGet(SpringHibernateServlet.java:27) at javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at javax.servlet.http.HttpServlet.service(HttpServlet.java:820) at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227) at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125) at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292) at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26) at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56) at oracle.security.wls.filter.SSOSessionSynchronizationFilter.doFilter(SSOSessionSynchronizationFilter.java:279) at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56) at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:326) at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56) at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27) at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56) at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3592) at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321) at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121) at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2202) at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2108) at 
weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1432) at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201) at weblogic.work.ExecuteThread.run(ExecuteThread.java:173) Caused by: org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.springframework.orm.hibernate3.LocalSessionFactoryBean]: Constructor threw exception; nested exception is java.lang.NoClassDefFoundError: org.hibernate.cfg.Configuration at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:100) at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:61) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:756) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:721) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:384) ... 31 more Caused by: java.lang.NoClassDefFoundError: org.hibernate.cfg.Configuration at org.springframework.orm.hibernate3.LocalSessionFactoryBean.class$(LocalSessionFactoryBean.java:158) at org.springframework.orm.hibernate3.LocalSessionFactoryBean.(LocalSessionFactoryBean.java:158) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) at java.lang.reflect.Constructor.newInstance(Constructor.java:513) at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:85) ... 35 more I have checked my application and the hibernate jar file is included and it contains the class it says its missing: org.hibernate.cfg.Configuration. The application is built with maven. These are the dependencies of the JAR file using spring and hibernate: <!-- Frameworks --> <!-- Hibernate framework --> <dependency> <groupId>org.hibernate</groupId> <artifactId>hibernate</artifactId> <version>3.2.7.ga</version> </dependency> <!-- Hibernate uses slf4j for logging, for our purposes here use the simple backend --> <dependency> <groupId>org.slf4j</groupId> <artifactId>slf4j-log4j12</artifactId> <version>1.5.2</version> </dependency> <!-- Hibernate gives you a choice of bytecode providers between cglib and javassist --> <dependency> <groupId>javassist</groupId> <artifactId>javassist</artifactId> <version>3.4.GA</version> </dependency> <!-- Spring framework --> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-orm</artifactId> <version>2.5.6</version> </dependency> At first I thought it could be an issue with the versions in the spring and hibernate libraries, so I have tried with different ones, but still I couldn't find anywhere where it says which library versions are compatible,. 
just got that Spring 2.5.x needs hibernate =3.1 And this is my Spring config file: <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd"> <bean id="myDataSource" class="org.springframework.jndi.JndiObjectFactoryBean"> <property name="jndiName"> <value>jdbc/WebCenterDS</value> </property> <!--property name="resourceRef"> <value>true</value> </property> <property name="jndiEnvironment"> <props> <prop key="java.naming.factory.initial">weblogic.jndi.WLInitialContextFactory</prop> <prop key="java.naming.provider.url">t3://localhost:7001</prop> </props> </property--> </bean> <bean id="mySessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean"> <property name="dataSource" ref="myDataSource"/> <property name="configLocation"> <value>classpath:hibernate-config/hibernate.cfg.xml</value> </property> <property name="mappingResources"> <list> <value>classpath:com/horizon/model/peopleconnections/profile/internal/bean/CustomAttribute.hbm.xml</value> </list> </property> <property name="hibernateProperties"> <value>hibernate.dialect=org.hibernate.dialect.HSQLDialect</value> </property> </bean> <bean id="profileExtensionDAO" class="com.horizon.model.peopleconnections.profile.internal.dao.ProfileExtensionDAOImpl"> <property name="sessionFactory" ref="mySessionFactory"/> </bean> </beans> The WAR structure I get is the following: J2EETestApplication ¦ springhibernate.jsp ¦ +---WEB-INF ¦ faces-config.xml ¦ web.xml ¦ weblogic.xml ¦ +---classes ¦ +---view ¦ +---com ¦ +---horizon ¦ +---test ¦ SpringHibernateServlet.class ¦ +---lib activation-1.1.jar antlr-2.7.6.jar aopalliance-1.0.jar asm-1.5.3.jar asm-attrs-1.5.3.jar cglib-2.1_3.jar commons-codec-1.3.jar commons-collections-2.1.1.jar commons-logging-1.1.1.jar dom4j-1.6.1.jar ehcache-1.2.3.jar hibernate-3.2.7.ga.jar horizon-model-commons-1.0-SNAPSHOT.jar horizon-model-peopleconnections-1.0-SNAPSHOT.jar horizon-shared-commons-1.0-SNAPSHOT.jar horizon-shared-logging-1.0-SNAPSHOT.jar horizon-shared-util-1.0-SNAPSHOT.jar horizon-shared-webcenter-1.0-SNAPSHOT.jar horizon-shared-webcenter.jar httpclient-4.0.1.jar httpcore-4.0.1.jar javassist-3.4.GA.jar jta-1.0.1B.jar log4j-1.2.14.jar mail-1.4.1.jar peopleconnections-profile-model-11.1.1.2.0.jar saxon-9.1.0.8.jar serviceframework-11.1.1.2.0.jar slf4j-api-1.5.2.jar slf4j-log4j12-1.5.2.jar spring-beans-2.5.6.jar spring-context-2.5.6.jar spring-core-2.5.6.jar spring-orm-2.5.6.jar spring-tx-2.5.6.jar Is there any dependency or configuration I am missing? If I use hibernate without spring I don't get the ClassDefNotFoundException.
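    One thing worth checking (this is not from the original question, just a common cause of exactly this symptom on WebLogic 10.3): the server class loader can resolve Hibernate-related classes from its own libraries instead of the jars packaged in WEB-INF/lib, and the server-side copy may be missing dependencies, which surfaces as a NoClassDefFoundError even though the class is present in the WAR. Telling WebLogic to prefer the classes shipped inside the web application is often the first fix to try; a minimal sketch of the fragment to add to the existing WEB-INF/weblogic.xml:

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- weblogic.xml (sketch): prefer the jars in WEB-INF/lib over server-level libraries -->
        <weblogic-web-app xmlns="http://xmlns.oracle.com/weblogic/weblogic-web-app">
          <container-descriptor>
            <prefer-web-inf-classes>true</prefer-web-inf-classes>
          </container-descriptor>
        </weblogic-web-app>

    Note that this flips the class loading delegation for the whole web application, so it can introduce new conflicts with classes the container hands to the application; treat it as a diagnostic first step rather than a guaranteed fix.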

    Read the article

  • Silverlight for Windows Embedded tutorial (step 4)

    - by Valter Minute
    I’m back with my Silverlight for Windows Embedded tutorial. Sorry for the long delay between step 3 and step 4; the MVP summit and some work-related issues prevented me from working on the tutorial during the last weeks. In our first, second and third tutorial steps we implemented some very simple applications, just to understand the basic structure of a Silverlight for Windows Embedded application, learn how to handle events and how to operate on images. In this fourth step our sample application will be slightly more complicated, to introduce two new topics: list boxes and custom controls. We will also learn how to create controls at runtime. I chose to explain these topics together and provide a sample a bit more complicated than usual, just to start giving a feeling of how a “real” Silverlight for Windows Embedded application is organized. As usual we can start using Expression Blend to define our main page. In this case we will have a listbox and a textblock. Here’s the XAML code: <UserControl xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:Class="ListDemo.Page" Width="640" Height="480" x:Name="ListPage" xmlns:ListDemo="clr-namespace:ListDemo">   <Grid x:Name="LayoutRoot" Background="White"> <ListBox Margin="19,57,19,66" x:Name="FileList" SelectionChanged="Filelist_SelectionChanged"/> <TextBlock Height="35" Margin="19,8,19,0" VerticalAlignment="Top" TextWrapping="Wrap" x:Name="CurrentDir" Text="TextBlock" FontSize="20"/> </Grid> </UserControl> In our listbox we will load a list of directories, starting from the filesystem root (there are no drives in Windows CE; the filesystem has a single root named “\”). When the user clicks on an item inside the list, the corresponding directory path will be displayed in the TextBlock object and the subdirectories of the selected branch will be shown inside the list. As you can see we declared an event handler for the SelectionChanged event of our listbox. We also used a different font size for the TextBlock, to make it more readable. XAML and Expression Blend allow you to customize your UI pretty heavily; experiment with the tools and discover how you can completely change the look of your application without changing a single line of code! Inside our ListBox we want to show each directory with a nice icon and its name, just like you are used to seeing in the Windows 7 file explorer, for example. To get this we will define a user control. This is a custom object that will behave like “regular” Silverlight for Windows Embedded objects inside our application.
First of all we have to define the look of our custom control, named DirectoryItem, using XAML: <UserControl xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d" x:Class="ListDemo.DirectoryItem" Width="500" Height="80">   <StackPanel x:Name="LayoutRoot" Orientation="Horizontal"> <Canvas Width="31.6667" Height="45.9583" Margin="10,10,10,10" RenderTransformOrigin="0.5,0.5"> <Canvas.RenderTransform> <TransformGroup> <ScaleTransform/> <SkewTransform/> <RotateTransform Angle="-31.27"/> <TranslateTransform/> </TransformGroup> </Canvas.RenderTransform> <Rectangle Width="31.6667" Height="45.8414" Canvas.Left="0" Canvas.Top="0.116943" Stretch="Fill"> <Rectangle.Fill> <LinearGradientBrush StartPoint="0.142631,0.75344" EndPoint="1.01886,0.75344"> <LinearGradientBrush.RelativeTransform> <TransformGroup> <SkewTransform CenterX="0.142631" CenterY="0.75344" AngleX="19.3128" AngleY="0"/> <RotateTransform CenterX="0.142631" CenterY="0.75344" Angle="-35.3436"/> </TransformGroup> </LinearGradientBrush.RelativeTransform> <LinearGradientBrush.GradientStops> <GradientStop Color="#FF7B6802" Offset="0"/> <GradientStop Color="#FFF3D42C" Offset="1"/> </LinearGradientBrush.GradientStops> </LinearGradientBrush> </Rectangle.Fill> </Rectangle> <Rectangle Width="29.8441" Height="43.1517" Canvas.Left="0.569519" Canvas.Top="1.05249" Stretch="Fill"> <Rectangle.Fill> <LinearGradientBrush StartPoint="0.142632,0.753441" EndPoint="1.01886,0.753441"> <LinearGradientBrush.RelativeTransform> <TransformGroup> <SkewTransform CenterX="0.142632" CenterY="0.753441" AngleX="19.3127" AngleY="0"/> <RotateTransform CenterX="0.142632" CenterY="0.753441" Angle="-35.3437"/> </TransformGroup> </LinearGradientBrush.RelativeTransform> <LinearGradientBrush.GradientStops> <GradientStop Color="#FFCDCDCD" Offset="0.0833333"/> <GradientStop Color="#FFFFFFFF" Offset="1"/> </LinearGradientBrush.GradientStops> </LinearGradientBrush> </Rectangle.Fill> </Rectangle> <Rectangle Width="29.8441" Height="43.1517" Canvas.Left="0.455627" Canvas.Top="2.28036" Stretch="Fill"> <Rectangle.Fill> <LinearGradientBrush StartPoint="0.142631,0.75344" EndPoint="1.01886,0.75344"> <LinearGradientBrush.RelativeTransform> <TransformGroup> <SkewTransform CenterX="0.142631" CenterY="0.75344" AngleX="19.3128" AngleY="0"/> <RotateTransform CenterX="0.142631" CenterY="0.75344" Angle="-35.3436"/> </TransformGroup> </LinearGradientBrush.RelativeTransform> <LinearGradientBrush.GradientStops> <GradientStop Color="#FFCDCDCD" Offset="0.0833333"/> <GradientStop Color="#FFFFFFFF" Offset="1"/> </LinearGradientBrush.GradientStops> </LinearGradientBrush> </Rectangle.Fill> </Rectangle> <Rectangle Width="29.8441" Height="43.1517" Canvas.Left="0.455627" Canvas.Top="1.34485" Stretch="Fill"> <Rectangle.Fill> <LinearGradientBrush StartPoint="0.142631,0.75344" EndPoint="1.01886,0.75344"> <LinearGradientBrush.RelativeTransform> <TransformGroup> <SkewTransform CenterX="0.142631" CenterY="0.75344" AngleX="19.3128" AngleY="0"/> <RotateTransform CenterX="0.142631" CenterY="0.75344" Angle="-35.3436"/> </TransformGroup> </LinearGradientBrush.RelativeTransform> <LinearGradientBrush.GradientStops> <GradientStop Color="#FFCDCDCD" Offset="0.0833333"/> <GradientStop Color="#FFFFFFFF" Offset="1"/> </LinearGradientBrush.GradientStops> </LinearGradientBrush> </Rectangle.Fill> </Rectangle> <Rectangle 
Width="26.4269" Height="45.8414" Canvas.Left="0.227798" Canvas.Top="0" Stretch="Fill"> <Rectangle.Fill> <LinearGradientBrush StartPoint="0.142631,0.75344" EndPoint="1.01886,0.75344"> <LinearGradientBrush.RelativeTransform> <TransformGroup> <SkewTransform CenterX="0.142631" CenterY="0.75344" AngleX="19.3127" AngleY="0"/> <RotateTransform CenterX="0.142631" CenterY="0.75344" Angle="-35.3436"/> </TransformGroup> </LinearGradientBrush.RelativeTransform> <LinearGradientBrush.GradientStops> <GradientStop Color="#FF7B6802" Offset="0"/> <GradientStop Color="#FFF3D42C" Offset="1"/> </LinearGradientBrush.GradientStops> </LinearGradientBrush> </Rectangle.Fill> </Rectangle> <Rectangle Width="1.25301" Height="45.8414" Canvas.Left="1.70862" Canvas.Top="0.116943" Stretch="Fill" Fill="#FFEBFF07"/> </Canvas> <TextBlock Height="80" x:Name="Name" Width="448" TextWrapping="Wrap" VerticalAlignment="Center" FontSize="24" Text="Directory"/> </StackPanel> </UserControl> As you can see, this XAML contains many graphic elements. Those elements are used to design the folder icon. The original drawing has been designed in Expression Design and then exported as XAML. In Silverlight for Windows Embedded you can use vector images. This means that your images will look good even when scaled or rotated. In our DirectoryItem custom control we have a TextBlock named Name, that will be used to display….(suspense)…. the directory name (I’m too lazy to invent fancy names for controls, and using “boring” intuitive names will make code more readable, I hope!). Now that we have some XAML code, we may execute XAML2CPP to generate part of the aplication code for us. We should then add references to our XAML2CPP generated resource file and include in our code and add a reference to the XAML runtime library to our sources file (you can follow the instruction of the first tutorial step to do that), To generate the code used in this tutorial you need XAML2CPP ver 1.0.1.0, that is downloadable here: http://geekswithblogs.net/WindowsEmbeddedCookbook/archive/2010/03/08/xaml2cpp-1.0.1.0.aspx We can now create our usual simple Win32 application inside Platform Builder, using the same step described in the first chapter of this tutorial (http://geekswithblogs.net/WindowsEmbeddedCookbook/archive/2009/10/01/silverlight-for-embedded-tutorial.aspx). We can declare a class for our main page, deriving it from the template that XAML2CPP generated for us: class ListPage : public TListPage<ListPage> { ... } We will see the ListPage class code in a short time, but before we will see the code of our DirectoryItem user control. This object will be used to populate our list, one item for each directory. To declare a user control things are a bit more complicated (but also in this case XAML2CPP will write most of the “boilerplate” code for use. To interact with a user control you should declare an interface. An interface defines the functions of a user control that can be called inside the application code. Our custom control is currently quite simple and we just need some member functions to store and retrieve a full pathname inside our control. The control will display just the last part of the path inside the control. An interface is declared as a C++ class that has only abstract virtual members. It should also have an UUID associated with it. UUID means Universal Unique IDentifier and it’s a 128 bit number that will identify our interface without the need of specifying its fully qualified name. 
    UUIDs are used to identify COM interfaces and, as we discovered in chapter one, Silverlight for Windows Embedded is based on COM or, at least, provides a COM-like Application Programming Interface (API). Here’s the declaration of the DirectoryItem interface: class __declspec(novtable,uuid("{D38C66E5-2725-4111-B422-D75B32AA8702}")) IDirectoryItem : public IXRCustomUserControl { public:   virtual HRESULT SetFullPath(BSTR fullpath) = 0; virtual HRESULT GetFullPath(BSTR* retval) = 0; }; The interface is derived from IXRCustomUserControl; this will allow us to add our object to a XAML tree. It declares the two functions needed to set and get the full path, but does not implement them. Implementation will be done inside the control class. The interface only defines the functions of our control class that are accessible from the outside. It’s a sort of “contract” between our control and the applications that will use it. We must support what’s inside the contract, and the application code should know nothing else about our own control. To reference our interface we will use the UUID; to make the code more readable we can declare a #define in this way: #define IID_IDirectoryItem __uuidof(IDirectoryItem) Silverlight for Windows Embedded objects (like COM objects) use a reference counting mechanism to handle object destruction. Every time you store a pointer to an object you should call its AddRef function, and every time you no longer need that pointer you should call Release. The object keeps an internal counter, incremented for each AddRef and decremented on Release. When the counter reaches 0, the object is destroyed. Managing reference counting in our code can be quite complicated and, since we are lazy (I am, at least!), we will use a great feature of Silverlight for Windows Embedded: smart pointers. A smart pointer can be connected to a Silverlight for Windows Embedded object and manages its reference counting. To declare a smart pointer we must use the XRPtr template: typedef XRPtr<IDirectoryItem> IDirectoryItemPtr; Now that we have defined our interface, it’s time to implement our user control class. XAML2CPP has implemented a class for us, and we only have to derive our class from it, defining the main class and interface of our new custom control: class DirectoryItem : public DirectoryItemUserControlRegister<DirectoryItem,IDirectoryItem> { ... } XAML2CPP has generated some code for us to support the user control; we don’t have to worry too much about that code, since it will be generated (or written by hand, if you like) always in the same way, for every user control. But knowing how this works “under the hood” is still useful to understand the architecture of Silverlight for Windows Embedded. Our base class declaration is a bit more complex than the one we used for a simple page in the previous chapters: template <class A,class B> class DirectoryItemUserControlRegister : public XRCustomUserControlImpl<A,B>,public TDirectoryItem<A,XAML2CPPUserControl> { ... } This class derives from the XAML2CPP generated template class, like the ListPage class, but it uses XAML2CPPUserControl for the implementation of some features. This class shares the same ancestor as XAML2CPPPage (the base class for “regular” XAML pages), XAML2CPPBase; it implements binding of member variables and event handlers but, instead of loading and creating its own XAML tree, it attaches to an existing one. The XAML tree (and UI) of our custom control is created and loaded by the XRCustomUserControlImpl class.
This class is part of the Silverlight for Windows Embedded framework and implements most of the functions needed to build-up a custom control in Silverlight (the guys that developed Silverlight for Windows Embedded seem to care about lazy programmers!). We have just to initialize it, providing our class (DirectoryItem) and interface (IDirectoryItem). Our user control class has also a static member: protected:   static HINSTANCE hInstance; This is used to store the HINSTANCE of the modules that contain our user control class. I don’t like this implementation, but I can’t find a better one, so if somebody has good ideas about how to handle the HINSTANCE object, I’ll be happy to hear suggestions! It also implements two static members required by XRCustomUserControlImpl. The first one is used to load the XAML UI of our custom control: static HRESULT GetXamlSource(XRXamlSource* pXamlSource) { pXamlSource->SetResource(hInstance,TEXT("XAML"),IDR_XAML_DirectoryItem); return S_OK; }   It initializes a XRXamlSource object, connecting it to the XAML resource that XAML2CPP has included in our resource script. The other method is used to register our custom control, allowing Silverlight for Windows Embedded to create it when it load some XAML or when an application creates a new control at runtime (more about this later): static HRESULT Register() { return XRCustomUserControlImpl<A,B>::Register(__uuidof(B), L"DirectoryItem", L"clr-namespace:DirectoryItemNamespace"); } To register our control we should provide its interface UUID, the name of the corresponding element in the XAML tree and its current namespace (namespaces compatible with Silverlight must use the “clr-namespace” prefix. We may also register additional properties for our objects, allowing them to be loaded and saved inside XAML. In this case we have no permanent properties and the Register method will just register our control. An additional static method is implemented to allow easy registration of our custom control inside our application WinMain function: static HRESULT RegisterUserControl(HINSTANCE hInstance) { DirectoryItemUserControlRegister::hInstance=hInstance; return DirectoryItemUserControlRegister<A,B>::Register(); } Now our control is registered and we will be able to create it using the Silverlight for Windows Embedded runtime functions. But we need to bind our members and event handlers to have them available like we are used to do for other XAML2CPP generated objects. To bind events and members we need to implement the On_Loaded function: virtual HRESULT OnLoaded(__in IXRDependencyObject* pRoot) { HRESULT retcode; IXRApplicationPtr app; if (FAILED(retcode=GetXRApplicationInstance(&app))) return retcode; return ((A*)this)->Init(pRoot,hInstance,app); } This function will call the XAML2CPPUserControl::Init member that will connect the “root” member with the XAML sub tree that has been created for our control and then calls BindObjects and BindEvents to bind members and events to our code. 
Now we can go back to our application code (the code that you’ll have to actually write) to see the contents of our DirectoryItem class: class DirectoryItem : public DirectoryItemUserControlRegister<DirectoryItem,IDirectoryItem> { protected:   WCHAR fullpath[_MAX_PATH+1];   public:   DirectoryItem() { *fullpath=0; }   virtual HRESULT SetFullPath(BSTR fullpath) { wcscpy_s(this->fullpath,fullpath);   WCHAR* p=fullpath;   for(WCHAR*q=wcsstr(p,L"\\");q;p=q+1,q=wcsstr(p,L"\\")) ;   Name->SetText(p); return S_OK; }   virtual HRESULT GetFullPath(BSTR* retval) { *retval=SysAllocString(fullpath); return S_OK; } }; It’s pretty easy and contains a fullpath member (used to store that path of the directory connected with the user control) and the implementation of the two interface members that can be used to set and retrieve the path. The SetFullPath member parses the full path and displays just the last branch directory name inside the “Name” TextBlock object. As you can see, implementing a user control in Silverlight for Windows Embedded is not too complex and using XAML also for the UI of the control allows us to re-use the same mechanisms that we learnt and used in the previous steps of our tutorial. Now let’s see how the main page is managed by the ListPage class. class ListPage : public TListPage<ListPage> { protected:   // current path TCHAR curpath[_MAX_PATH+1]; It has a member named “curpath” that is used to store the current directory. It’s initialized inside the constructor: ListPage() { *curpath=0; } And it’s value is displayed inside the “CurrentDir” TextBlock inside the initialization function: virtual HRESULT Init(HINSTANCE hInstance,IXRApplication* app) { HRESULT retcode;   if (FAILED(retcode=TListPage<ListPage>::Init(hInstance,app))) return retcode;   CurrentDir->SetText(L"\\"); return S_OK; } The FillFileList function is used to enumerate subdirectories of the current dir and add entries for each one inside the list box that fills most of the client area of our main page: HRESULT FillFileList() { HRESULT retcode; IXRItemCollectionPtr items; IXRApplicationPtr app;   if (FAILED(retcode=GetXRApplicationInstance(&app))) return retcode; // retrieves the items contained in the listbox if (FAILED(retcode=FileList->GetItems(&items))) return retcode;   // clears the list if (FAILED(retcode=items->Clear())) return retcode;   // enumerates files and directory in the current path WCHAR filemask[_MAX_PATH+1];   wcscpy_s(filemask,curpath); wcscat_s(filemask,L"\\*.*");   WIN32_FIND_DATA finddata; HANDLE findhandle;   findhandle=FindFirstFile(filemask,&finddata);   // the directory is empty? 
if (findhandle==INVALID_HANDLE_VALUE) return S_OK;   do { if (finddata.dwFileAttributes&=FILE_ATTRIBUTE_DIRECTORY) { IXRListBoxItemPtr listboxitem;   // add a new item to the listbox if (FAILED(retcode=app->CreateObject(IID_IXRListBoxItem,&listboxitem))) { FindClose(findhandle); return retcode; }   if (FAILED(retcode=items->Add(listboxitem,NULL))) { FindClose(findhandle); return retcode; }   IDirectoryItemPtr directoryitem;   if (FAILED(retcode=app->CreateObject(IID_IDirectoryItem,&directoryitem))) { FindClose(findhandle); return retcode; }   WCHAR fullpath[_MAX_PATH+1];   wcscpy_s(fullpath,curpath); wcscat_s(fullpath,L"\\"); wcscat_s(fullpath,finddata.cFileName);   if (FAILED(retcode=directoryitem->SetFullPath(fullpath))) { FindClose(findhandle); return retcode; }   XAML2CPPXRValue value((IXRDependencyObject*)directoryitem);   if (FAILED(retcode=listboxitem->SetContent(&value))) { FindClose(findhandle); return retcode; } } } while (FindNextFile(findhandle,&finddata));   FindClose(findhandle); return S_OK; } This functions retrieve a pointer to the collection of the items contained in the directory listbox. The IXRItemCollection interface is used by listboxes and comboboxes and allow you to clear the list (using Clear(), as our function does at the beginning) and change its contents by adding and removing elements. This function uses the FindFirstFile/FindNextFile functions to enumerate all the objects inside our current directory and for each subdirectory creates a IXRListBoxItem object. You can insert any kind of control inside a list box, you don’t need a IXRListBoxItem, but using it will allow you to handle the selected state of an item, highlighting it inside the list. The function creates a list box item using the CreateObject function of XRApplication. The same function is then used to create an instance of our custom control. The function returns a pointer to the control IDirectoryItem interface and we can use it to store the directory full path inside the object and add it as content of the IXRListBox item object, adding it to the listbox contents. The listbox generates an event (SelectionChanged) each time the user clicks on one of the items contained in the listbox. We implement an event handler for that event and use it to change our current directory and repopulate the listbox. The current directory full path will be displayed in the TextBlock: HRESULT Filelist_SelectionChanged(IXRDependencyObject* source,XRSelectionChangedEventArgs* args) { HRESULT retcode;   IXRListBoxItemPtr listboxitem;   if (!args->pAddedItem) return S_OK;   if (FAILED(retcode=args->pAddedItem->QueryInterface(IID_IXRListBoxItem,(void**)&listboxitem))) return retcode;   XRValue content; if (FAILED(retcode=listboxitem->GetContent(&content))) return retcode;   if (content.vType!=VTYPE_OBJECT) return E_FAIL;   IDirectoryItemPtr directoryitem;   if (FAILED(retcode=content.pObjectVal->QueryInterface(IID_IDirectoryItem,(void**)&directoryitem))) return retcode;   content.pObjectVal->Release(); content.pObjectVal=NULL;   BSTR fullpath=NULL;   if (FAILED(retcode=directoryitem->GetFullPath(&fullpath))) return retcode;   CurrentDir->SetText(fullpath);   wcscpy_s(curpath,fullpath); FillFileList(); SysFreeString(fullpath);     return S_OK; } }; The function uses the pAddedItem member of the XRSelectionChangedEventArgs object to retrieve the currently selected item, converts it to a IXRListBoxItem interface using QueryInterface, and then retrives its contents (IDirectoryItem object). 
    Using the GetFullPath method we can get the full path of our selected directory and assign it to the curpath member. A call to FillFileList will update the listbox contents, displaying the list of subdirectories of the selected folder. To build our sample we just need to add code to our WinMain function: int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPTSTR lpCmdLine, int nCmdShow) { if (!XamlRuntimeInitialize()) return -1;   HRESULT retcode;   IXRApplicationPtr app; if (FAILED(retcode=GetXRApplicationInstance(&app))) return -1;   if (FAILED(retcode=DirectoryItem::RegisterUserControl(hInstance))) return retcode;   ListPage page;   if (FAILED(page.Init(hInstance,app))) return -1;   page.FillFileList();   UINT exitcode;   if (FAILED(page.GetVisualHost()->StartDialog(&exitcode))) return -1;   return 0; } This code is very similar to the WinMain of our previous samples. The main differences are that we register our custom control (you should do that as soon as you have initialized the XAML runtime) and call FillFileList after the initialization of our ListPage object to load the contents of the root folder of our device inside the listbox. As usual you can download the full sample source code from here: http://cid-9b7b0aefe3514dc5.skydrive.live.com/self.aspx/.Public/ListBoxTest.zip

    Read the article

  • What’s New in ASP.NET 4.0 Part Two: WebForms and Visual Studio Enhancements

    - by Rick Strahl
    In the last installment I talked about the core changes in the ASP.NET runtime that I’ve been taking advantage of. In this column, I’ll cover the changes to the Web Forms engine and some of the cool improvements in Visual Studio that make Web and general development easier. WebForms The WebForms engine is the area that has received most significant changes in ASP.NET 4.0. Probably the most widely anticipated features are related to managing page client ids and of ViewState on WebForm pages. Take Control of Your ClientIDs Unique ClientID generation in ASP.NET has been one of the most complained about “features” in ASP.NET. Although there’s a very good technical reason for these unique generated ids - they guarantee unique ids for each and every server control on a page - these unique and generated ids often get in the way of client-side JavaScript development and CSS styling as it’s often inconvenient and fragile to work with the long, generated ClientIDs. In ASP.NET 4.0 you can now specify an explicit client id mode on each control or each naming container parent control to control how client ids are generated. By default, ASP.NET generates mangled client ids for any control contained in a naming container (like a Master Page, or a User Control for example). The key to ClientID management in ASP.NET 4.0 are the new ClientIDMode and ClientIDRowSuffix properties. ClientIDMode supports four different ClientID generation settings shown below. For the following examples, imagine that you have a Textbox control named txtName inside of a master page control container on a WebForms page. <%@Page Language="C#"      MasterPageFile="~/Site.Master"     CodeBehind="WebForm2.aspx.cs"     Inherits="WebApplication1.WebForm2"  %> <asp:Content ID="content"  ContentPlaceHolderID="content"               runat="server"               ClientIDMode="Static" >       <asp:TextBox runat="server" ID="txtName" /> </asp:Content> The four available ClientIDMode values are: AutoID This is the existing behavior in ASP.NET 1.x-3.x where full naming container munging takes place. <input name="ctl00$content$txtName" type="text"        id="ctl00_content_txtName" /> This should be familiar to any ASP.NET developer and results in fairly unpredictable client ids that can easily change if the containership hierarchy changes. For example, removing the master page changes the name in this case, so if you were to move a block of script code that works against the control to a non-Master page, the script code immediately breaks. Static This option is the most deterministic setting that forces the control’s ClientID to use its ID value directly. No naming container naming at all is applied and you end up with clean client ids: <input name="ctl00$content$txtName"         type="text" id="txtName" /> Note that the name property which is used for postback variables to the server still is munged, but the ClientID property is displayed simply as the ID value that you have assigned to the control. This option is what most of us want to use, but you have to be clear on that because it can potentially cause conflicts with other controls on the page. If there are several instances of the same naming container (several instances of the same user control for example) there can easily be a client id naming conflict. 
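    As a concrete illustration of that conflict (this markup is not from the article; the control and field names are invented for the sketch), a user control that forces Static on an inner control and is dropped onto a page twice ends up rendering two elements with the same id:

        <%-- AddressBlock.ascx (hypothetical): the inner textbox opts out of naming-container mangling --%>
        <%@ Control Language="C#" ClassName="AddressBlock" %>
        <asp:TextBox runat="server" ID="txtStreet" ClientIDMode="Static" />

        <%-- Hosting page: both instances render an <input> with id="txtStreet", i.e. duplicate client ids --%>
        <%@ Register Src="~/AddressBlock.ascx" TagPrefix="uc" TagName="AddressBlock" %>
        <uc:AddressBlock runat="server" ID="ShippingAddress" />
        <uc:AddressBlock runat="server" ID="BillingAddress" />

    The name attribute used for postback stays unique, but any script that looks elements up by id now has two candidates, which is exactly the kind of conflict to watch for with Static.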
Note that if you assign Static to a data-bound control, like a list child control in templates, you do not get unique ids either, so for list controls where you rely on unique id for child controls, you’ll probably want to use Predictable rather than Static. I’ll write more on this a little later when I discuss ClientIDRowSuffix. Predictable The previous two values are pretty self-explanatory. Predictable however, requires some explanation. To me at least it’s not in the least bit predictable. MSDN defines this value as follows: This algorithm is used for controls that are in data-bound controls. The ClientID value is generated by concatenating the ClientID value of the parent naming container with the ID value of the control. If the control is a data-bound control that generates multiple rows, the value of the data field specified in the ClientIDRowSuffix property is added at the end. For the GridView control, multiple data fields can be specified. If the ClientIDRowSuffix property is blank, a sequential number is added at the end instead of a data-field value. Each segment is separated by an underscore character (_). The key that makes this value a bit confusing is that it relies on the parent NamingContainer’s ClientID to build its own ClientID value. This effectively means that the value is not predictable at all but rather very tightly coupled to the parent naming container’s ClientIDMode setting. For my simple textbox example, if the ClientIDMode property of the parent naming container (Page in this case) is set to “Predictable” you’ll get this: <input name="ctl00$content$txtName" type="text"         id="content_txtName" /> which gives an id that based on walking up to the currently active naming container (the MasterPage content container) and starting the id formatting from there downward. Think of this as a semi unique name that’s guaranteed unique only for the naming container. If, on the other hand, the Page is set to “AutoID” you get the following with Predictable on txtName: <input name="ctl00$content$txtName" type="text"         id="ctl00_content_txtName" /> The latter is effectively the same as if you specified AutoID because it inherits the AutoID naming from the Page and Content Master Page control of the page. But again - predictable behavior always depends on the parent naming container and how it generates its id, so the id may not always be exactly the same as the AutoID generated value because somewhere in the NamingContainer chain the ClientIDMode setting may be set to a different value. For example, if you had another naming container in the middle that was set to Static you’d end up effectively with an id that starts with the NamingContainers id rather than the whole ctl000_content munging. The most common use for Predictable is likely to be for data-bound controls, which results in each data bound item getting a unique ClientID. Unfortunately, even here the behavior can be very unpredictable depending on which data-bound control you use - I found significant differences in how template controls in a GridView behave from those that are used in a ListView control. For example, GridView creates clean child ClientIDs, while ListView still has a naming container in the ClientID, presumably because of the template container on which you can’t set ClientIDMode. Predictable is useful, but only if all naming containers down the chain use this setting. Otherwise you’re right back to the munged ids that are pretty unpredictable. 
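    Because Predictable only behaves predictably when the whole naming-container chain cooperates, a common approach (the summary further down mentions the same idea) is to make Static the site-wide default in web.config and override it only where per-row uniqueness is actually needed; a minimal sketch of the relevant fragment:

        <!-- web.config (sketch): Static becomes the default ClientIDMode for every page in the site -->
        <configuration>
          <system.web>
            <pages clientIDMode="Static" />
          </system.web>
        </configuration>

    Individual pages, naming containers or data-bound controls can then switch back to Predictable or AutoID where generated, unique ids are required.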
Another property, ClientIDRowSuffix, can be used in combination with ClientIDMode of Predictable to force a suffix onto list client controls. For example: <asp:GridView runat="server" ID="gvItems"              AutoGenerateColumns="false"             ClientIDMode="Static"              ClientIDRowSuffix="Id">     <Columns>     <asp:TemplateField>         <ItemTemplate>             <asp:Label runat="server" id="txtName"                        Text='<%# Eval("Name") %>'                   ClientIDMode="Predictable"/>         </ItemTemplate>     </asp:TemplateField>     <asp:TemplateField>         <ItemTemplate>         <asp:Label runat="server" id="txtId"                     Text='<%# Eval("Id") %>'                     ClientIDMode="Predictable" />         </ItemTemplate>     </asp:TemplateField>     </Columns>  </asp:GridView> generates client Ids inside of a column in the master page described earlier: <td>     <span id="txtName_0">Rick</span> </td> where the value after the underscore is the ClientIDRowSuffix field - in this case “Id” of the item data bound to the control. Note that all of the child controls require ClientIDMode=”Predictable” in order for the ClientIDRowSuffix to be applied, and the parent GridView controls need to be set to Static either explicitly or via Naming Container inheritance to give these simple names. It’s a bummer that ClientIDRowSuffix doesn’t work with Static to produce this automatically. Another real problem is that other controls process the ClientIDMode differently. For example, a ListView control processes the Predictable ClientIDMode differently and produces the following with the Static ListView and Predictable child controls: <span id="ctrl0_txtName_0">Rick</span> I couldn’t even figure out a way using ClientIDMode to get a simple ID that also uses a suffix short of falling back to manually generated ids using <%= %> expressions instead. Given the inconsistencies inside of list controls using <%= %>, ids for the ListView might not be a bad idea anyway. Inherit The final setting is Inherit, which is the default for all controls except Page. This means that controls by default inherit the parent naming container’s ClientIDMode setting. For more detailed information on ClientID behavior and different scenarios you can check out a blog post of mine on this subject: http://www.west-wind.com/weblog/posts/54760.aspx. ClientID Enhancements Summary The ClientIDMode property is a welcome addition to ASP.NET 4.0. To me this is probably the most useful WebForms feature as it allows me to generate clean IDs simply by setting ClientIDMode="Static" on either the page or inside of Web.config (in the Pages section) which applies the setting down to the entire page which is my 95% scenario. For the few cases when it matters - for list controls and inside of multi-use user controls or custom server controls) - I can use Predictable or even AutoID to force controls to unique names. For application-level page development, this is easy to accomplish and provides maximum usability for working with client script code against page controls. ViewStateMode Another area of large criticism for WebForms is ViewState. ViewState is used internally by ASP.NET to persist page-level changes to non-postback properties on controls as pages post back to the server. 
It’s a useful mechanism that works great for the overall mechanics of WebForms, but it can also cause all sorts of overhead for page operation as ViewState can very quickly get out of control and consume huge amounts of bandwidth in your page content. ViewState can also wreak havoc with client-side scripting applications that modify control properties that are tracked by ViewState, which can produce very unpredictable results on a Postback after client-side updates. Over the years in my own development, I’ve often turned off ViewState on pages to reduce overhead. Yes, you lose some functionality, but you can easily implement most of the common functionality in non-ViewState workarounds. Relying less on heavy ViewState controls and sticking with simpler controls or raw HTML constructs avoids getting around ViewState problems. In ASP.NET 3.x and prior, it wasn’t easy to control ViewState - you could turn it on or off and if you turned it off at the page or web.config level, you couldn’t turn it back on for specific controls. In short, it was an all or nothing approach. With ASP.NET 4.0, the new ViewStateMode property gives you more control. It allows you to disable ViewState globally either on the page or web.config level and then turn it back on for specific controls that might need it. ViewStateMode only works when EnableViewState="true" on the page or web.config level (which is the default). You can then use ViewStateMode of Disabled, Enabled or Inherit to control the ViewState settings on the page. If you’re shooting for minimal ViewState usage, the ideal situation is to set ViewStateMode to disabled on the Page or web.config level and only turn it back on particular controls: <%@Page Language="C#"      CodeBehind="WebForm2.aspx.cs"     Inherits="Westwind.WebStore.WebForm2"        ClientIDMode="Static"                ViewStateMode="Disabled"     EnableViewState="true"  %> <!-- this control has viewstate  --> <asp:TextBox runat="server" ID="txtName"  ViewStateMode="Enabled" />       <!-- this control has no viewstate - it inherits  from parent container --> <asp:TextBox runat="server" ID="txtAddress" /> Note that the EnableViewState="true" at the Page level isn’t required since it’s the default, but it’s important that the value is true. ViewStateMode has no effect if EnableViewState="false" at the page level. The main benefit of ViewStateMode is that it allows you to more easily turn off ViewState for most of the page and enable only a few key controls that might need it. For me personally, this is a perfect combination as most of my WebForm apps can get away without any ViewState at all. But some controls - especially third party controls - often don’t work well without ViewState enabled, and now it’s much easier to selectively enable controls rather than the old way, which required you to pretty much turn off ViewState for all controls that you didn’t want ViewState on. Inline HTML Encoding HTML encoding is an important feature to prevent cross-site scripting attacks in data entered by users on your site. In order to make it easier to create HTML encoded content, ASP.NET 4.0 introduces a new Expression syntax using <%: %> to encode string values. 
The encoding expression syntax looks like this: <%: "<script type='text/javascript'>" +     "alert('Really?');</script>" %> which produces properly encoded HTML: &lt;script type=&#39;text/javascript&#39; &gt;alert(&#39;Really?&#39;);&lt;/script&gt; Effectively this is a shortcut to: <%= HttpUtility.HtmlEncode( "<script type='text/javascript'>" + "alert('Really?');</script>") %> Of course the <%: %> syntax can also evaluate expressions just like <%= %> so the more common scenario applies this expression syntax against data your application is displaying. Here’s an example displaying some data model values: <%: Model.Address.Street %> This snippet shows displaying data from your application’s data store or more importantly, from data entered by users. Anything that makes it easier and less verbose to HtmlEncode text is a welcome addition to avoid potential cross-site scripting attacks. Although I listed Inline HTML Encoding here under WebForms, anything that uses the WebForms rendering engine including ASP.NET MVC, benefits from this feature. ScriptManager Enhancements The ASP.NET ScriptManager control in the past has introduced some nice ways to take programmatic and markup control over script loading, but there were a number of shortcomings in this control. The ASP.NET 4.0 ScriptManager has a number of improvements that make it easier to control script loading and addresses a few of the shortcomings that have often kept me from using the control in favor of manual script loading. The first is the AjaxFrameworkMode property which finally lets you suppress loading the ASP.NET AJAX runtime. Disabled doesn’t load any ASP.NET AJAX libraries, but there’s also an Explicit mode that lets you pick and choose the library pieces individually and reduce the footprint of ASP.NET AJAX script included if you are using the library. There’s also a new EnableCdn property that forces any script that has a new WebResource attribute CdnPath property set to a CDN supplied URL. If the script has this Attribute property set to a non-null/empty value and EnableCdn is enabled on the ScriptManager, that script will be served from the specified CdnPath. [assembly: WebResource(    "Westwind.Web.Resources.ww.jquery.js",    "application/x-javascript",    CdnPath =  "http://mysite.com/scripts/ww.jquery.min.js")] Cool, but a little too static for my taste since this value can’t be changed at runtime to point at a debug script as needed, for example. Assembly names for loading scripts from resources can now be simple names rather than fully qualified assembly names, which make it less verbose to reference scripts from assemblies loaded from your bin folder or the assembly reference area in web.config: <asp:ScriptManager runat="server" id="Id"          EnableCdn="true"         AjaxFrameworkMode="disabled">     <Scripts>         <asp:ScriptReference          Name="Westwind.Web.Resources.ww.jquery.js"         Assembly="Westwind.Web" />     </Scripts>        </asp:ScriptManager> The ScriptManager in 4.0 also supports script combining via the CompositeScript tag, which allows you to very easily combine scripts into a single script resource served via ASP.NET. Even nicer: You can specify the URL that the combined script is served with. 
Check out the following script manager markup that combines several static file scripts and a script resource into a single ASP.NET served resource from a static URL (allscripts.js): <asp:ScriptManager runat="server" id="Id"          EnableCdn="true"         AjaxFrameworkMode="disabled">     <CompositeScript          Path="~/scripts/allscripts.js">         <Scripts>             <asp:ScriptReference                    Path="~/scripts/jquery.js" />             <asp:ScriptReference                    Path="~/scripts/ww.jquery.js" />             <asp:ScriptReference            Name="Westwind.Web.Resources.editors.js"                 Assembly="Westwind.Web" />         </Scripts>     </CompositeScript> </asp:ScriptManager> When you render this into HTML, you’ll see a single script reference in the page: <script src="scripts/allscripts.debug.js"          type="text/javascript"></script> All you need to do to make this work is ensure that allscripts.js and allscripts.debug.js exist in the scripts folder of your application - they can be empty but the file has to be there. This is pretty cool, but you want to be real careful that you use unique URLs for each combination of scripts you combine or else browser and server caching will easily screw you up royally. The script manager also allows you to override native ASP.NET AJAX scripts now as any script references defined in the Scripts section of the ScriptManager trump internal references. So if you want custom behavior or you want to fix a possible bug in the core libraries that normally are loaded from resources, you can now do this simply by referencing the script resource name in the Name property and pointing at System.Web for the assembly. Not a common scenario, but when you need it, it can come in real handy. Still, there are a number of shortcomings in this control. For one, the ScriptManager and ClientScript APIs still have no common entry point so control developers are still faced with having to check and support both APIs to load scripts so that controls can work on pages that do or don’t have a ScriptManager on the page. The CdnUrl is static and compiled in, which is very restrictive. And finally, there’s still no control over where scripts get loaded on the page - ScriptManager still injects scripts into the middle of the HTML markup rather than in the header or optionally the footer. This, in turn, means there is little control over script loading order, which can be problematic for control developers. MetaDescription, MetaKeywords Page Properties There are also a number of additional Page properties that correspond to some of the other features discussed in this column: ClientIDMode, ClientTarget and ViewStateMode. Another minor but useful feature is that you can now directly access the MetaDescription and MetaKeywords properties on the Page object to set the corresponding meta tags programmatically. Updating these values programmatically previously required either <%= %> expressions in the page markup or dynamic insertion of literal controls into the page. 
You can now just set these properties programmatically on the Page object in any Control derived class on the page or the Page itself: Page.MetaKeywords = "ASP.NET,4.0,New Features"; Page.MetaDescription = "This article discusses the new features in ASP.NET 4.0"; Note, that there’s no corresponding ASP.NET tag for the HTML Meta element, so the only way to specify these values in markup and access them is via the @Page tag: <%@Page Language="C#"      CodeBehind="WebForm2.aspx.cs"     Inherits="Westwind.WebStore.WebForm2"      ClientIDMode="Static"                MetaDescription="Article that discusses what's                      new in ASP.NET 4.0"     MetaKeywords="ASP.NET,4.0,New Features" %> Nothing earth shattering but quite convenient. Visual Studio 2010 Enhancements for Web Development For Web development there are also a host of editor enhancements in Visual Studio 2010. Some of these are not Web specific but they are useful for Web developers in general. Text Editors Throughout Visual Studio 2010, the text editors have all been updated to a new core engine based on WPF which provides some interesting new features for various code editors including the nice ability to zoom in and out with Ctrl-MouseWheel to quickly change the size of text. There are many more API options to control the editor and although Visual Studio 2010 doesn’t yet use many of these features, we can look forward to enhancements in add-ins and future editor updates from the various language teams that take advantage of the visual richness that WPF provides to editing. On the negative side, I’ve noticed that occasionally the code editor and especially the HTML and JavaScript editors will lose the ability to use various navigation keys like arrows, back and delete keys, which requires closing and reopening the documents at times. This issue seems to be well documented so I suspect this will be addressed soon with a hotfix or within the first service pack. Overall though, the code editors work very well, especially given that they were re-written completely using WPF, which was one of my big worries when I first heard about the complete redesign of the editors. Multi-Targeting Visual Studio now targets all versions of the .NET framework from 2.0 forward. You can use Visual Studio 2010 to work on your ASP.NET 2, 3.0 and 3.5 applications which is a nice way to get your feet wet with the new development environment without having to make changes to existing applications. It’s nice to have one tool to work in for all the different versions. Multi-Monitor Support One cool feature of Visual Studio 2010 is the ability to drag windows out of the Visual Studio environment and out onto the desktop including onto another monitor easily. Since Web development often involves working with a host of designers at the same time - visual designer, HTML markup window, code behind and JavaScript editor - it’s really nice to be able to have a little more screen real estate to work on each of these editors. Microsoft made a welcome change in the environment. IntelliSense Snippets for HTML and JavaScript Editors The HTML and JavaScript editors now finally support IntelliSense scripts to create macro-based template expansions that have been in the core C# and Visual Basic code editors since Visual Studio 2005. Snippets allow you to create short XML-based template definitions that can act as static macros or real templates that can have replaceable values that can be embedded into the expanded text. 
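    To make that concrete, here is roughly what a minimal hand-written HTML snippet file can look like (a sketch assuming the standard Visual Studio .snippet XML schema; the shortcut, the literal and the expanded markup are invented for the example):

        <?xml version="1.0" encoding="utf-8"?>
        <CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
          <CodeSnippet Format="1.0.0">
            <Header>
              <!-- Title shows up in the snippet picker; Shortcut is the text you type to expand it -->
              <Title>div with id</Title>
              <Shortcut>divid</Shortcut>
              <Description>Expands to a div with a replaceable id</Description>
              <SnippetTypes>
                <SnippetType>Expansion</SnippetType>
              </SnippetTypes>
            </Header>
            <Snippet>
              <Declarations>
                <!-- $id$ below is the replaceable value filled in when the snippet expands -->
                <Literal>
                  <ID>id</ID>
                  <Default>myDiv</Default>
                </Literal>
              </Declarations>
              <Code Language="html"><![CDATA[<div id="$id$">$end$</div>]]></Code>
            </Snippet>
          </CodeSnippet>
        </CodeSnippets>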
    The XML syntax for these snippets is straightforward and it’s pretty easy to create custom snippets manually. You can easily create snippets using XML and store them in your custom snippets folder (C:\Users\rstrahl\Documents\Visual Studio 2010\Code Snippets\Visual Web Developer\My HTML Snippets and My JScript Snippets), but it helps to use one of the third-party tools that exist to simplify the process for you. I use SnippetEditor, by Bill McCarthy, which makes short work of creating snippets interactively (http://snippeteditor.codeplex.com/). Note: You may have to manually add the Visual Studio 2010 user-specific snippet folders to this tool to see existing ones you’ve created. Code snippets are some of the biggest time savers, and HTML editing more than anything deals with lots of repetitive tasks that lend themselves to text expansion. Visual Studio 2010 includes a slew of built-in snippets (that you can also customize!) and you can create your own very easily. If you haven’t done so already, I encourage you to spend a little time examining your coding patterns, find the repetitive code that you write and convert it into snippets. I’ve been using CodeRush for this for years, but now you can do much of the basic expansion natively for HTML and JavaScript snippets. jQuery Integration Is Now Native jQuery is a popular JavaScript library, and Microsoft has recently stated that it will become the primary client-side scripting technology to drive higher level script functionality in various ASP.NET Web projects that Microsoft provides. In Visual Studio 2010, the default full project template includes jQuery as part of a new project, including the support files that provide IntelliSense (-vsdoc files). IntelliSense support for jQuery is now also baked into Visual Studio 2010, so unlike Visual Studio 2008, which required a separate download, no further installs are required for a rich IntelliSense experience with jQuery. Summary ASP.NET 4.0 brings many useful improvements to the platform, but thankfully most of the changes are incremental changes that don’t compromise backwards compatibility, and they allow developers to ease into the new features one feature at a time. None of the changes in ASP.NET 4.0 or Visual Studio 2010 are monumental or game changers. The bigger features are language and .NET Framework changes that are also optional. This ASP.NET and tools release feels more like fine tuning and getting some long-standing kinks worked out of the platform. It shows that the ASP.NET team is dedicated to paying attention to community feedback and responding with changes to the platform and development environment based on this feedback. If you haven’t gotten your feet wet with ASP.NET 4.0 and Visual Studio 2010, there’s no reason not to give it a shot now - the ASP.NET 4.0 platform is solid and Visual Studio 2010 works very well for a brand new release. Check it out. © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET

    Read the article

  • Where is my object allocation and memory leak in this iPhone/Objective-C code?

    - by Spottswoode
    Hello, I'm still a rookie when it comes to this programming gig and was wondering if someone could help me smooth out this code. Functionally, the code works great and does what I need it to do. But when I run the performance tool the allocation graph peaks, the CPU load is high, there's a leak(s), and I've also confirmed when running on my iPhone it seems noticeably slower then the rest of the components in my app. I'd appreciate any advice/tips/help anyone could give me. :) Thanks in advance! .h file // // Time_CalculatorViewController.h // Time Calculator // // Created by Adam Soloway on 2/19/10. // Copyright Legacy Pilots 2010. All rights reserved. // #import <UIKit/UIKit.h> @interface Time_CalculatorViewController : UIViewController { //BOOL moveViewUp; //CGFloat scrollAmount; IBOutlet UILabel *hoursLabel; IBOutlet UILabel *minutesLabel; IBOutlet UILabel *hoursDecimalLabel; IBOutlet UILabel *minutesDecimalLabel; IBOutlet UILabel *errorLabel; IBOutlet UITextField *minTextField1; IBOutlet UITextField *minTextField2; IBOutlet UITextField *minTextField3; IBOutlet UITextField *minTextField4; IBOutlet UITextField *minTextField5; IBOutlet UITextField *minTextField6; IBOutlet UITextField *minTextField7; IBOutlet UITextField *minTextField8; IBOutlet UITextField *minTextField9; IBOutlet UITextField *minTextField10; IBOutlet UITextField *hourTextField1; IBOutlet UITextField *hourTextField2; IBOutlet UITextField *hourTextField3; IBOutlet UITextField *hourTextField4; IBOutlet UITextField *hourTextField5; IBOutlet UITextField *hourTextField6; IBOutlet UITextField *hourTextField7; IBOutlet UITextField *hourTextField8; IBOutlet UITextField *hourTextField9; IBOutlet UITextField *hourTextField10; IBOutlet UIButton *resetAll; NSString *minutesString1; NSString *minutesString2; NSString *minutesString3; NSString *minutesString4; NSString *minutesString5; NSString *minutesString6; NSString *minutesString7; NSString *minutesString8; NSString *minutesString9; NSString *minutesString10; NSString *hoursString1; NSString *hoursString2; NSString *hoursString3; NSString *hoursString4; NSString *hoursString5; NSString *hoursString6; NSString *hoursString7; NSString *hoursString8; NSString *hoursString9; NSString *hoursString10; int hourDecimalNumber; int totalTime; int leftOverMinutes; int minuteNumber1; int minuteNumber2; int minuteNumber3; int minuteNumber4; int minuteNumber5; int minuteNumber6; int minuteNumber7; int minuteNumber8; int minuteNumber9; int minuteNumber10; int hourNumber1; int hourNumber2; int hourNumber3; int hourNumber4; int hourNumber5; int hourNumber6; int hourNumber7; int hourNumber8; int hourNumber9; int hourNumber10; } //- (void)scrollTheView:(BOOL)movedUp; - (void)calculateTime; - (IBAction)resetAllValues; @end .m file // // Time_CalculatorViewController.m // Time Calculator // // Created by Adam Soloway on 2/19/10. // Copyright Legacy Pilots 2010. All rights reserved. 
// #import "Time_CalculatorViewController.h" @implementation Time_CalculatorViewController - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { if( minTextField1.editing || minTextField2.editing || minTextField3.editing || minTextField4.editing || minTextField5.editing || minTextField6.editing || minTextField7.editing || minTextField8.editing || minTextField9.editing || minTextField10.editing || hourTextField1.editing || hourTextField2.editing || hourTextField3.editing || hourTextField4.editing || hourTextField5.editing || hourTextField6.editing || hourTextField7.editing || hourTextField8.editing || hourTextField9.editing || hourTextField10.editing) { [minTextField1 resignFirstResponder]; [minTextField2 resignFirstResponder]; [minTextField3 resignFirstResponder]; [minTextField4 resignFirstResponder]; [minTextField5 resignFirstResponder]; [minTextField6 resignFirstResponder]; [minTextField7 resignFirstResponder]; [minTextField8 resignFirstResponder]; [minTextField9 resignFirstResponder]; [minTextField10 resignFirstResponder]; [hourTextField1 resignFirstResponder]; [hourTextField2 resignFirstResponder]; [hourTextField3 resignFirstResponder]; [hourTextField4 resignFirstResponder]; [hourTextField5 resignFirstResponder]; [hourTextField6 resignFirstResponder]; [hourTextField7 resignFirstResponder]; [hourTextField8 resignFirstResponder]; [hourTextField9 resignFirstResponder]; [hourTextField10 resignFirstResponder]; [self calculateTime]; //if (moveViewUp) [self scrollTheView:NO]; } [super touchesBegan:touches withEvent:event]; } /* // The designated initializer. Override to perform setup that is required before the view is loaded. - (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil { if (self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]) { // Custom initialization } return self; } */ /* // Implement loadView to create a view hierarchy programmatically, without using a nib. - (void)loadView { } */ // Implement viewDidLoad to do additional setup after loading the view, typically from a nib. - (void)viewDidLoad { [super viewDidLoad]; } /* // Override to allow orientations other than the default portrait orientation. - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation { // Return YES for supported orientations return (interfaceOrientation == UIInterfaceOrientationPortrait); } */ - (void)didReceiveMemoryWarning { // Releases the view if it doesn't have a superview. [super didReceiveMemoryWarning]; // Release any cached data, images, etc that aren't in use. } - (void)viewDidUnload { // Release any retained subviews of the main view. // e.g. 
self.myOutlet = nil; } - (void)dealloc { [minutesString1 release]; [minutesString2 release]; [minutesString3 release]; [minutesString4 release]; [minutesString5 release]; [minutesString6 release]; [minutesString7 release]; [minutesString8 release]; [minutesString9 release]; [minutesString10 release]; [hoursString1 release]; [hoursString2 release]; [hoursString3 release]; [hoursString4 release]; [hoursString5 release]; [hoursString6 release]; [hoursString7 release]; [hoursString8 release]; [hoursString9 release]; [hoursString10 release]; [super dealloc]; } -(BOOL)textFieldShouldReturn:(UITextField *)theTextField { //[minTextField10 resignFirstResponder]; //if (moveViewUp) [self scrollTheView:NO]; [self calculateTime]; return YES; } - (IBAction)resetAllValues { minTextField1.text = 0; minTextField2.text = 0; minTextField3.text = 0; minTextField4.text = 0; minTextField5.text = 0; minTextField6.text = 0; minTextField7.text = 0; minTextField8.text = 0; minTextField9.text = 0; minTextField10.text = 0; hourTextField1.text = 0; hourTextField2.text = 0; hourTextField3.text = 0; hourTextField4.text = 0; hourTextField5.text = 0; hourTextField6.text = 0; hourTextField7.text = 0; hourTextField8.text = 0; hourTextField9.text = 0; hourTextField10.text = 0; totalTime = 0; leftOverMinutes = 0; hoursLabel.text = [NSString stringWithFormat:@"0"]; hourDecimalNumber = 0; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; minutesDecimalLabel.text = [NSString stringWithFormat:@"0"]; self.calculateTime; } - (void)calculateTime { minutesString1 = minTextField1.text; minutesString2 = minTextField2.text; minutesString3 = minTextField3.text; minutesString4 = minTextField4.text; minutesString5 = minTextField5.text; minutesString6 = minTextField6.text; minutesString7 = minTextField7.text; minutesString8 = minTextField8.text; minutesString9 = minTextField9.text; minutesString10 = minTextField10.text; hoursString1 = hourTextField1.text; hoursString2 = hourTextField2.text; hoursString3 = hourTextField3.text; hoursString4 = hourTextField4.text; hoursString5 = hourTextField5.text; hoursString6 = hourTextField6.text; hoursString7 = hourTextField7.text; hoursString8 = hourTextField8.text; hoursString9 = hourTextField9.text; hoursString10 = hourTextField10.text; minuteNumber1 = [minutesString1 intValue]; minuteNumber2 = [minutesString2 intValue]; minuteNumber3 = [minutesString3 intValue]; minuteNumber4 = [minutesString4 intValue]; minuteNumber5 = [minutesString5 intValue]; minuteNumber6 = [minutesString6 intValue]; minuteNumber7 = [minutesString7 intValue]; minuteNumber8 = [minutesString8 intValue]; minuteNumber9 = [minutesString9 intValue]; minuteNumber10 = [minutesString10 intValue]; hourNumber1 = ([hoursString1 intValue] * 60); hourNumber2 = ([hoursString2 intValue] * 60); hourNumber3 = ([hoursString3 intValue] * 60); hourNumber4 = ([hoursString4 intValue] * 60); hourNumber5 = ([hoursString5 intValue] * 60); hourNumber6 = ([hoursString6 intValue] * 60); hourNumber7 = ([hoursString7 intValue] * 60); hourNumber8 = ([hoursString8 intValue] * 60); hourNumber9 = ([hoursString9 intValue] * 60); hourNumber10 = ([hoursString10 intValue] * 60); totalTime = (hourNumber1 + hourNumber2 +hourNumber3 +hourNumber4 +hourNumber5 +hourNumber6 +hourNumber7 +hourNumber8 +hourNumber9 +hourNumber10 + minuteNumber1 + minuteNumber2 + minuteNumber3 + minuteNumber4 + minuteNumber5 +minuteNumber6 + minuteNumber7 + minuteNumber8 + minuteNumber9 + minuteNumber10); if (totalTime <= 59) { leftOverMinutes = totalTime; 
hoursLabel.text = [NSString stringWithFormat:@"0"]; hourDecimalNumber = 0; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >59 && totalTime <= 119){ leftOverMinutes = totalTime - 60; hoursLabel.text = [NSString stringWithFormat:@"1"]; hourDecimalNumber = 1; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >119 && totalTime <= 179){ leftOverMinutes = totalTime - 120; hoursLabel.text = [NSString stringWithFormat:@"2"]; hourDecimalNumber = 2; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >179 && totalTime <= 239){ leftOverMinutes = totalTime - 180; hoursLabel.text = [NSString stringWithFormat:@"3"]; hourDecimalNumber = 3; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >239 && totalTime <= 299){ leftOverMinutes = totalTime - 240; hoursLabel.text = [NSString stringWithFormat:@"4"]; hourDecimalNumber = 4; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >299 && totalTime <= 359){ leftOverMinutes = totalTime - 300; hoursLabel.text = [NSString stringWithFormat:@"5"]; hourDecimalNumber = 5; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >359 && totalTime <= 419){ leftOverMinutes = totalTime - 360; hoursLabel.text = [NSString stringWithFormat:@"6"]; hourDecimalNumber = 6; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >419 && totalTime <= 479){ leftOverMinutes = totalTime - 420; hoursLabel.text = [NSString stringWithFormat:@"7"]; hourDecimalNumber = 7; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >479 && totalTime <= 539){ leftOverMinutes = totalTime - 480; hoursLabel.text = [NSString stringWithFormat:@"8"]; hourDecimalNumber = 8; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >539 && totalTime <= 599){ leftOverMinutes = totalTime - 540; hoursLabel.text = [NSString stringWithFormat:@"9"]; hourDecimalNumber = 9; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >599 && totalTime <= 659){ leftOverMinutes = totalTime - 600; hoursLabel.text = [NSString stringWithFormat:@"10"]; hourDecimalNumber = 10; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >659 && totalTime <= 719){ leftOverMinutes = totalTime - 660; hoursLabel.text = [NSString stringWithFormat:@"11"]; hourDecimalNumber = 11; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >719 && totalTime <= 779){ leftOverMinutes = totalTime - 720; hoursLabel.text = [NSString stringWithFormat:@"12"]; hourDecimalNumber = 12; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >779 && totalTime <= 839){ leftOverMinutes = totalTime - 780; hoursLabel.text = [NSString stringWithFormat:@"13"]; hourDecimalNumber = 13; hoursDecimalLabel.text = [NSString 
stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >839 && totalTime <= 899){ leftOverMinutes = totalTime - 840; hoursLabel.text = [NSString stringWithFormat:@"14"]; hourDecimalNumber = 14; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >899 && totalTime <= 959){ leftOverMinutes = totalTime - 900; hoursLabel.text = [NSString stringWithFormat:@"15"]; hourDecimalNumber = 15; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >959 && totalTime <= 1019){ leftOverMinutes = totalTime - 960; hoursLabel.text = [NSString stringWithFormat:@"16"]; hourDecimalNumber = 16; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1019 && totalTime <= 1079){ leftOverMinutes = totalTime - 1020; hoursLabel.text = [NSString stringWithFormat:@"17"]; hourDecimalNumber = 17; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1079 && totalTime <= 1139){ leftOverMinutes = totalTime - 1080; hoursLabel.text = [NSString stringWithFormat:@"18"]; hourDecimalNumber = 18; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1139 && totalTime <= 1199){ leftOverMinutes = totalTime - 1140; hoursLabel.text = [NSString stringWithFormat:@"19"]; hourDecimalNumber = 19; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1199 && totalTime <= 1259){ leftOverMinutes = totalTime - 1200; hoursLabel.text = [NSString stringWithFormat:@"20"]; hourDecimalNumber = 20; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1259 && totalTime <= 1319){ leftOverMinutes = totalTime - 1260; hoursLabel.text = [NSString stringWithFormat:@"21"]; hourDecimalNumber = 21; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1319 && totalTime <= 1379){ leftOverMinutes = totalTime - 1320; hoursLabel.text = [NSString stringWithFormat:@"22"]; hourDecimalNumber = 22; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1379 && totalTime <= 1439){ leftOverMinutes = totalTime - 1380; hoursLabel.text = [NSString stringWithFormat:@"23"]; hourDecimalNumber = 23; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1439 && totalTime <= 1499){ leftOverMinutes = totalTime - 1440; hoursLabel.text = [NSString stringWithFormat:@"24"]; hourDecimalNumber = 24; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1499 && totalTime <= 1559){ leftOverMinutes = totalTime - 1500; hoursLabel.text = [NSString stringWithFormat:@"25"]; hourDecimalNumber = 25; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1559 && totalTime <= 1619){ leftOverMinutes = totalTime - 1560; hoursLabel.text = [NSString stringWithFormat:@"26"]; hourDecimalNumber = 26; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else 
if (totalTime >1619 && totalTime <= 1679){ leftOverMinutes = totalTime - 1620; hoursLabel.text = [NSString stringWithFormat:@"27"]; hourDecimalNumber = 27; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1679 && totalTime <= 1739){ leftOverMinutes = totalTime - 1680; hoursLabel.text = [NSString stringWithFormat:@"28"]; hourDecimalNumber = 28; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1739 && totalTime <= 1799){ leftOverMinutes = totalTime - 1740; hoursLabel.text = [NSString stringWithFormat:@"29"]; hourDecimalNumber = 29; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1799 && totalTime <= 1859){ leftOverMinutes = totalTime - 1800; hoursLabel.text = [NSString stringWithFormat:@"30"]; hourDecimalNumber = 30; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; errorLabel.hidden = TRUE; } else if (totalTime >1859){ hoursLabel.text = [NSString stringWithFormat:@"Error"]; hoursDecimalLabel.text = [NSString stringWithFormat:@"Error"]; errorLabel.hidden = FALSE; } //Minutes Label if (leftOverMinutes < 10) { minutesLabel.text = [NSString stringWithFormat:@"0%d", leftOverMinutes]; } else minutesLabel.text = [NSString stringWithFormat:@"%d", leftOverMinutes]; //Minutes Decimal Label if (leftOverMinutes >=0 && leftOverMinutes <=2) { minutesDecimalLabel.text = [NSString stringWithFormat:@"0"]; } else if (leftOverMinutes >=3 && leftOverMinutes <=8){ minutesDecimalLabel.text = [NSString stringWithFormat:@"1"]; } else if (leftOverMinutes >=9 && leftOverMinutes <=14){ minutesDecimalLabel.text = [NSString stringWithFormat:@"2"]; } else if (leftOverMinutes >=15 && leftOverMinutes <=20){ minutesDecimalLabel.text = [NSString stringWithFormat:@"3"]; } else if (leftOverMinutes >=21 && leftOverMinutes <=26){ minutesDecimalLabel.text = [NSString stringWithFormat:@"4"]; } else if (leftOverMinutes >=27 && leftOverMinutes <=32){ minutesDecimalLabel.text = [NSString stringWithFormat:@"5"]; } else if (leftOverMinutes >=33 && leftOverMinutes <=38){ minutesDecimalLabel.text = [NSString stringWithFormat:@"6"]; } else if (leftOverMinutes >=39 && leftOverMinutes <=44){ minutesDecimalLabel.text = [NSString stringWithFormat:@"7"]; } else if (leftOverMinutes >=45 && leftOverMinutes <=50){ minutesDecimalLabel.text = [NSString stringWithFormat:@"8"]; } else if (leftOverMinutes >=51 && leftOverMinutes <=56){ minutesDecimalLabel.text = [NSString stringWithFormat:@"9"]; } else if (leftOverMinutes >=57 && leftOverMinutes <=60){ minutesDecimalLabel.text = [NSString stringWithFormat:@"0"]; hourDecimalNumber = hourDecimalNumber + 1; hoursDecimalLabel.text = [NSString stringWithFormat:@"%i", hourDecimalNumber]; } } @end
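Since the question asks how to smooth the code out: everything the long if/else ladder in calculateTime works out can be obtained directly from integer division and modulo, and the tenths-of-an-hour digit is just the leftover minutes divided by 6 and rounded. Below is a minimal sketch of that arithmetic, written in Java for compactness rather than Objective-C; the TimeBreakdown name is made up for the illustration, and the rounding rule is read off the minute ranges in the question.

    // Sketch only: hypothetical TimeBreakdown type illustrating the arithmetic
    // behind the if/else ladder in calculateTime.
    public final class TimeBreakdown {
        public final int hours;           // whole hours (hoursLabel / hourDecimalNumber)
        public final int leftOverMinutes; // 0..59 (minutesLabel)
        public final int tenths;          // tenths of an hour, 0..9 (minutesDecimalLabel)

        private TimeBreakdown(int hours, int leftOverMinutes, int tenths) {
            this.hours = hours;
            this.leftOverMinutes = leftOverMinutes;
            this.tenths = tenths;
        }

        public static TimeBreakdown fromTotalMinutes(int totalMinutes) {
            int hours = totalMinutes / 60;                 // replaces the hour-by-hour branches
            int minutes = totalMinutes % 60;               // leftover minutes
            int tenths = (int) Math.round(minutes / 6.0);  // 6 minutes = 0.1 hour
            if (tenths == 10) {                            // 57..59 minutes round up to the next hour
                tenths = 0;
                hours++;
            }
            return new TimeBreakdown(hours, minutes, tenths);
        }
    }

The same three expressions drop straight into the Objective-C calculateTime method, the existing error case (totalTime > 1859) stays as a single comparison before the conversion, and collapsing the twenty numbered text fields and strings into one collection of outlets would remove most of the remaining duplication.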


  • Problem deploying portlets on JBoss Portal 2.7.2: Not a canonical value

    - by Dee
    I just downloaded the JBoss Portal Server 2.7.2 (JBoss Portal + JBoss AS 4.2.3 bundle to be precise) and tried deploying portlets just as the SimpleHelloWorld provided in the samples. The portlet gets deployed fine but when I put it on a page I get the following exception. I tried adding other Portlets as well (such as the booking MVC portelt supplied with Spring WebFlow dist) but the same problem happens. The problem happens when the new instances are created by me, example when i create a new instance of CMS Portlet, I get the same error. If I use an existing instance it works. If I deploy a portlet that creates an instance using the "portle-instances.xml" then it works fine, but creating additional instances using the Admin and deploying them on page fails due to the following error. What am I doing wrong? Can anyone please help? javax.servlet.ServletException: java.lang.IllegalArgumentException: Not a canonical value SimplestHelloWorldWindow org.jboss.portal.server.servlet.PortalServlet.service(PortalServlet.java:278) javax.servlet.http.HttpServlet.service(HttpServlet.java:803) org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96) root cause java.lang.IllegalArgumentException: Not a canonical value SimplestHelloWorldWindow org.jboss.portal.core.model.portal.PortalObjectPath$CanonicalFormat.parse(PortalObjectPath.java:357) org.jboss.portal.core.model.portal.PortalObjectPath.<init>(PortalObjectPath.java:161) org.jboss.portal.core.model.portal.PortalObjectPath.parse(PortalObjectPath.java:314) org.jboss.portal.core.model.portal.PortalObjectId.parse(PortalObjectId.java:158) org.jboss.portal.core.model.portal.PortalObjectId.parse(PortalObjectId.java:143) org.jboss.portal.core.model.portal.navstate.PortalObjectNavigationalStateContext.createWindowKey(PortalObjectNavigationalStateContext.java:299) org.jboss.portal.core.model.portal.navstate.PortalObjectNavigationalStateContext.getWindowNavigationalState(PortalObjectNavigationalStateContext.java:194) org.jboss.portal.core.controller.portlet.ControllerPageNavigationalState.getPortletWindowNavigationalState(ControllerPageNavigationalState.java:230) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.getPortletNavigationalState(RenderWindowCommand.java:121) org.jboss.portal.core.impl.model.content.InternalContentProvider.renderWindow(InternalContentProvider.java:211) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.execute(RenderWindowCommand.java:100) org.jboss.portal.core.controller.ControllerCommand$1.invoke(ControllerCommand.java:68) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.aspects.controller.node.EventBroadcasterInterceptor.invoke(EventBroadcasterInterceptor.java:124) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PageCustomizerInterceptor.invoke(PageCustomizerInterceptor.java:134) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PolicyEnforcementInterceptor.invoke(PolicyEnforcementInterceptor.java:78) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) 
org.jboss.portal.core.aspects.controller.node.PortalNodeInterceptor.invoke(PortalNodeInterceptor.java:81) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.BackwardCompatibilityInterceptor.invoke(BackwardCompatibilityInterceptor.java:48) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ControlInterceptor.invoke(ControlInterceptor.java:56) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.NavigationalStateInterceptor.invoke(NavigationalStateInterceptor.java:42) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.controller.ajax.AjaxInterceptor.invoke(AjaxInterceptor.java:55) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ResourceAcquisitionInterceptor.invoke(ResourceAcquisitionInterceptor.java:50) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.core.controller.ControllerContext.execute(ControllerContext.java:134) org.jboss.portal.core.model.portal.command.render.RenderWindowCommand.render(RenderWindowCommand.java:80) org.jboss.portal.core.model.portal.command.render.RenderPageCommand.execute(RenderPageCommand.java:222) org.jboss.portal.core.controller.ControllerCommand$1.invoke(ControllerCommand.java:68) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.aspects.controller.node.EventBroadcasterInterceptor.invoke(EventBroadcasterInterceptor.java:124) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PageCustomizerInterceptor.invoke(PageCustomizerInterceptor.java:134) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.PolicyEnforcementInterceptor.invoke(PolicyEnforcementInterceptor.java:78) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.node.PortalNodeInterceptor.invoke(PortalNodeInterceptor.java:81) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.BackwardCompatibilityInterceptor.invoke(BackwardCompatibilityInterceptor.java:48) 
org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ControlInterceptor.invoke(ControlInterceptor.java:56) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.NavigationalStateInterceptor.invoke(NavigationalStateInterceptor.java:42) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.controller.ajax.AjaxInterceptor.invoke(AjaxInterceptor.java:55) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.controller.ResourceAcquisitionInterceptor.invoke(ResourceAcquisitionInterceptor.java:50) org.jboss.portal.core.controller.ControllerInterceptor.invoke(ControllerInterceptor.java:40) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.core.controller.ControllerContext.execute(ControllerContext.java:134) org.jboss.portal.core.model.portal.PortalObjectResponseHandler.processCommandResponse(PortalObjectResponseHandler.java:80) org.jboss.portal.core.controller.classic.ClassicResponseHandler.processHandlers(ClassicResponseHandler.java:78) org.jboss.portal.core.controller.classic.ClassicResponseHandler.processCommandResponse(ClassicResponseHandler.java:53) org.jboss.portal.core.controller.handler.ResponseHandlerSelector.processCommandResponse(ResponseHandlerSelector.java:70) org.jboss.portal.core.controller.Controller.processCommandResponse(Controller.java:315) org.jboss.portal.core.controller.Controller.processCommand(Controller.java:303) org.jboss.portal.core.controller.Controller.handle(Controller.java:261) org.jboss.portal.server.RequestControllerDispatcher.invoke(RequestControllerDispatcher.java:51) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:131) org.jboss.portal.core.cms.aspect.IdentityBindingInterceptor.invoke(IdentityBindingInterceptor.java:47) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.server.ContentTypeInterceptor.invoke(ContentTypeInterceptor.java:68) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.PortalContextPathInterceptor.invoke(PortalContextPathInterceptor.java:45) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.LocaleInterceptor.invoke(LocaleInterceptor.java:96) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.UserInterceptor.invoke(UserInterceptor.java:196) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) 
org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.server.SignOutInterceptor.invoke(SignOutInterceptor.java:98) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.impl.api.user.UserEventBridgeTriggerInterceptor.invoke(UserEventBridgeTriggerInterceptor.java:65) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.IdentityCacheInterceptor.invoke(IdentityCacheInterceptor.java:68) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.core.aspects.server.TransactionInterceptor.org$jboss$portal$core$aspects$server$TransactionInterceptor$invoke$aop(TransactionInterceptor.java:49) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.aspects.tx.TxPolicy.invokeInOurTx(TxPolicy.java:79) org.jboss.aspects.tx.TxInterceptor$RequiresNew.invoke(TxInterceptor.java:253) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.aspects.tx.TxPolicy.invokeInOurTx(TxPolicy.java:79) org.jboss.aspects.tx.TxInterceptor$RequiresNew.invoke(TxInterceptor.java:262) org.jboss.portal.core.aspects.server.TransactionInterceptor$invoke_N5143606530999904530.invokeNext(TransactionInterceptor$invoke_N5143606530999904530.java) org.jboss.portal.core.aspects.server.TransactionInterceptor.invoke(TransactionInterceptor.java) org.jboss.portal.server.ServerInterceptor.invoke(ServerInterceptor.java:38) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.server.aspects.LockInterceptor$InternalLock.invoke(LockInterceptor.java:69) org.jboss.portal.server.aspects.LockInterceptor.invoke(LockInterceptor.java:130) org.jboss.portal.common.invocation.Invocation.invokeNext(Invocation.java:115) org.jboss.portal.common.invocation.Invocation.invoke(Invocation.java:157) org.jboss.portal.server.servlet.PortalServlet.service(PortalServlet.java:252) javax.servlet.http.HttpServlet.service(HttpServlet.java:803) org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96)


  • Metro: Creating an IndexedDbDataSource for WinJS

    - by Stephen.Walther
    The goal of this blog entry is to describe how you can create custom data sources which you can use with the controls in the WinJS library. In particular, I explain how you can create an IndexedDbDataSource which you can use to store and retrieve data from an IndexedDB database. If you want to skip ahead, and ignore all of the fascinating content in-between, I’ve included the complete code for the IndexedDbDataSource at the very bottom of this blog entry. What is IndexedDB? IndexedDB is a database in the browser. You can use the IndexedDB API with all modern browsers including Firefox, Chrome, and Internet Explorer 10. And, of course, you can use IndexedDB with Metro style apps written with JavaScript. If you need to persist data in a Metro style app written with JavaScript then IndexedDB is a good option. Each Metro app can only interact with its own IndexedDB databases. And, IndexedDB provides you with transactions, indices, and cursors – the elements of any modern database. An IndexedDB database might be different than the type of database that you normally use. An IndexedDB database is an object-oriented database and not a relational database. Instead of storing data in tables, you store data in object stores. You store JavaScript objects in an IndexedDB object store. You create new IndexedDB object stores by handling the upgradeneeded event when you attempt to open a connection to an IndexedDB database. For example, here’s how you would both open a connection to an existing database named TasksDB and create the TasksDB database when it does not already exist: var reqOpen = window.indexedDB.open(“TasksDB”, 2); reqOpen.onupgradeneeded = function (evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); }; reqOpen.onsuccess = function () { var db = reqOpen.result; // Do something with db }; When you call window.indexedDB.open(), and the database does not already exist, then the upgradeneeded event is raised. In the code above, the upgradeneeded handler creates a new object store named tasks. The new object store has an auto-increment column named id which acts as the primary key column. If the database already exists with the right version, and you call window.indexedDB.open(), then the success event is raised. At that point, you have an open connection to the existing database and you can start doing something with the database. You use asynchronous methods to interact with an IndexedDB database. For example, the following code illustrates how you would add a new object to the tasks object store: var transaction = db.transaction(“tasks”, “readwrite”); var reqAdd = transaction.objectStore(“tasks”).add({ name: “Feed the dog” }); reqAdd.onsuccess = function() { // Tasks added successfully }; The code above creates a new database transaction, adds a new task to the tasks object store, and handles the success event. If the new task gets added successfully then the success event is raised. Creating a WinJS IndexedDbDataSource The most powerful control in the WinJS library is the ListView control. This is the control that you use to display a collection of items. If you want to display data with a ListView control, you need to bind the control to a data source. The WinJS library includes two objects which you can use as a data source: the List object and the StorageDataSource object. 
The List object enables you to represent a JavaScript array as a data source and the StorageDataSource enables you to represent the file system as a data source. If you want to bind an IndexedDB database to a ListView then you have a choice. You can either dump the items from the IndexedDB database into a List object or you can create a custom data source. I explored the first approach in a previous blog entry. In this blog entry, I explain how you can create a custom IndexedDB data source. Implementing the IListDataSource Interface You create a custom data source by implementing the IListDataSource interface. This interface contains the contract for the methods which the ListView needs to interact with a data source. The easiest way to implement the IListDataSource interface is to derive a new object from the base VirtualizedDataSource object. The VirtualizedDataSource object requires a data adapter which implements the IListDataAdapter interface. Yes, because of the number of objects involved, this is a little confusing. Your code ends up looking something like this: var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); The code above is used to create a new class named IndexedDbDataSource which derives from the base VirtualizedDataSource class. In the constructor for the new class, the base class _baseDataSourceConstructor() method is called. A data adapter is passed to the _baseDataSourceConstructor() method. The code above creates a new method exposed by the IndexedDbDataSource named nuke(). The nuke() method deletes all of the objects from an object store. The code above also overrides a method named remove(). Our derived remove() method accepts any type of key and removes the matching item from the object store. Almost all of the work of creating a custom data source goes into building the data adapter class. The data adapter class implements the IListDataAdapter interface which contains the following methods: · change() · getCount() · insertAfter() · insertAtEnd() · insertAtStart() · insertBefore() · itemsFromDescription() · itemsFromEnd() · itemsFromIndex() · itemsFromKey() · itemsFromStart() · itemSignature() · moveAfter() · moveBefore() · moveToEnd() · moveToStart() · remove() · setNotificationHandler() · compareByIdentity Fortunately, you are not required to implement all of these methods. You only need to implement the methods that you actually need. In the case of the IndexedDbDataSource, I implemented the getCount(), itemsFromIndex(), insertAtEnd(), and remove() methods. If you are creating a read-only data source then you really only need to implement the getCount() and itemsFromIndex() methods. Implementing the getCount() Method The getCount() method returns the total number of items from the data source. So, if you are storing 10,000 items in an object store then this method would return the value 10,000. 
Here’s how I implemented the getCount() method: getCount: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore().then(function (store) { var reqCount = store.count(); reqCount.onerror = that._error; reqCount.onsuccess = function (evt) { complete(evt.target.result); }; }); }); } The first thing that you should notice is that the getCount() method returns a WinJS promise. This is a requirement. The getCount() method is asynchronous which is a good thing because all of the IndexedDB methods (at least the methods implemented in current browsers) are also asynchronous. The code above retrieves an object store and then uses the IndexedDB count() method to get a count of the items in the object store. The value is returned from the promise by calling complete(). Implementing the itemsFromIndex method When a ListView displays its items, it calls the itemsFromIndex() method. By default, it calls this method multiple times to get different ranges of items. Three parameters are passed to the itemsFromIndex() method: the requestIndex, countBefore, and countAfter parameters. The requestIndex indicates the index of the item from the database to show. The countBefore and countAfter parameters represent hints. These are integer values which represent the number of items before and after the requestIndex to retrieve. Again, these are only hints and you can return as many items before and after the request index as you please. Here’s how I implemented the itemsFromIndex method: itemsFromIndex: function (requestIndex, countBefore, countAfter) { var that = this; return new WinJS.Promise(function (complete, error) { that.getCount().then(function (count) { if (requestIndex >= count) { return WinJS.Promise.wrapError(new WinJS.ErrorFromName(WinJS.UI.FetchError.doesNotExist)); } var startIndex = Math.max(0, requestIndex - countBefore); var endIndex = Math.min(count, requestIndex + countAfter + 1); that._getObjectStore().then(function (store) { var index = 0; var items = []; var req = store.openCursor(); req.onerror = that._error; req.onsuccess = function (evt) { var cursor = evt.target.result; if (index < startIndex) { index = startIndex; cursor.advance(startIndex); return; } if (cursor && index < endIndex) { index++; items.push({ key: cursor.value[store.keyPath].toString(), data: cursor.value }); cursor.continue(); return; } results = { items: items, offset: requestIndex - startIndex, totalCount: count }; complete(results); }; }); }); }); } In the code above, a cursor is used to iterate through the objects in an object store. You fetch the next item in the cursor by calling either the cursor.continue() or cursor.advance() method. The continue() method moves forward by one object and the advance() method moves forward a specified number of objects. Each time you call continue() or advance(), the success event is raised again. If the cursor is null then you know that you have reached the end of the cursor and you can return the results. Some things to be careful about here. First, the return value from the itemsFromIndex() method must implement the IFetchResult interface. In particular, you must return an object which has an items, offset, and totalCount property. Second, each item in the items array must implement the IListItem interface. Each item should have a key and a data property. 
Implementing the insertAtEnd() Method When creating the IndexedDbDataSource, I wanted to go beyond creating a simple read-only data source and support inserting and deleting objects. If you want to support adding new items with your data source then you need to implement the insertAtEnd() method. Here’s how I implemented the insertAtEnd() method for the IndexedDbDataSource: insertAtEnd:function(unused, data) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function(store) { var reqAdd = store.add(data); reqAdd.onerror = that._error; reqAdd.onsuccess = function (evt) { var reqGet = store.get(evt.target.result); reqGet.onerror = that._error; reqGet.onsuccess = function (evt) { var newItem = { key:evt.target.result[store.keyPath].toString(), data:evt.target.result } complete(newItem); }; }; }); }); } When implementing the insertAtEnd() method, you need to be careful to return an object which implements the IItem interface. In particular, you should return an object that has a key and a data property. The key must be a string and it uniquely represents the new item added to the data source. The value of the data property represents the new item itself. Implementing the remove() Method Finally, you use the remove() method to remove an item from the data source. You call the remove() method with the key of the item which you want to remove. Implementing the remove() method in the case of the IndexedDbDataSource was a little tricky. The problem is that an IndexedDB object store uses an integer key and the VirtualizedDataSource requires a string key. For that reason, I needed to override the remove() method in the derived IndexedDbDataSource class like this: var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); When you call remove(), you end up calling a method of the IndexedDbDataAdapter named removeInternal() . Here’s what the removeInternal() method looks like: setNotificationHandler: function (notificationHandler) { this._notificationHandler = notificationHandler; }, removeInternal: function(key) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqDelete = store.delete (key); reqDelete.onerror = that._error; reqDelete.onsuccess = function (evt) { that._notificationHandler.removed(key.toString()); complete(); }; }); }); } The removeInternal() method calls the IndexedDB delete() method to delete an item from the object store. If the item is deleted successfully then the _notificationHandler.remove() method is called. Because we are not implementing the standard IListDataAdapter remove() method, we need to notify the data source (and the ListView control bound to the data source) that an item has been removed. The way that you notify the data source is by calling the _notificationHandler.remove() method. Notice that we get the _notificationHandler in the code above by implementing another method in the IListDataAdapter interface: the setNotificationHandler() method. 
You can raise the following types of notifications using the _notificationHandler: · beginNotifications() · changed() · endNotifications() · inserted() · invalidateAll() · moved() · removed() · reload() These methods are all part of the IListDataNotificationHandler interface in the WinJS library. Implementing the nuke() Method I wanted to implement a method which would remove all of the items from an object store. Therefore, I created a method named nuke() which calls the IndexedDB clear() method: nuke: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqClear = store.clear(); reqClear.onerror = that._error; reqClear.onsuccess = function (evt) { that._notificationHandler.reload(); complete(); }; }); }); } Notice that the nuke() method calls the _notificationHandler.reload() method to notify the ListView to reload all of the items from its data source. Because we are implementing a custom method here, we need to use the _notificationHandler to send an update. Using the IndexedDbDataSource To illustrate how you can use the IndexedDbDataSource, I created a simple task list app. You can add new tasks, delete existing tasks, and nuke all of the tasks. You delete an item by selecting an item (swipe or right-click) and clicking the Delete button. Here’s the HTML page which contains the ListView, the form for adding new tasks, and the buttons for deleting and nuking tasks: <!DOCTYPE html> <html> <head> <meta charset="utf-8" /> <title>DataSources</title> <!-- WinJS references --> <link href="//Microsoft.WinJS.1.0.RC/css/ui-dark.css" rel="stylesheet" /> <script src="//Microsoft.WinJS.1.0.RC/js/base.js"></script> <script src="//Microsoft.WinJS.1.0.RC/js/ui.js"></script> <!-- DataSources references --> <link href="indexedDb.css" rel="stylesheet" /> <script type="text/javascript" src="indexedDbDataSource.js"></script> <script src="indexedDb.js"></script> </head> <body> <div id="tmplTask" data-win-control="WinJS.Binding.Template"> <div class="taskItem"> Id: <span data-win-bind="innerText:id"></span> <br /><br /> Name: <span data-win-bind="innerText:name"></span> </div> </div> <div id="lvTasks" data-win-control="WinJS.UI.ListView" data-win-options="{ itemTemplate: select('#tmplTask'), selectionMode: 'single' }"></div> <form id="frmAdd"> <fieldset> <legend>Add Task</legend> <label>New Task</label> <input id="inputTaskName" required /> <button>Add</button> </fieldset> </form> <button id="btnNuke">Nuke</button> <button id="btnDelete">Delete</button> </body> </html> And here is the JavaScript code for the TaskList app: /// <reference path="//Microsoft.WinJS.1.0.RC/js/base.js" /> /// <reference path="//Microsoft.WinJS.1.0.RC/js/ui.js" /> function init() { WinJS.UI.processAll().done(function () { var lvTasks = document.getElementById("lvTasks").winControl; // Bind the ListView to its data source var tasksDataSource = new DataSources.IndexedDbDataSource("TasksDB", 1, "tasks", upgrade); lvTasks.itemDataSource = tasksDataSource; // Wire-up Add, Delete, Nuke buttons document.getElementById("frmAdd").addEventListener("submit", function (evt) { evt.preventDefault(); tasksDataSource.beginEdits(); tasksDataSource.insertAtEnd(null, { name: document.getElementById("inputTaskName").value }).done(function (newItem) { tasksDataSource.endEdits(); document.getElementById("frmAdd").reset(); lvTasks.ensureVisible(newItem.index); }); }); document.getElementById("btnDelete").addEventListener("click", function () { if 
(lvTasks.selection.count() == 1) { lvTasks.selection.getItems().done(function (items) { tasksDataSource.remove(items[0].data.id); }); } }); document.getElementById("btnNuke").addEventListener("click", function () { tasksDataSource.nuke(); }); // This method is called to initialize the IndexedDb database function upgrade(evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); } }); } document.addEventListener("DOMContentLoaded", init); The IndexedDbDataSource is created and bound to the ListView control with the following two lines of code: var tasksDataSource = new DataSources.IndexedDbDataSource("TasksDB", 1, "tasks", upgrade); lvTasks.itemDataSource = tasksDataSource; The IndexedDbDataSource is created with four parameters: the name of the database to create, the version of the database to create, the name of the object store to create, and a function which contains code to initialize the new database. The upgrade function creates a new object store named tasks with an auto-increment property named id: function upgrade(evt) { var newDB = evt.target.result; newDB.createObjectStore("tasks", { keyPath: "id", autoIncrement: true }); } The Complete Code for the IndexedDbDataSource Here’s the complete code for the IndexedDbDataSource: (function () { /************************************************ * The IndexedDBDataAdapter enables you to work * with a HTML5 IndexedDB database. *************************************************/ var IndexedDbDataAdapter = WinJS.Class.define( function (dbName, dbVersion, objectStoreName, upgrade, error) { this._dbName = dbName; // database name this._dbVersion = dbVersion; // database version this._objectStoreName = objectStoreName; // object store name this._upgrade = upgrade; // database upgrade script this._error = error || function (evt) { console.log(evt.message); }; }, { /******************************************* * IListDataAdapter Interface Methods ********************************************/ getCount: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore().then(function (store) { var reqCount = store.count(); reqCount.onerror = that._error; reqCount.onsuccess = function (evt) { complete(evt.target.result); }; }); }); }, itemsFromIndex: function (requestIndex, countBefore, countAfter) { var that = this; return new WinJS.Promise(function (complete, error) { that.getCount().then(function (count) { if (requestIndex >= count) { return WinJS.Promise.wrapError(new WinJS.ErrorFromName(WinJS.UI.FetchError.doesNotExist)); } var startIndex = Math.max(0, requestIndex - countBefore); var endIndex = Math.min(count, requestIndex + countAfter + 1); that._getObjectStore().then(function (store) { var index = 0; var items = []; var req = store.openCursor(); req.onerror = that._error; req.onsuccess = function (evt) { var cursor = evt.target.result; if (index < startIndex) { index = startIndex; cursor.advance(startIndex); return; } if (cursor && index < endIndex) { index++; items.push({ key: cursor.value[store.keyPath].toString(), data: cursor.value }); cursor.continue(); return; } results = { items: items, offset: requestIndex - startIndex, totalCount: count }; complete(results); }; }); }); }); }, insertAtEnd:function(unused, data) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function(store) { var reqAdd = store.add(data); reqAdd.onerror = that._error; reqAdd.onsuccess = function (evt) { var reqGet 
= store.get(evt.target.result); reqGet.onerror = that._error; reqGet.onsuccess = function (evt) { var newItem = { key:evt.target.result[store.keyPath].toString(), data:evt.target.result } complete(newItem); }; }; }); }); }, setNotificationHandler: function (notificationHandler) { this._notificationHandler = notificationHandler; }, /***************************************** * IndexedDbDataSource Method ******************************************/ removeInternal: function(key) { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqDelete = store.delete (key); reqDelete.onerror = that._error; reqDelete.onsuccess = function (evt) { that._notificationHandler.removed(key.toString()); complete(); }; }); }); }, nuke: function () { var that = this; return new WinJS.Promise(function (complete, error) { that._getObjectStore("readwrite").done(function (store) { var reqClear = store.clear(); reqClear.onerror = that._error; reqClear.onsuccess = function (evt) { that._notificationHandler.reload(); complete(); }; }); }); }, /******************************************* * Private Methods ********************************************/ _ensureDbOpen: function () { var that = this; // Try to get cached Db if (that._cachedDb) { return WinJS.Promise.wrap(that._cachedDb); } // Otherwise, open the database return new WinJS.Promise(function (complete, error, progress) { var reqOpen = window.indexedDB.open(that._dbName, that._dbVersion); reqOpen.onerror = function (evt) { error(); }; reqOpen.onupgradeneeded = function (evt) { that._upgrade(evt); that._notificationHandler.invalidateAll(); }; reqOpen.onsuccess = function () { that._cachedDb = reqOpen.result; complete(that._cachedDb); }; }); }, _getObjectStore: function (type) { type = type || "readonly"; var that = this; return new WinJS.Promise(function (complete, error) { that._ensureDbOpen().then(function (db) { var transaction = db.transaction(that._objectStoreName, type); complete(transaction.objectStore(that._objectStoreName)); }); }); }, _get: function (key) { return new WinJS.Promise(function (complete, error) { that._getObjectStore().done(function (store) { var reqGet = store.get(key); reqGet.onerror = that._error; reqGet.onsuccess = function (item) { complete(item); }; }); }); } } ); var IndexedDbDataSource = WinJS.Class.derive( WinJS.UI.VirtualizedDataSource, function (dbName, dbVersion, objectStoreName, upgrade, error) { this._adapter = new IndexedDbDataAdapter(dbName, dbVersion, objectStoreName, upgrade, error); this._baseDataSourceConstructor(this._adapter); }, { nuke: function () { this._adapter.nuke(); }, remove: function (key) { this._adapter.removeInternal(key); } } ); WinJS.Namespace.define("DataSources", { IndexedDbDataSource: IndexedDbDataSource }); })(); Summary In this blog post, I provided an overview of how you can create a new data source which you can use with the WinJS library. I described how you can create an IndexedDbDataSource which you can use to bind a ListView control to an IndexedDB database. While describing how you can create a custom data source, I explained how you can implement the IListDataAdapter interface. You also learned how to raise notifications — such as a removed or invalidateAll notification — by taking advantage of the methods of the IListDataNotificationHandler interface.


  • memory leak error when using an iterator

    - by Adnane Jaafari
    Can anyone explain the error I am getting? It happens while using an iterator in my method createDemandeP():

public void createDemandeP() {
    if (demandep.getDateDebut().after(demandep.getDateFin())) {
        FacesContext.getCurrentInstance().addMessage(null,
                new FacesMessage(FacesMessage.SEVERITY_WARN, "Attention aux dates",
                        "la date de debut doit être avant la date de fin!"));
    } else if (demandep.getDateDebut().before(demandep.getDateFin())) {
        List<DemandeP> list = new ArrayList<DemandeP>();
        list.addAll(chaletService.getChaletBylibelle(chaletChoisi).get(0).getListDemandesP());
        Iterator<DemandeP> it = list.iterator();
        DemandeP d = it.next();
        while (it.hasNext()) {
            if ((d.getDateDebut().compareTo(demandep.getDateDebut()) == 0)
                    || (d.getDateFin().compareTo(demandep.getDateDebut()) == 0)
                    || (d.getDateFin().compareTo(demandep.getDateFin()) == 0)
                    || (d.getDateDebut().compareTo(demandep.getDateDebut()) == 0)
                    || (d.getDateDebut().before(demandep.getDateDebut()) && d.getDateFin().after(demandep.getDateFin()))
                    || (d.getDateDebut().before(demandep.getDateFin()) && d.getDateDebut().after(demandep.getDateDebut()))
                    || (d.getDateFin().after(demandep.getDateDebut()) && d.getDateFin().before(demandep.getDateFin()))) {
                FacesContext.getCurrentInstance().getMessageList().clear();
                FacesContext.getCurrentInstance().addMessage(null,
                        new FacesMessage(FacesMessage.SEVERITY_FATAL, "Periode Ou chalet indisponicle ",
                                "Veillez choisir une autre marge de date !"));
            }
        }
    } else {
        demandep.setEtat("En traitement");
        DateFormat dateFormat = new SimpleDateFormat("dd/MM/yyyy");
        Date date = new Date();
        try {
            demandep.setDateDemande(dateFormat.parse(dateFormat.format(date)));
        } catch (ParseException e) {
            System.out.println("errooor date");
            e.printStackTrace();
        }
        nameUser = auth.getName();
        // System.out.println(nameUser);
        adherent = utilisateurService.findAdherentByNom(nameUser).get(0);
        demandep.setUtilisateur(adherent);
        // System.out.println(chaletService.getChaletBylibelle(chaletChoisi).get(0).getLibelle());
        demandep.setChalet(chaletService.getChaletBylibelle(chaletChoisi).get(0));
        demandep.setNouvelleDemande(true);
        demandePService.ajouterDemandeP(demandep);
    }
}

    The Tomcat console then shows the following:

oct. 23, 2013 7:19:30 PM org.apache.catalina.core.StandardContext reload
INFO: Le rechargement du contexte [/ONICLFINAL] a démarré
oct. 23, 2013 7:19:30 PM org.apache.catalina.core.StandardWrapper unload
INFO: Waiting for 1 instance(s) to be deallocated
oct. 23, 2013 7:19:31 PM org.apache.catalina.core.StandardWrapper unload
INFO: Waiting for 1 instance(s) to be deallocated
oct. 23, 2013 7:19:32 PM org.apache.catalina.core.StandardWrapper unload
INFO: Waiting for 1 instance(s) to be deallocated
oct. 23, 2013 7:19:32 PM org.apache.catalina.core.ApplicationContext log
INFO: Closing Spring root WebApplicationContext
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/ONICLFINAL] registered the JDBC driver [com.mysql.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/ONICLFINAL] appears to have started a thread named [MySQL Statement Cancellation Timer] but has failed to stop it. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/ONICLFINAL] is still processing a request that has yet to finish. This is very likely to create a memory leak. You can control the time allowed for requests to finish by using the unloadDelay attribute of the standard Context implementation.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [org.springframework.core.NamedThreadLocal] (value [Hibernate Sessions registered for deferred close]) and a value of type [java.util.HashMap] (value [{org.hibernate.impl.SessionFactoryImpl@f6e256=[SessionImpl(PersistenceContext[entityKeys=[EntityKey[bo.DemandeP#1], EntityKey[bo.Utilisateur#3], EntityKey[bo.Chalet#1], EntityKey[bo.Role#2], EntityKey[bo.DemandeP#2]],collectionKeys=[CollectionKey[bo.Role.ListeUsers#2], CollectionKey[bo.Chalet.listPeriodes#1], CollectionKey[bo.Utilisateur.demandes#3], CollectionKey[bo.Utilisateur.demandesP#3], CollectionKey[bo.Chalet.listDemandesP#1]]];ActionQueue[insertions=[] updates=[] deletions=[] collectionCreations=[] collectionRemovals=[] collectionUpdates=[]])]}]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [org.springframework.core.NamedThreadLocal] (value [Request attributes]) and a value of type [org.springframework.web.context.request.ServletRequestAttributes] (value [org.apache.catalina.connector.RequestFacade@17f3488]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@51f78b]) and a value of type [org.springframework.security.core.context.SecurityContextImpl] (value [org.springframework.security.core.context.SecurityContextImpl@8e463c8b: Authentication: org.springframework.security.authentication.UsernamePasswordAuthenticationToken@8e463c8b: Principal: org.springframework.security.core.userdetails.User@311aa119: Username: maatouf; Password: [PROTECTED]; Enabled: true; AccountNonExpired: true; credentialsNonExpired: true; AccountNonLocked: true; Granted Authorities: ROLE_ADHER; Credentials: [PROTECTED]; Authenticated: true; Details: org.springframework.security.web.authentication.WebAuthenticationDetails@ffff4c9c: RemoteIpAddress: 0:0:0:0:0:0:0:1; SessionId: 14CD5D4E8E0E3AEB0367AB7115038FED; Granted Authorities: ROLE_ADHER]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@152e9b7]) and a value of type [net.sf.cglib.proxy.Callback[]] (value [[Lnet.sf.cglib.proxy.Callback;@6e1f4c]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [javax.faces.context.FacesContext$1] (value [javax.faces.context.FacesContext$1@9ecc6d]) and a value of type [com.sun.faces.context.FacesContextImpl] (value [com.sun.faces.context.FacesContextImpl@1c8bbed]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [java.lang.ThreadLocal] (value [java.lang.ThreadLocal@1a9e75f]) and a value of type [com.sun.faces.context.FacesContextImpl] (value [com.sun.faces.context.FacesContextImpl@1c8bbed]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [org.springframework.core.NamedThreadLocal] (value [Locale context]) and a value of type [org.springframework.context.i18n.SimpleLocaleContext] (value [fr_FR]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:32 PM org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
SEVERE: The web application [/ONICLFINAL] created a ThreadLocal with key of type [com.sun.faces.application.ApplicationAssociate$1] (value [com.sun.faces.application.ApplicationAssociate$1@195266b]) and a value of type [com.sun.faces.application.ApplicationAssociate] (value [com.sun.faces.application.ApplicationAssociate@10d595c]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
oct. 23, 2013 7:19:33 PM org.apache.catalina.loader.WebappClassLoader validateJarFile
INFO: validateJarFile(D:\newWorkSpace\.metadata\.plugins\org.eclipse.wst.server.core\tmp0\wtpwebapps\ONICLFINAL\WEB-INF\lib\servlet-api-2.5.jar) - jar not loaded. See Servlet Spec 2.3, section 9.7.2. Offending class: javax/servlet/Servlet.class
oct. 23, 2013 7:19:33 PM org.apache.catalina.core.ApplicationContext log
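    An editorial note for readers of the loop above: it.next() is called only once, before the while loop, so d always refers to the first existing booking and the cursor never advances; with two or more bookings in the list the loop appears to spin indefinitely, which matches the "still processing a request" warning in the log. Below is a minimal sketch of the overlap check with the cursor advanced on every pass. It is not the asker's code: DemandeP, getDateDebut and getDateFin are taken from the post but reduced to a stub, and the simplified overlap condition is an assumption about the intended behaviour.

import java.util.Date;
import java.util.List;

// Minimal stand-in for the entity in the post; only the two getters used here.
interface DemandeP {
    Date getDateDebut();
    Date getDateFin();
}

public class DisponibiliteCheck {

    // True if [debut, fin] overlaps none of the existing bookings (sketch of the assumed intent).
    public static boolean isPeriodeLibre(List<? extends DemandeP> existantes, Date debut, Date fin) {
        for (DemandeP d : existantes) {                         // for-each advances on every element,
                                                                // unlike calling it.next() only once
            boolean overlaps = !d.getDateFin().before(debut)    // existing booking ends on/after the new start
                            && !d.getDateDebut().after(fin);    // and starts on/before the new end
            if (overlaps) {
                return false;                                   // conflict found: the period is not available
            }
        }
        return true;                                            // no existing booking intersects the new period
    }
}

    With a helper shaped like this, the FacesMessage would only need to be added once, when the check returns false, instead of inside every iteration of the loop.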


  • javax.validation.ConstraintViolationException: validation failed for classes during update time for groups

    - by Tim
    Hello all! I have a Java / Spring MVC 3 application, using Hibernate and a MySQL database. In my controller I have this source code:

Set<ConstraintViolation<Person>> failures = validator.validate(p);
if (failures.isEmpty()) {
    Project project = this.projectService.findProjectById(projectid);
    Person newPerson = this.personService.addPerson(p);
    Set<Person> persons = this.personService.getAllPersonsByProjectId(projectid);
    persons.add(newPerson);
    project.setPersons(persons);
    Set<ConstraintViolation<Project>> failures1 = validator.validate(project);
    if (!failures1.isEmpty()) {
        System.out.println("ERROR");
    } else {
        System.out.println("NO ERROR");
    }
    this.projectService.updateProject(project);
    return Collections.singletonMap("person", newPerson);
}

    Project and Person are in a many-to-many relation annotated with @ManyToMany, and Project is the owning side of the mapping. The new Person is added, but on the line this.projectService.updateProject(project); I get an error. What that call does is this, in a Hibernate DAO implementation:

public void updateProject(Project p) {
    SessionFactory sessionFactory = HibernateUtil.getSessionFactory();
    Session sess = sessionFactory.getCurrentSession();
    Transaction tx = sess.beginTransaction();
    sess.update(p);
    tx.commit();
}

    It fails on the line tx.commit();. My check with if (!failures1.isEmpty()) tells me that there are no errors in my project. So what's wrong here? And why is my project being validated at all? I did not call a validation method, so why does org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate() run? I hope someone can help me fix this! Best regards, Tim. Here is the full error stack trace:

13.01.2011 00:06:36 org.apache.catalina.core.ApplicationDispatcher invoke SERVE: Servlet.service() for servlet project3 threw exception javax.validation.ConstraintViolationException: validation failed for classes [com.mydomain.myproject.domain.Person] during update time for groups [javax.validation.groups.Default, ] at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate(BeanValidationEventListener.java:155) at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.onPreUpdate(BeanValidationEventListener.java:102) at org.hibernate.action.EntityUpdateAction.preUpdate(EntityUpdateAction.java:235) at org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:86) at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:273) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:265) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:185) at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:51) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:383) at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:133) at com.mydomain.myproject.dao.impl.ProjectDaoImplHibernate.updateProject(ProjectDaoImplHibernate.java:44) at com.mydomain.myproject.service.impl.ProjectServiceImpl.updateProject(ProjectServiceImpl.java:39) at com.mydomain.myproject.controller.ProjectPersonController.addPerson(ProjectPersonController.java:189) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.springframework.web.bind.annotation.support.HandlerMethodInvoker.invokeHandlerMethod(HandlerMethodInvoker.java:176) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:426) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:414) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:790) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:719) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:644) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:560) at javax.servlet.http.HttpServlet.service(HttpServlet.java:637) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:436) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:374) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:302) at org.tuckey.web.filters.urlrewrite.NormalRewrittenUrl.doRewrite(NormalRewrittenUrl.java:195) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:159) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:417) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489) at java.lang.Thread.run(Thread.java:619) 13.01.2011 00:06:36 org.apache.catalina.core.StandardWrapperValve invoke SERVE: Servlet.service() for servlet default threw exception 
javax.validation.ConstraintViolationException: validation failed for classes [com.mydomain.myproject.domain.Person] during update time for groups [javax.validation.groups.Default, ] at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.validate(BeanValidationEventListener.java:155) at org.hibernate.cfg.beanvalidation.BeanValidationEventListener.onPreUpdate(BeanValidationEventListener.java:102) at org.hibernate.action.EntityUpdateAction.preUpdate(EntityUpdateAction.java:235) at org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:86) at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:273) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:265) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:185) at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:51) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1216) at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:383) at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:133) at com.mydomain.myproject.dao.impl.ProjectDaoImplHibernate.updateProject(ProjectDaoImplHibernate.java:44) at com.mydomain.myproject.service.impl.ProjectServiceImpl.updateProject(ProjectServiceImpl.java:39) at com.mydomain.myproject.controller.ProjectPersonController.addPerson(ProjectPersonController.java:189) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.springframework.web.bind.annotation.support.HandlerMethodInvoker.invokeHandlerMethod(HandlerMethodInvoker.java:176) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:426) at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:414) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:790) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:719) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:644) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:560) at javax.servlet.http.HttpServlet.service(HttpServlet.java:637) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:436) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:374) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:302) at org.tuckey.web.filters.urlrewrite.NormalRewrittenUrl.doRewrite(NormalRewrittenUrl.java:195) at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:159) at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:141) at 
org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:90) at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:417) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489) at java.lang.Thread.run(Thread.java:619)

    UPDATE: Before updating the Project where the error occurs, I add a Person which has this field annotated:

@NotNull
@Size(min = 1, max = 255)
@Pattern(regexp="(?:[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*|\"(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21\\x23-\\x5b\\x5d-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])*\")@(?:(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?|\\[(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[a-z0-9-]*[a-z0-9]:(?:[\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x21-\\x5a\\x53-\\x7f]|\\\\[\\x01-\\x09\\x0b\\x0c\\x0e-\\x7f])+)\\])", message="{my.email.error.message}")
private String email;

    Without the @Pattern there is no error... So, what's wrong here?

    UPDATE-2: I use Hibernate 3.6.0.Final and I have these in my Maven pom.xml:

<!-- JSR 303 with Hibernate Validator -->
<dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <version>1.0.0.GA</version>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-validator</artifactId>
    <version>4.1.0.Final</version>
</dependency>
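    A point worth making explicit for readers of this question: Hibernate registers a BeanValidationEventListener that re-runs JSR-303 validation on every entity it is about to insert or update at flush time, which is why Person instances reached by the update are validated even though the controller never calls validator.validate() on them (this flush-time validation can usually be switched off with the javax.persistence.validation.mode property, if that is really wanted). Since the asker only validated p and project, a quick way to see which Person and which constraint is failing is to validate every person attached to the project before committing. The sketch below is illustrative only: Person and Project are reduced to stubs, getPersons() is assumed to exist as the counterpart of the setPersons() shown in the post, and it assumes a Bean Validation provider such as Hibernate Validator is on the classpath.

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotNull;
import java.util.LinkedHashSet;
import java.util.Set;

// Minimal stand-ins for the entities in the post; only the members used below.
class Person {
    @NotNull String email;                 // hypothetical constrained field, like the @Pattern one in the post
    Person(String email) { this.email = email; }
    @Override public String toString() { return "Person(" + email + ")"; }
}

class Project {
    private final Set<Person> persons = new LinkedHashSet<Person>();
    Set<Person> getPersons() { return persons; }
}

public class ProjectValidationCheck {

    // Validate every Person attached to the project, i.e. the same entities Hibernate
    // will validate again at flush time, and print the offending property and message.
    static void reportViolations(Project project, Validator validator) {
        for (Person person : project.getPersons()) {
            Set<ConstraintViolation<Person>> violations = validator.validate(person);
            for (ConstraintViolation<Person> v : violations) {
                System.out.println(person + ": " + v.getPropertyPath() + " " + v.getMessage());
            }
        }
    }

    public static void main(String[] args) {
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Project project = new Project();
        project.getPersons().add(new Person(null));   // a null email trips @NotNull, mimicking a flush-time failure
        reportViolations(project, validator);
    }
}

    Running a diagnostic like this just before updateProject(project) would show exactly which persisted Person no longer satisfies its constraints, which is typically the real cause when the exception names a class the caller never validated.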


  • Maven: Unresolved references to [org.osgi.service.http]

    - by Simone Vellei
    I'm trying to create a bundle that uses HttpService to register a Servlet, using the maven-bundle-plugin. The pom.xml of the project is:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>felix-tutorial</groupId>
    <artifactId>example-1</artifactId>
    <version>1.0</version>
    <packaging>bundle</packaging>
    <name>Apache Felix Tutorial Example 1</name>
    <description>Apache Felix Tutorial Example 1</description>

    <!-- Build Configuration -->
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.felix</groupId>
                <artifactId>maven-bundle-plugin</artifactId>
                <extensions>true</extensions>
                <configuration>
                    <instructions>
                        <Bundle-SymbolicName>${pom.groupId}.${pom.artifactId}</Bundle-SymbolicName>
                        <Bundle-Name>Service listener example</Bundle-Name>
                        <Bundle-Description>A bundle that displays messages at startup and when service events occur</Bundle-Description>
                        <Bundle-Vendor>Apache Felix</Bundle-Vendor>
                        <Bundle-Version>1.0.0</Bundle-Version>
                        <Bundle-Activator>tutorial.example1.Activator</Bundle-Activator>
                        <Import-Package>org.osgi.framework;version="1.0.0", javax.servlet, javax.servlet.http</Import-Package>
                    </instructions>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <!-- Dependencies Management -->
    <dependencies>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.framework</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.api</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.base</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.bridge</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.bundle</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.proxy</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.http.whiteboard</artifactId>
            <version>2.0.4</version>
        </dependency>
        <dependency>
            <groupId>org.osgi</groupId>
            <artifactId>osgi_R4_compendium</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
</project>

    The "mvn install" command returns the following error:

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Felix Tutorial Example 1
[INFO]    task-segment: [install]
[INFO] ------------------------------------------------------------------------
Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.3/maven-resources-plugin-2.3.pom
Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-resources-plugin/2.3/maven-resources-plugin-2.3.jar
Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-install-plugin/2.2/maven-install-plugin-2.2.pom
Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-install-plugin/2.2/maven-install-plugin-2.2.jar
Downloading: http://repo1.maven.org/maven2/org/apache/maven/shared/maven-filtering/1.0-beta-2/maven-filtering-1.0-beta-2.pom
Downloading: http://repo1.maven.org/maven2/org/codehaus/plexus/plexus-interpolation/1.6/plexus-interpolation-1.6.pom
Downloading: http://repo1.maven.org/maven2/org/codehaus/plexus/plexus-interpolation/1.6/plexus-interpolation-1.6.jar
Downloading: http://repo1.maven.org/maven2/org/apache/maven/shared/maven-filtering/1.0-beta-2/maven-filtering-1.0-beta-2.jar
[INFO] [resources:resources {execution: default-resources}]
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\eclipse\ws\stripes-bundle\src\main\resources
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Nothing to compile - all classes are up to date
[INFO] [resources:testResources {execution: default-testResources}]
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\eclipse\ws\stripes-bundle\src\test\resources
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] Nothing to compile - all classes are up to date
[INFO] [surefire:test {execution: default-test}]
[INFO] Surefire report directory: C:\eclipse\ws\stripes-bundle\target\surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running com.beanopoly.stripes.AppTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0

[INFO] [bundle:bundle {execution: default-bundle}]
[ERROR] Error building bundle felix-tutorial:example-1:bundle:1.0 : Unresolved references to [org.osgi.service.http] by class(es) on the Bundle-Classpath[Jar:do
[ERROR] Error(s) found in bundle configuration
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] Error(s) found in bundle configuration
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12 seconds
[INFO] Finished at: Sat Mar 27 13:11:47 CET 2010
[INFO] Final Memory: 12M/21M
[INFO] ------------------------------------------------------------------------
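    One reading of this failure, offered as a hedged suggestion rather than a confirmed fix: the POM overrides the maven-bundle-plugin's default Import-Package instruction with an explicit list, so a package that the bundle's classes reference but that is not in the list (here org.osgi.service.http, pulled in by the HttpService usage) is neither imported nor embedded, which is what bnd reports as "Unresolved references ... on the Bundle-Classpath". A minimal sketch of an adjusted instruction follows; the trailing * is the usual way to let the plugin keep computing any remaining imports automatically:

<Import-Package>org.osgi.framework;version="1.0.0",
                org.osgi.service.http,
                javax.servlet,
                javax.servlet.http,
                *</Import-Package>

    If the package still cannot be resolved at build time, the other thing to check, again as an assumption about this particular setup, is whether a compile-scope dependency on the build path actually contains the org.osgi.service.http classes (an OSGi compendium JAR, such as the one already listed in the dependencies).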


< Previous Page | 671 672 673 674 675 676 677  | Next Page >