Search Results


  • Interpolating Large Datasets On the Fly

    - by Karl
I have a large data set of about 0.5 million records representing the exchange rate between USD and GBP over the course of a given day. I have an application that wants to be able to graph this data, or maybe a subset of it. For obvious reasons I do not want to plot 0.5 million points on my graph. What I need is a smaller data set (100 points or so) which represents the given data as accurately as possible. Does anyone know of any interesting and performant ways this can be achieved? Cheers, Karl
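    One common answer to this kind of question is bucketed min/max downsampling: split the series into as many buckets as you want output points and keep the extremes of each bucket, so spikes survive the reduction. Below is a minimal C# sketch of that idea; the Quote type and the bucket count are illustrative assumptions, not from the question.

        using System;
        using System.Collections.Generic;

        public sealed class Quote
        {
            public DateTime Time { get; set; }
            public decimal Rate { get; set; }
        }

        public static class Downsampler
        {
            // Reduce a chronologically ordered series to roughly 'buckets' * 2 points,
            // keeping the min and max of each bucket so spikes are not smoothed away.
            public static List<Quote> MinMax(IReadOnlyList<Quote> source, int buckets)
            {
                var result = new List<Quote>(buckets * 2);
                int size = Math.Max(1, source.Count / buckets);

                for (int start = 0; start < source.Count; start += size)
                {
                    int end = Math.Min(start + size, source.Count);
                    Quote min = source[start], max = source[start];
                    for (int i = start + 1; i < end; i++)
                    {
                        if (source[i].Rate < min.Rate) min = source[i];
                        if (source[i].Rate > max.Rate) max = source[i];
                    }

                    // Emit the extremes in chronological order so the plot stays ordered in time.
                    if (ReferenceEquals(min, max)) result.Add(min);
                    else if (min.Time <= max.Time) { result.Add(min); result.Add(max); }
                    else { result.Add(max); result.Add(min); }
                }
                return result;
            }
        }

    For roughly 100 output points you would call Downsampler.MinMax(quotes, 50); averaging each bucket instead gives a smoother line but hides intraday spikes, which usually matter for exchange rates.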


  • Multi-Threaded Application vs. Single Threaded Application

Why would we use a multithreaded application vs. a single-threaded application? First we must define multithreading. Multithreading is a feature of an operating system that allows programs to run subcomponents, or threads, in parallel. Typically most applications only need to use one thread because they do not perform time-consuming tasks. The use of multiple threads allows an application to distribute long-running tasks so that they can be executed in parallel. This gives the user the perceived appearance that the application is working faster, because while one thread is waiting on an IO process the remaining tasks can make use of the available CPU. It also allows worker threads to execute in tandem so that tasks can be completed sooner.

    Multithreading benefits:

    • Improved responsiveness — users usually report improved responsiveness compared to single-threaded applications.
    • Faster applications — multiple threads can lead to improved application performance.
    • Prioritization — threads can be assigned a priority, which allows higher-priority tasks to take precedence over lower-priority tasks.

    Single-threading benefits:

    • Programming and debugging — these activities are easier compared to multithreaded applications due to the reduced complexity.
    • Less overhead — threads add overhead to an application.

    When developing multithreaded applications, the following must be considered. Deadlocks occur when two threads each hold a monitor that the other one requires. In essence each task is blocking the other, and both tasks are waiting for the other monitor to be released. This forces an application to hang, or deadlock. Resource allocation is used to prevent deadlocks: the system determines whether approving a resource request will leave the system in an unsafe state, which could result in a deadlock, and only approves requests that lead to safe states. Thread synchronization is used when multiple threads use the same instance of an object. The threads accessing the object can be locked and synchronized so that each task interacts with the shared object one at a time.
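    Here is a minimal C# sketch of the thread-synchronization point above; the shared counter and lock object are illustrative assumptions, not from the text:

        using System;
        using System.Threading.Tasks;

        class SharedCounter
        {
            private readonly object _gate = new object(); // the monitor guarding shared state
            private int _count;

            public void Increment()
            {
                lock (_gate)   // only one thread can hold the monitor at a time
                {
                    _count++;  // the critical section touches shared state safely
                }
            }

            public int Count
            {
                get { lock (_gate) return _count; }
            }
        }

        class Program
        {
            static void Main()
            {
                var counter = new SharedCounter();
                // Many increments in parallel; without the lock the final count
                // would be unpredictable because of lost updates.
                Parallel.For(0, 10000, i => counter.Increment());
                Console.WriteLine(counter.Count); // 10000
            }
        }

    Deadlock avoidance follows the same idea in reverse: if every thread acquires multiple monitors in the same global order, the circular wait described above cannot form.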


  • CDN on Hosted Service in Windows Azure

    - by Shaun
Yesterday I told Wang Tao, an annoying colleague sitting beside me, how to enable the CDN for the static content in his website, which had just been published on Windows Azure. The approach would be: move the static content (the images, CSS files, etc.) into the blob storage; enable the CDN on his storage account; change the URLs of those static files to the CDN URL. I think these are the very common steps when using CDN. But this morning I found that the new Windows Azure SDK 1.4 and the new Windows Azure Developer Portal had just been announced at the Windows Azure Blog. One of the new features in this release is about the CDN, which means we can enable the CDN not only for a storage account, but for a hosted service as well. With this new feature the steps I mentioned above become much simpler.

    Enable CDN for Hosted Service

    To enable the CDN for a hosted service we just need to log on to the Windows Azure Developer Portal. Under the "Hosted Services, Storage Accounts & CDN" item we will find a new menu on the left-hand side labeled "CDN", where we can manage the CDN for storage accounts and hosted services. As we can see, the hosted services and storage accounts are all listed in my subscriptions. Enabling a CDN for a hosted service is very simple: just select a hosted service and click the New Endpoint button on top. In this dialog we can select the subscription and the storage account, or the hosted service, for which we want the CDN enabled. If we selected the hosted service, like I did in the image above, the "Source URL for the CDN endpoint" will be shown automatically. This means the Windows Azure platform will treat all content under the "/cdn" folder as CDN enabled. But we cannot change the value at the moment. The 3 checkboxes next to the URL are:

    Enable CDN: enable or disable the CDN.
    HTTPS: check it if we need to use HTTPS connections.
    Query String: check it if we are caching content from a hosted service and we are using query strings to specify the content to be retrieved.

    Just click the "Create" button to let Windows Azure create the CDN for our hosted service. The CDN would be available within 60 minutes, as Microsoft mentioned. In my experience it could be used after about 15 minutes, and we can find the CDN URL in the portal as well.

    Put the Content in CDN in Hosted Service

    Let's create a simple Windows Azure project in Visual Studio with an MVC 2 Web Role. When we created the CDN mentioned above, the source URL of the CDN endpoint was under the "/cdn" folder. So in Visual Studio we create a folder under the website named "cdn" and put some static files there. Then all these files will be cached by the CDN if we use the CDN endpoint. The CDN of the hosted service can also cache a kind of "dynamic" result with the Query String feature enabled. We create a controller named CdnController and a GetNumber action in it. The routed URL of this controller would be /Cdn/GetNumber, which can be CDN-ed as well since the URL says it's under the "/cdn" folder. In the GetNumber action we just put a number value, specified by a parameter, into the view model, so the URL could be like /Cdn/GetNumber?number=2.
        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.Mvc;

        namespace MvcWebRole1.Controllers
        {
            public class CdnController : Controller
            {
                //
                // GET: /Cdn/

                public ActionResult GetNumber(int number)
                {
                    return View(number);
                }
            }
        }

    And we add a view to display the number, which is super simple.

        <%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<int>" %>

        <asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
            GetNumber
        </asp:Content>

        <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
            <h2>The number is: <%: Model.ToString() %></h2>
        </asp:Content>

    Since this action is under the CdnController, the URL is under the "/cdn" folder, which means it can be CDN-ed. And since we checked "Query String", the content of this dynamic page will be cached by its query string. So if I use the CDN URL, http://az25311.vo.msecnd.net/GetNumber?number=2, the CDN will first check if there's any content cached with the key "GetNumber?number=2". If yes, the CDN will return the content directly; otherwise it will connect to the hosted service, http://aurora-sys.cloudapp.net/Cdn/GetNumber?number=2, send the result back to the browser, and cache it in the CDN. But note that the query string is treated as a plain string when used as the key of a CDN element. This means the URLs below would be cached as 2 separate elements in the CDN:

    http://az25311.vo.msecnd.net/GetNumber?number=2&page=1
    http://az25311.vo.msecnd.net/GetNumber?page=1&number=2

    The final step is to upload the project onto Azure.

    Test the Hosted Service CDN

    After publishing the project on Azure, we can use the CDN in the website. The CDN endpoint we had created is az25311.vo.msecnd.net, so all files under the "/cdn" folder can be requested with it. Let's have a try on the sample.htm and c_great_wall.jpg static files. We can also request the dynamic page GetNumber, with the query string, through the CDN endpoint. And if we refresh this page it will be shown very quickly, since the content comes from the CDN without MVC server-side processing. The style of this page was missing, though. This is because the CSS file was not included in the "/cdn" folder, so the page cannot retrieve the CSS file from the CDN URL.

    Summary

    In this post I introduced the new feature in Windows Azure CDN that came with the release of Windows Azure SDK 1.4 and the new Developer Portal. With the CDN of the hosted service we can just put the static resources under a "/cdn" folder so that the CDN can cache them automatically, with no need to put them into the blob storage. It also supports caching dynamic content with the Query String feature, so we can cache some parts of the web page by using a user control and the CDN. For example, we can cache the logon user control in the master page so that the logon part will load super-fast. There are some other new features within this release you can find here. And for more detailed information about the Windows Azure CDN please have a look here as well.

    Hope this helps, Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.


  • Using Rich Text Editor (WYSIWYG) in ASP.NET MVC

    - by imran_ku07
Introduction:

    In the ASP.NET MVC forum I found some questions regarding a sample HTML Rich Text Box Editor (also known as WYSIWYG). So I decided to create a sample ASP.NET MVC web application which will use a Rich Text Box Editor. There are a lot of HTML editors available, but for creating a sample application I decided to use the cross-browser WYSIWYG editor from openwebware. In this article I will discuss what changes are needed to make this editor work with ASP.NET MVC. I have also attached the sample application for download at http://www.speedfile.org/155076. Also note that I will only show the important features, not discuss every feature in detail.

    Description:

    So let's start creating a sample ASP.NET MVC application. You need to add the following script files:

        jquery-1.3.2.min.js
        jquery_form.js
        wysiwyg.js
        wysiwyg-settings.js
        wysiwyg-popup.js

    Just put these files inside the Scripts folder. Also put wysiwyg.css in your Content folder and add the following folders to your project:

        addons
        popups

    Also create an empty folder named Uploads to store the uploaded images. Next open wysiwyg.js and set your configuration:

        // Images Directory
        this.ImagesDir = "/addons/imagelibrary/images/";

        // Popups Directory
        this.PopupsDir = "/popups/";

        // CSS Directory File
        this.CSSFile = "/Content/wysiwyg.css";

    Next create a simple view, TextEditor.aspx, inside the Views/Home folder and add the following HTML:

        <%@ Page Language="C#" Inherits="System.Web.Mvc.ViewPage" %>

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html>
            <head runat="server">
                <title>TextEditor</title>
                <script src="../../Scripts/wysiwyg.js" type="text/javascript"></script>
                <script src="../../Scripts/wysiwyg-settings.js" type="text/javascript"></script>
                <script type="text/javascript">
                    WYSIWYG.attach('text', full);
                </script>
            </head>
            <body>
                <% using (Html.BeginForm()){ %>
                    <textarea id="text" name="test2" style="width:850px;height:200px;">
                    </textarea>
                    <input type="submit" value="submit" />
                <%} %>
            </body>
        </html>

    Here I have just added a textarea control and a submit button inside a form.
Note that the id of the textarea and the first parameter of the WYSIWYG.attach function are the same. Next to watch is HomeController.cs:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web;
        using System.Web.Mvc;
        using System.IO;

        namespace HtmlTextEditor.Controllers
        {
            [HandleError]
            public class HomeController : Controller
            {
                public ActionResult Index()
                {
                    ViewData["Message"] = "Welcome to ASP.NET MVC!";
                    return View();
                }

                public ActionResult About()
                {
                    return View();
                }

                public ActionResult TextEditor()
                {
                    return View();
                }

                [AcceptVerbs(HttpVerbs.Post)]
                [ValidateInput(false)]
                public ActionResult TextEditor(string test2)
                {
                    Session["html"] = test2;
                    return RedirectToAction("Index");
                }

                public ActionResult UploadImage()
                {
                    if (Request.Files[0].FileName != "")
                    {
                        Request.Files[0].SaveAs(Server.MapPath("~/Uploads/" + Path.GetFileName(Request.Files[0].FileName)));
                        return Content(Url.Content("~/Uploads/" + Path.GetFileName(Request.Files[0].FileName)));
                    }
                    return Content("a");
                }
            }
        }

    Very simple code: it just saves the posted HTML into the Session. Here the parameter of the TextEditor action is test2, which is the same as the textarea control's name in the TextEditor.aspx view. Also note that ValidateInput is false, so it's up to you to defend against XSS. There is also an action method which simply saves the uploaded file inside the Uploads folder.

    I am uploading the file using the jQuery Form Plugin. Here is the code, which is found in insert_image.html inside the addons folder:

        function ChangeImage() {
            var myform = document.getElementById("formUpload");
            $(myform).ajaxSubmit({ success: function(responseText) {
                insertImage(responseText);
                window.close();
              }
            });
        }

    And here is the Index view, which simply renders the HTML of the editor that was saved in Session:

        <%@ Page Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage" %>
        <asp:Content ID="indexTitle" ContentPlaceHolderID="TitleContent" runat="server">
            Home Page
        </asp:Content>
        <asp:Content ID="indexContent" ContentPlaceHolderID="MainContent" runat="server">
            <h2><%= Html.Encode(ViewData["Message"]) %></h2>
            <p>
                To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>.
            </p>
            <%if (Session["html"] != null){
                Response.Write(Session["html"].ToString());
            } %>
        </asp:Content>

    Summary:

    Hopefully you will enjoy this article. Just download the code and see the effect. From a security point of view, you must handle the XSS attack yourself. I have uploaded the sample application at http://www.speedfile.org/155076


  • Get more information about the crash

    - by Tito
When I issue the command last in my terminal I see the following entries, i.e. "crash". How can I find out more information about the crash? Is there any system dump file generated, or a backtrace file, that I can see or analyze?

        reboot   system boot  3.2.0-32-generic Mon Nov 12 23:58 - 09:13  (09:14)
        tito     pts/1        192.168.26.5     Mon Nov 12 23:56 - crash  (00:01)
        tito     pts/4        192.168.26.5     Mon Nov 12 22:46 - crash  (01:12)


  • SQL SERVER – Maximize Database Performance with DB Optimizer – SQL in Sixty Seconds #054

    - by Pinal Dave
Performance tuning is an interesting concept, and everybody evaluates it differently. Every developer and DBA has a different opinion about how one can do performance tuning. I personally believe performance tuning is a three-step process:

    1. Understanding the Query
    2. Identifying the Bottleneck
    3. Implementing the Fix

    When we are working with a large database application and it suddenly starts to slow down, we are all under stress about how we can get the database back to normal speed. Most of the time we do not have enough time to do a deep analysis of what is going wrong, as well as what will fix the problem. Our primary goal at that time is just to fix the database problem as fast as we can. However, one very important thing we need to keep in mind is that a quick fix should not create any further issues with other parts of the system. When time is of the essence and we still want a deep analysis of our system to give us the best solution, we often tend to make mistakes, as we do not have proper time to analyze the entire system.

    Here is what I do when I face such a situation – I take the help of DB Optimizer. It is a fantastic tool and does superlative performance tuning of the system. Every time I talk about a performance tuning tool, the initial reaction of people is that they do not want to try it, as they believe it requires a lot of learning before they can use it. That is absolutely not true in the case of DB Optimizer. It is a very easy-to-use and intuitive tool. One can get going with the product in no time. Here is a quick video I have built where I demonstrate how we can identify what index is missing for a query and how we can quickly create the index. All three steps of the query tuning are completed in less than 60 seconds. If you are into performance tuning and query optimization you should download DB Optimizer and give it a go.

    Let us see the same concept in the following SQL in Sixty Seconds video: You can download DB Optimizer and reproduce the same Sixty Seconds experience.

    Related Tips in SQL in Sixty Seconds: Performance Tuning – Part 1 of 2 – Getting Started and Configuration Performance Tuning – Part 2 of 2 – Analysis, Detection, Tuning and Optimizing What would you like to see in the next SQL in Sixty Seconds video?

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Interview Questions and Answers, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video Tagged: Identity


  • Simple way of converting server side objects into client side using JSON serialization for asp.net websites

    - by anil.kasalanati
Introduction:

    With the growth of Web 2.0 and the need for a faster user experience, the spotlight has shifted onto JavaScript-based applications built using the REST pattern or the ASP.NET AJAX PageRequestManager. And when we are working with JavaScript, wouldn't it be much better if we could create objects in an OOAD way and easily push them to the client side? Following are the reasons why you would push server-side objects onto the client side:

    - Easy availability of the complex object.
    - Use the C# compiler and rich IntelliSense to create and maintain the objects but use them in the JavaScript. You could run code analysis etc.
    - Reduce the number of calls we make to the server side by loading data on page load.

    I would like to explain the 3rd point because it proved to be highly beneficial to me when I was fixing the performance issues of a major website. There could be a scenario wherein you are making multiple AJAX-based WebRequestManager calls in order to get the same response in a single page. This happens in the case of a widget-based framework, when all the widgets are independent but need some common information available in the framework to load their data. So instead of making multiple calls we could load the data needed during page load. The above picture shows the scenario wherein all the widgets need the common information and then call the GetData web service on the server side. Of course the result can be cached on the client side, but a better solution would be to avoid the call completely. In order to do that we need to JSON-serialize the content and send it in the DOM.

    Example:

    I have developed a simple application to demonstrate the idea, and I will explain it in detail here. The class called SimpleClass would be sent as serialized JSON to the client side. It inherits from the base class, which has the implementation of the GetJSONString method. You can create a single base class, and all the objects which need to be pushed to the client side can inherit from that class. The important thing to note is that the class should be annotated with the DataContract attribute and the members should have the DataMember attribute. This is needed by the .NET DataContractSerializer, which follows the opt-in mode: if you want to send a property to the client side then you need to annotate it with the DataMember attribute. So if I didn't want to send the Result I would simply remove the DataMember attribute. This is default WCF/.NET 3.5 stuff, but it provides the flexibility of having a full-fledged object on the server side while sending a smaller object to the client side. Sometimes you may hide some values due to security constraints. Another thing you will notice is that I have marked the class as Serializable so that it can be stored in the Session and used in web-farm deployment scenarios.
Following is the implementation of the base class. It uses the default DataContractJsonSerializer; for more information or customization refer to the following blogs:

    http://softcero.blogspot.com/2010/03/optimizing-net-json-serializing-and-ii.html
    http://weblogs.asp.net/gunnarpeipman/archive/2010/12/28/asp-net-serializing-and-deserializing-json-objects.aspx

    The next part is pretty simple: I just need to inject this object into the aspx page. And in the aspx markup I have the following lines:

        <script type="text/javascript">
            var data = (<%= SimpleClassJSON %>);
            alert(data.ResultText);
        </script>

    This will output the content as JSON into the variable data, and this can be any element in the DOM. You can verify the element by checking data in the Firebug console.

    Design considerations:

    - If you have a lot of JavaScript then you need to think about using Script#, with which you can write JavaScript in C#. Refer to Nikhil's blog – http://projects.nikhilk.net/ScriptSharp
    - Ensure that you are taking security into consideration while exposing server-side objects on the client side. I have seen applications exposing passwords and secret keys, so it is not a good practice.

    The application can be tested using the following URL – http://techconsulting.vpscustomer.com/Samples/JsonTest.aspx
    The source code is available at http://techconsulting.vpscustomer.com/Source/HistoryTest.zip
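    The actual listings for SimpleClass and its base class appear to have been images in the original post and did not survive in this copy. As a rough sketch of the pattern the author describes — an opt-in DataContract class inheriting a GetJSONString helper built on DataContractJsonSerializer — something along these lines would work (the class and member names follow the prose; the method body is my assumption):

        using System;
        using System.IO;
        using System.Runtime.Serialization;
        using System.Runtime.Serialization.Json;
        using System.Text;

        [Serializable]
        [DataContract]
        public abstract class JsonSerializableBase
        {
            // Serialize the concrete subclass to a JSON string using the
            // opt-in DataContract/DataMember annotations described above.
            public string GetJSONString()
            {
                var serializer = new DataContractJsonSerializer(GetType());
                using (var stream = new MemoryStream())
                {
                    serializer.WriteObject(stream, this);
                    return Encoding.UTF8.GetString(stream.ToArray());
                }
            }
        }

        [Serializable]
        [DataContract]
        public class SimpleClass : JsonSerializableBase
        {
            [DataMember]
            public string ResultText { get; set; }

            // No DataMember attribute: this property is never sent to the client.
            public string Result { get; set; }
        }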


  • Oracle Database to SQL Server Comparisons

    One of the initial obstacles a database administrator encounters is learning where features of his/her system live or reside on a less familiar system. Steve Callan approaches this feature comparison by taking SQL Server and mapping its features back into Oracle.


  • Fidelity Investments Life Insurance Executive Weighs in on Policy Administration Modernization

    - by helen.pitts(at)oracle.com
James Klauer, vice president of Client Services Technology at Fidelity Investment Life Insurance, weighs in on the rationale and challenges associated with policy administration system replacement in this month's digital issue of Insurance Networking News. In "The Policy Administration Replacement Quandary" Klauer shared the primary business benefit that can be realized by adopting a modern policy administration system--a timely topic given that recent industry analyst surveys indicate policy administration replacement and modernization will continue to be a top priority for insurers this year. "Modern policy administration systems are more flexible than systems of the past," Klauer says in the article. "This has allowed us to shorten our delivery time for new products and product changes. We have also had a greater ability to integrate with other systems and to deliver process efficiencies." Klauer goes on to advise that insurers ensure they have a solid understanding of the requirements when replacing their legacy policy administration system. "If you can afford the time, take the opportunity to re-engineer your business processes. We were able to drastically change our death processes, introducing automation and error-proofing." Click here to read more of Klauer's insights and recommendations for best practices in the publication's "Ask & Answered" column. You can also learn more about the benefits of an adaptive, rules-driven approach to policy administration, and how to mitigate risks associated with system replacement, by attending the free Oracle Insurance Virtual Summit: Fueling the Adaptive Insurance Enterprise, 10:00 a.m. - 6:00 p.m. EST, Wednesday, January 26. Insurance Networking News and Oracle Insurance have teamed up to bring you this first-of-its-kind event. This year's theme, "Fueling the Adaptive Insurance Enterprise," will focus on bringing you information about exciting new technology concepts, which can help your company react more quickly to new market opportunities and, ultimately, grow the business. Visit virtual booths and chat online with Oracle product specialists, network with other insurers, learn about exciting new product announcements, win prizes, and much more--all without leaving your office. Be sure to attend the on-demand session, "Adapt, Transform and Grow: Accelerate Speed to Market with Adaptive Insurance Policy Administration," hosted by Kate Fowler, product strategy director for Oracle Insurance Policy Administration for Life and Annuity. Register Now! Helen Pitts is senior product marketing manager for Oracle Insurance's life and annuities solutions.


  • Simple Excel Export with EPPlus

    - by Jesse Taber
Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/10/30/simple-excel-export-with-epplus.aspx

    Anyone I've ever met who works with an application that sits in front of a lot of data loves it when they can get that data exported to an Excel file for them to mess around with offline. As both developer and end user of a little website project that I've been working on, I found myself wanting to be able to get a bunch of the data that the application was collecting into an Excel file. The great thing about being both an end user and a developer on a project is that you can build the features that you really want! While putting this feature together I came across the fantastic EPPlus library. This library is certainly very well known and popular, but I was so impressed with it that I thought it was worth a quick blog post. This library is extremely powerful; it lets you create and manipulate Excel 2007/2010 spreadsheets in .NET code with a high degree of flexibility. My only gripe with the project is that they are not touting how insanely easy it is to build a basic Excel workbook from a simple data source. If I were running this project, the approach I'm about to demonstrate in this post would be front and center on the landing page, because it shows how easy it really is to get started and serves as a good way to ease yourself in to some of the more advanced features.

    The website in question uses RavenDB, which means that we're dealing with POCOs to model the data throughout all layers of the application. I love working like this, so when it came time to figure out how to export some of this data to an Excel spreadsheet I wanted to find a way to take an IEnumerable<T> and just have it dumped to Excel, with each item in the collection being modeled as a single row in the Excel worksheet. Consider the following class:

        public class Employee
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public decimal HourlyRate { get; set; }
            public DateTime HireDate { get; set; }
        }

    Now let's say we have a collection of these represented as an IEnumerable<Employee> and we want to be able to output it to an Excel file for offline querying/manipulation. As it turns out, this is dead simple to do with EPPlus. Have a look:

        public void ExportToExcel(IEnumerable<Employee> employees, FileInfo targetFile)
        {
            using (var excelFile = new ExcelPackage(targetFile))
            {
                var worksheet = excelFile.Workbook.Worksheets.Add("Sheet1");
                worksheet.Cells["A1"].LoadFromCollection(Collection: employees, PrintHeaders: true);
                excelFile.Save();
            }
        }

    That's it. Let's break down what's going on here:

    1. Create an ExcelPackage to model the workbook (Excel file). Note that the 'targetFile' value here is a FileInfo object representing the location on disk where I want the file to be saved.
    2. Create a worksheet within the workbook.
    3. Get a reference to the top-leftmost cell (addressed as A1) and invoke the 'LoadFromCollection' method, passing it our collection of Employee objects. Behind the scenes this is reflecting over the properties of the type provided and pulling out any public members to become columns in the resulting Excel output. The 'PrintHeaders' parameter tells EPPlus to grab the name of the property and put it in the first row.
    4. Save the Excel file.

    All of the heavy lifting here is being done by the 'LoadFromCollection' method, and that's a good thing. Now, this was really easy to do, but it has some limitations. Using this approach you get a very plain, un-styled Excel worksheet.
The column widths are all set to the default. The number format for all cells is ‘General’ (which proves particularly interesting if you have a DateTime property in your data source). I’m a “no frills” guy, so I wasn’t bothered at all by trading off simplicity for style and formatting. That said, EPPlus has tons of samples that you can download that illustrate how to apply styles and formatting to cells and a ton of other advanced features that are way beyond the scope of this post.
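    If that 'General' format on the DateTime column bothers you, a couple of extra lines before saving fix it. A small sketch extending the ExportToExcel body above — the column index and the date format string are my assumptions (HireDate being the fourth public property on Employee):

        using (var excelFile = new ExcelPackage(targetFile))
        {
            var worksheet = excelFile.Workbook.Worksheets.Add("Sheet1");
            worksheet.Cells["A1"].LoadFromCollection(Collection: employees, PrintHeaders: true);

            // HireDate is the fourth public property, so it lands in column D.
            worksheet.Column(4).Style.Numberformat.Format = "yyyy-mm-dd";

            // Size each column to its contents instead of the default width.
            worksheet.Cells[worksheet.Dimension.Address].AutoFitColumns();

            excelFile.Save();
        }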


  • Programming pattern to flatten deeply nested ajax callbacks?

    - by chiborg
I've inherited JavaScript code where the success callback of an Ajax handler initiates another Ajax call, whose success callback may or may not initiate yet another Ajax call. This leads to deeply nested anonymous functions. Maybe there is a clever programming pattern that avoids the deep nesting and is more DRY.

        jQuery.extend(Application.Model.prototype, {
            process: function() {
                jQuery.ajax({
                    url: myurl1,
                    dataType: 'json',
                    success: function(data) {
                        // process data, then send it back
                        jQuery.ajax({
                            url: myurl2,
                            dataType: 'json',
                            success: function(data) {
                                if (!data.ok) {
                                    jQuery.ajax({
                                        url: myurl2,
                                        dataType: 'json',
                                        success: mycallback
                                    });
                                } else {
                                    mycallback(data);
                                }
                            }
                        });
                    }
                });
            }
        });


  • ERP in a Flash! Latest News on JD Edwards and Oracle VM Templates

    - by Kem Butller-Oracle
Oracle Announces the Availability of Oracle VM Templates for JD Edwards EnterpriseOne 9.1 Update 2 and Tools 9.1 Update 4.4

    Continuing the commitment to rapid and predictable deployments of JD Edwards EnterpriseOne, Oracle announces the general availability of Oracle VM templates for JD Edwards EnterpriseOne Application release 9.1 Update 2 and Tools release 9.1 Update 4.4. These templates can be used with Oracle VM for x86, on the Oracle Exalogic Elastic Cloud, and on the Oracle Database Machine. Oracle VM Templates for JD Edwards EnterpriseOne accelerate the process of setting up a working environment compared to the traditional installation process. The templates can be a key component of a well-managed cloud infrastructure, allowing system administrators to quickly provision fully functional JD Edwards EnterpriseOne environments for evaluation, development, or production use. The templates contain preconfigured images of the major JD Edwards EnterpriseOne server components, including:

    • Enterprise server
    • HTML server
    • Database server
    • BI Publisher (for use with One View Reporting)
    • Business Services Server and ADF Runtime (for use with Mobile Smartphone Applications)
    • Application Interface Services (new with this release, for use with Mobile Enterprise Applications)
    • Server Manager (new with this release)

    The virtual server images are built on a complete Oracle technology stack, including Oracle VM for x86, Oracle Linux, Oracle WebLogic Server, Oracle Database, and Oracle Business Intelligence Publisher. The templates can be installed into an Oracle VM for x86 system running on standard x86 servers, the Oracle Exalogic Elastic Cloud, and the Oracle Database Appliance as a composite "all-in-one" system. The database can be deployed as a fully preconfigured VM template, or it can be deployed to a preexisting database server, for example, the Oracle Exadata Database Machine or the Oracle Database Appliance. This latest set of templates includes the following applications and technology components:

    • JD Edwards EnterpriseOne Applications Release 9.1 Update 2 with ESUs as of April 8, 2014
    • JD Edwards EnterpriseOne Tools 9.1 Update 4, maintenance pack 4 (9.1.4.4)
    • Oracle Database 12c (12.1.0.1)
    • Oracle WebLogic Server 12c (12.1.2)
    • Oracle Linux 5 Update 8, 64-bit
    • Oracle Business Intelligence Publisher 11.1.1.7.1, for use with JD Edwards EnterpriseOne One View Reporting
    • JD Edwards EnterpriseOne Business Services Server and Oracle Application Development Framework (ADF) 11.1.1.5, for use with the JD Edwards EnterpriseOne Mobile Applications

    The delivery also includes a JD Edwards EnterpriseOne deployment server preconfigured to match the content of the templates. This edition of the templates also includes enhanced configuration utilities that greatly simplify the process of configuring the templates for deployment into a running system. The templates are immediately available for download from the Oracle Software Delivery Cloud.

    For more information see:
    • My Oracle Support article 884592.1
    • Oracle Technology Network


  • SQL University: Parallelism Week - Part 2, Query Processing

    - by Adam Machanic
Welcome back for the second part of Parallelism Week here at SQL University. Get your pencils ready, and make sure to raise your hand if you have a question. Last time we covered the necessary background material to help you understand how the SQL Server Operating System schedules its many active threads, and the differences between its behavior and that of the Windows operating system's scheduler. We also discussed some of the variations on the theme of parallel processing. Today we'll take a look...(read more)


  • Deployment of SQL compact Edition (SDF files) using Setup project

    - by Emad
Hi, I have a C#.NET desktop application using SQL Compact Edition as the data store. The application should be usable by any user on the machine, and all should be seeing the same data (data should not differ per user). I am wondering where I should deploy the SDF file. The user's personal data folder (My Documents) means each user will have a separate database. Deploying to the same folder as the application causes Vista to copy the file to \Users\AppData\Local\VirtualStore\, and it seems to make different copies for each user. Where is it best to deploy the SDF file to ensure all users are looking at the same data?
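    One common way to get a single machine-wide database is to place the SDF under the shared application-data folder rather than the install folder or a per-user folder. A minimal C# sketch of the idea — the app folder name and file name are illustrative:

        using System;
        using System.Data.SqlServerCe;
        using System.IO;

        class DataStore
        {
            static SqlCeConnection OpenShared()
            {
                // CommonApplicationData maps to C:\ProgramData on Vista and later,
                // and is shared by all users on the machine (unlike My Documents
                // or the VirtualStore copies made for files under Program Files).
                string folder = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
                    "MyApp");
                Directory.CreateDirectory(folder); // no-op if it already exists

                string dbFile = Path.Combine(folder, "store.sdf");
                return new SqlCeConnection("Data Source=" + dbFile);
            }
        }

    Standard users may still need write permission on that folder, which an installer typically grants once at install time.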


  • SQL SERVER – Powershell – Importing CSV File Into Database – Video

    - by pinaldave
Laerte Junior is my very dear friend and PowerShell expert. On my request he has agreed to share PowerShell knowledge with us. Laerte Junior is a SQL Server MVP and, through his technology blog and simple-talk articles, an active member of the Microsoft community in Brasil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming, with over 8 years of hands-on experience. He holds a degree in Computer Science, has been awarded a number of certifications (including MCDBA), and is an expert in SQL Server 2000 / SQL Server 2005 / SQL Server 2008 technologies. Let us read the blog post in his own words.

    I was reading an excellent post from my great friend Pinal about loading data from CSV files, SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video, to SQL Server, and was honored to write another guest post on SQL Authority about the magic of PowerShell. The biggest stuff at TechEd NA this year was PowerShell. Fellows, if you still don't know about it, it is better to run. Remember that the Core Servers for SQL Server are the future, and consequently the Shell. You don't want to be out of this, right?

    Let's see some PowerShell magic now. To start our tour, first we need to download these two functions from PowerShell and SQL Server Master Jedi Chad Miller: Out-DataTable and Write-DataTable. Save them in a module and add it to your profile. In my case, the module is called functions.psm1. To have some data to play with, I created 10 CSV files with the same content. I just put the SQL Server errorlog into a CSV file and created 10 copies of it.

        # Just create a CSV with data to import, using the SQL errorlog
        [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
        $ServerInstance = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $Env:Computername
        $ServerInstance.ReadErrorLog() | Export-Csv -Path "c:\SQLAuthority\ErrorLog.csv" -NoTypeInformation
        for ($Count = 1; $Count -le 10; $Count++)
        {
            Copy-Item "c:\SQLAuthority\Errorlog.csv" "c:\SQLAuthority\ErrorLog$($Count).csv"
        }

    Now in my path c:\sqlauthority, I have 10 CSV files. Now it is time to create a table. In my case, the SQL Server is called R2D2, the database is SQLServerRepository, and the table is CSV_SQLAuthority.

        CREATE TABLE [dbo].[CSV_SQLAuthority](
            [LogDate] [datetime] NULL,
            [Processinfo] [varchar](20) NULL,
            [Text] [varchar](MAX) NULL
        )

    Let's play a little bit. I want to import all CSV files from the path into the table synchronously:

        # Importing synchronously
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance R2D2 -Database SQLServerRepository -TableName CSV_SQLAuthority -Data $DataTable

    Very cool, right? Let's do it asynchronously and in the background using PowerShell jobs:

        # If you want to do it all asynchronously
        Start-Job -Name 'ImportingAsynchronously' `
                  -InitializationScript { Ipmo Functions -Force -DisableNameChecking } `
                  -ScriptBlock {
                      $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
                      $DataTable = Out-DataTable -InputObject $DataImport
                      Write-DataTable -ServerInstance "R2D2" `
                                      -Database "SQLServerRepository" `
                                      -TableName "CSV_SQLAuthority" `
                                      -Data $DataTable
                  }

    Oh, but what if I have CSV files that are large in size and I want to import each one asynchronously?
In this case, this is what should be done:

        Get-ChildItem "c:\SQLAuthority\*.csv" | % {
            Start-Job -Name "$($_)" `
                      -InitializationScript { Ipmo Functions -Force -DisableNameChecking } `
                      -ScriptBlock {
                          $DataImport = Import-Csv -Path $args[0]
                          $DataTable = Out-DataTable -InputObject $DataImport
                          Write-DataTable -ServerInstance "R2D2" `
                                          -Database "SQLServerRepository" `
                                          -TableName "CSV_SQLAuthority" `
                                          -Data $DataTable
                      } -ArgumentList $_.fullname
        }

    How cool is that? Let's do the fun stuff now: let's schedule it in a SQL Server Agent job. If you are using SQL Server 2012, you can use the PowerShell job step. Otherwise you need to use a CmdExec job step calling PowerShell.exe. We will use the second option. First, create a ps1 file called ImportCSV.ps1 with the script above and save it in a path. In my case, it is c:\temp\automation. Just add the following lines at the end:

        Get-Job | Wait-Job | Out-Null
        Remove-Job -State Completed

    Why? See my post Dooh PowerShell Trick–Running Scripts That has Posh Jobs on a SQL Agent Job. Remember, this trick is for ALL scripts that use PowerShell jobs with any kind of scheduling tool (SQL Server Agent, Windows Scheduler). Create a job called ImportCSV and a step called Step_ImportCSV, and choose CmdExec. Then you just need to schedule it or run it. I made a short video (with matching good background music) and you can see it at: That's it guys. C'mon, join me in the #PowerShellLifeStyle. You will love it. If you want to check what we can do with PowerShell and SQL Server, don't miss Laerte Junior's LiveMeeting on July 18. You can find more information at: LiveMeeting VC PowerShell PASS–Troubleshooting SQL Server With PowerShell–English

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology, Video Tagged: Powershell


  • Play video in UIWebView from NSData

    - by Papagalli
Hello there, does anybody know how to play a video in a UIWebView? The video comes from the iPhone library; when I pick it, I store it in Core Data as binary data. If I use the local URL of the video it works fine; the problem I have is that this URL refers to a folder named "tmp", and I don't know its lifetime. If I can get the real URL of the video file it will be OK.

        // attachment is the Core Data object containing the data
        NSURL *url = [NSURL URLWithString:attachment.path];
        [self.webView loadRequest:[NSURLRequest requestWithURL:url]];

    I tried a solution that doesn't work (I tried the MIME types "video/quicktime" and "video/mp4"):

        [self.webView loadData:attachment.data MIMEType:@"video/mov" textEncodingName:nil baseURL:nil];

    I think the second solution is the closest, and the problem comes from the MIMEType. Thanks in advance if anybody has a solution for this :)


  • Benefits of PerformancePoint Services Using SharePoint Server 2010

    - by Wayne
What is PerformancePoint Services?

    Most of the time it happens that the metrics that make up your key performance indicators are not simple values from a data source. In SharePoint Server 2007 PerformancePoint Services, you could create two kinds of KPI metrics: simple single-value metrics from any supported data source, or complex multiple-value metrics from a single Analysis Services data source using MDX. Now things are even easier with PerformancePoint Services in SharePoint 2010. Let us check what it is.

    PerformancePoint Services in SharePoint Server 2010 is a performance management service that you can use to monitor and analyze your business. By providing flexible, easy-to-use tools for building dashboards, scorecards, reports, and key performance indicators (KPIs), PerformancePoint Services can help everyone across an organization make informed business decisions that align with companywide objectives and strategy. Scorecards, dashboards, and KPIs help drive accountability. Integrated analytics help employees move quickly from monitoring information to analyzing it and, when appropriate, sharing it throughout the organization. Prior to the addition of PerformancePoint Services to SharePoint Server, Microsoft Office PerformancePoint Server 2007 functioned as a standalone server. Now PerformancePoint functionality is available as an integrated part of the SharePoint Server Enterprise license, as is the case with Excel Services in Microsoft SharePoint Server 2010. The popular features of earlier versions of PerformancePoint Services are preserved along with numerous enhancements and additional functionality.

    New PerformancePoint Services features

    PerformancePoint Services can now utilize SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. Dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework.

    New features and enhancements of SharePoint 2010 PerformancePoint Services:

    • With PerformancePoint Services functioning as a service in SharePoint Server, dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework. The new architecture also takes advantage of SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. You can also include and link PerformancePoint Services Web Parts with other SharePoint Server Web Parts on the same page. The new architecture also streamlines security models that simplify access to report data.

    • The Decomposition Tree is a new visualization report type available in PerformancePoint Services. You can use it to quickly and visually break down higher-level data values from a multi-dimensional data set to understand the driving forces behind those values. The Decomposition Tree is available in scorecards and analytic reports and ultimately in dashboards.

    • You can access more detailed business information with improved scorecards. Scorecards have been enhanced to make it easy for you to drill down and quickly access more detailed information. PerformancePoint scorecards also offer more flexible layout options, dynamic hierarchies, and calculated KPI features. Using this enhanced functionality, you can now create custom metrics that use multiple data sources. You can also sort, filter, and view variances between actual and target values to help you identify concerns or risks.
• Better Time Intelligence filtering capabilities that you can use to create and use dynamic time filters that are always up to date. Other improved filters increase the ability of dashboard users to quickly focus on the information that is most relevant.

    • Ability to include and link PerformancePoint Services Web Parts together with other PerformancePoint Services Web Parts on the same page.

    • Easier authoring and publishing of dashboard items by using Dashboard Designer.

    • SQL Server Analysis Services 2008 support.

    • Increased support for accessibility compliance in individual reports and scorecards.

    • The KPI Details report is a new report type that displays contextually relevant information about KPIs, metrics, rows, columns, and cells within a scorecard. The KPI Details report works as a Web Part that links to a scorecard or individual KPI to show relevant metadata to the end user in SharePoint Server. This Web Part can be added to PerformancePoint dashboards or any SharePoint Server page.

    • Create analytic reports to better understand the underlying business forces behind the results. Analytic reports have been enhanced to support value filtering, new chart types, and server-based conditional formatting.

    To conclude, PerformancePoint Services, by becoming tightly integrated with SharePoint Server 2010, takes advantage of many enterprise-level SharePoint Server 2010 features. Unfortunately, SharePoint Foundation 2010 doesn't include this feature. There are still many choices in the SharePoint family of products, including SharePoint Server 2010, SharePoint Foundation, SharePoint Server 2007, and associated free SharePoint web parts and templates.


  • Why is Evolution the default mail/calendar package?

    - by Android Eve
    Why is Evolution the default mail/calendar package that comes with Ubuntu? Why not Thunderbird + Lightning? Are there any features in Evolution that are not available in Thunderbird + Lightning? Can I use the Evolution database via a Samba network share, on a Windows XP or 7 client, just like I can do with Thunderbird? What happens if I uninstall Evolution from my 10.04 system? Will I lose any integrated functionality built into the system?


  • What are some useful things you can do with Mvc Modelbinders?

    - by George Mauer
It occurs to me that the ModelBinder mechanism in ASP.NET MVC

        public interface IModelBinder
        {
            object BindModel(System.Web.Mvc.ControllerContext controllerContext,
                             System.Web.Mvc.ModelBindingContext bindingContext);
        }

    is insanely powerful. What are some cool uses of this mechanism that you've done/seen? I guess since the concept is similar in other frameworks there's no reason to limit it to ASP.NET MVC.
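    For a concrete flavor of what the interface enables, here is a hedged sketch (my own example, not from the question) of a custom binder that turns a comma-separated query-string value like ?ids=1,2,3 into a strongly typed list parameter:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web.Mvc;

        // Binds "?ids=1,2,3" to an IList<int> parameter named "ids".
        public class CsvIntListBinder : IModelBinder
        {
            public object BindModel(ControllerContext controllerContext,
                                    ModelBindingContext bindingContext)
            {
                var value = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
                if (value == null || string.IsNullOrEmpty(value.AttemptedValue))
                    return new List<int>();

                return value.AttemptedValue
                            .Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries)
                            .Select(int.Parse)
                            .ToList();
            }
        }

        // Usage on an action parameter:
        // public ActionResult Show([ModelBinder(typeof(CsvIntListBinder))] IList<int> ids) { ... }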


  • Evolution Of High Definition TV Viewing

    - by Gopinath
The following guest post is written by Rob, who also blogs on entertainment technology topics at iwantsky.com. Gone are the days when you needed to squint to be able to see the emotions on the faces of Humphrey Bogart and Ingrid Bergman as the lovers bid each other adieu in the classic film Casablanca. These days, watching an ordinary ant painstakingly carry a leaf on Animal Planet can be an exhilarating experience, as you get to see not only the slightest movement but also the demarcation line between the insect's head, thorax and abdomen. The crystal-clear imagery was made possible by the sharp minds and the tinkering hands of the scientists that have designed the modern world's HDTV. What is HDTV, and what makes people so agog to have this new innovation in TV watching? HDTV stands for High Definition TV. Television viewing has indeed made a big leap. From the grainy black and whites, TV viewing moved to colored TVs, progressed to SD TVs and now to HDTV. HDTV is the emerging trend in TV viewing as it delivers bigger and clearer pictures and better audio. Viewers can have a cinema-like TV viewing experience right in the comfort of their own home. With HDTV the viewer is allowed a better viewing range. With Standard (SD) TV, the viewer has to be at a distance that is from 3 to 6 times the size of the screen. HDTV allows the viewer to enjoy sharper and clearer images, as it is possible to sit at a distance that is 1.5 to 3 times the size of the screen without noticing any image pixilation. Although HDTV appears to be a fairly new innovation, this system has actually existed in various forms years ago. Development of HDTV started in Europe as early as the 1940s. However, the NTSC and the PAL/SECAM, the two analog TV standards, became dominant and popular worldwide. The analog TV was replaced by the digital TV platform in the 1990s. Even during the analog era, attempts were made to develop HDTV. Japan came out with the MUSE system. However, due to channel bandwidth requirement concerns, the program was shelved. The entry of four organizations into the HDTV market spurred the development of a beneficial coalition: AT&T, ATRC, MIT and Zenith combined forces. In 1993, a Grand Alliance was formed, composed of researchers and HDTV manufacturers. A common standard for the broadcast system of HDTV was developed. In 1995, the system was tested and found successful. With the higher screen resolution of HDTV, viewing has never been more enjoyable. [Image courtesy: samsung] This article, titled Evolution Of High Definition TV Viewing, was originally published at Tech Dreams. Grab our rss feed or fan us on Facebook to get updates from us.


  • Populating DataGrid with SQLResult Air Flex

    - by Deyon
I've been beating myself up all day on this... I'm about to call it quits and get some Chinese food -=\ I'm selecting data from a local SQL DB.

        [Bindable]
        public var ac:ArrayCollection;

        public function select():void
        {
            statement.sqlConnection = conn;
            statement.text = "SELECT * FROM PROJECT";
            statement.execute();
            var res:SQLResult = statement.getResult();
            ac = new ArrayCollection(res.data);
            // So this traces out [object][object], so it works
            trace(ac);
        }

    If I do var myObj:Object = res.data[0]; I can trace myObj and view the data. But I don't know how to insert the data into a datagrid. mygrid.dataProvider = ac; does not work. I'm using Flash Builder 4. Help please...


  • Java EE 6 and NoSQL/MongoDB on GlassFish using JPA and EclipseLink 2.4 (TOTD #175)

    - by arungupta
TOTD #166 explained how to use MongoDB in your Java EE 6 applications. The code in that tip used the APIs exposed by the MongoDB Java driver, and so requires you to learn a new API. However if you are building Java EE 6 applications then you are already familiar with the Java Persistence API (JPA). EclipseLink 2.4, scheduled to release as part of Eclipse Juno, provides support for NoSQL databases by mapping a JPA entity to a document. Their wiki provides a complete explanation of how the mapping is done. This Tip Of The Day (TOTD) will show how you can leverage that support in your Java EE 6 applications deployed on GlassFish 3.1.2.

    Before we dig into the code, here are the key concepts:

    • A POJO is mapped to a NoSQL data source using @NoSQL or the <no-sql> element in "persistence.xml".
    • A subset of JPQL and Criteria queries are supported, based upon the underlying data store.
    • Connection properties are defined in "persistence.xml".

    Now, let's take a look at the code ...

    Download the latest EclipseLink 2.4 Nightly Bundle. There is an Installer, Source, and Bundle - make sure to download the Bundle link (20120410) and unzip. Download the GlassFish 3.1.2 zip and unzip.

    Install the EclipseLink 2.4 JARs in GlassFish. Remove the following JARs from "glassfish/modules":

        org.eclipse.persistence.antlr.jar
        org.eclipse.persistence.asm.jar
        org.eclipse.persistence.core.jar
        org.eclipse.persistence.jpa.jar
        org.eclipse.persistence.jpa.modelgen.jar
        org.eclipse.persistence.moxy.jar
        org.eclipse.persistence.oracle.jar

    Add the following JARs from the EclipseLink 2.4 nightly build to "glassfish/modules":

        org.eclipse.persistence.antlr_3.2.0.v201107111232.jar
        org.eclipse.persistence.asm_3.3.1.v201107111215.jar
        org.eclipse.persistence.core.jpql_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.core_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.jpa.jpql_2.0.0.v20120407-r11132.jar
        org.eclipse.persistence.jpa.modelgen_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.jpa_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.moxy_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.nosql_2.4.0.v20120407-r11132.jar
        org.eclipse.persistence.oracle_2.4.0.v20120407-r11132.jar

    Start MongoDB. Download the latest MongoDB from here (2.0.4 as of this writing). Create the default data directory for MongoDB as:

        sudo mkdir -p /data/db/
        sudo chown `id -u` /data/db

    Refer to the Quickstart for more details. Start MongoDB as:

        arungup-mac:mongodb-osx-x86_64-2.0.4 <arungup> ->./bin/mongod
        ./bin/mongod --help for help and startup options
        Mon Apr  9 12:56:02 [initandlisten] MongoDB starting : pid=3124 port=27017 dbpath=/data/db/ 64-bit host=arungup-mac.local
        Mon Apr  9 12:56:02 [initandlisten] db version v2.0.4, pdfile version 4.5
        Mon Apr  9 12:56:02 [initandlisten] git version: 329f3c47fe8136c03392c8f0e548506cb21f8ebf
        Mon Apr  9 12:56:02 [initandlisten] build info: Darwin erh2.10gen.cc 9.8.0 Darwin Kernel Version 9.8.0: Wed Jul 15 16:55:01 PDT 2009; root:xnu-1228.15.4~1/RELEASE_I386 i386 BOOST_LIB_VERSION=1_40
        Mon Apr  9 12:56:02 [initandlisten] options: {}
        Mon Apr  9 12:56:02 [initandlisten] journal dir=/data/db/journal
        Mon Apr  9 12:56:02 [initandlisten] recover : no journal files present, no recovery needed
        Mon Apr  9 12:56:02 [websvr] admin web console waiting for connections on port 28017
        Mon Apr  9 12:56:02 [initandlisten] waiting for connections on port 27017

    Check out the JPA/NoSQL sample from the SVN repository. The complete source code built in this TOTD can be downloaded here.
Create the Java EE 6 web app

Create a Java EE 6 Maven web app as:

    mvn archetype:generate -DarchetypeGroupId=org.codehaus.mojo.archetypes -DarchetypeArtifactId=webapp-javaee6 -DgroupId=model -DartifactId=javaee-nosql -DarchetypeVersion=1.5 -DinteractiveMode=false

Copy the model files from the checked-out workspace to the generated project as:

    cd javaee-nosql
    cp -r ~/code/workspaces/org.eclipse.persistence.example.jpa.nosql.mongo/src/model src/main/java

Copy "persistence.xml":

    mkdir src/main/resources
    cp -r ~/code/workspaces/org.eclipse.persistence.example.jpa.nosql.mongo/src/META-INF ./src/main/resources

Add the following dependencies:

    <dependency>
      <groupId>org.eclipse.persistence</groupId>
      <artifactId>org.eclipse.persistence.jpa</artifactId>
      <version>2.4.0-SNAPSHOT</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.eclipse.persistence</groupId>
      <artifactId>org.eclipse.persistence.nosql</artifactId>
      <version>2.4.0-SNAPSHOT</version>
    </dependency>
    <dependency>
      <groupId>org.mongodb</groupId>
      <artifactId>mongo-java-driver</artifactId>
      <version>2.7.3</version>
    </dependency>

The first one is for the latest EclipseLink APIs, the second one is for the EclipseLink/NoSQL support, and the last one is the MongoDB Java driver. And add the following repository:

    <repositories>
      <repository>
        <id>EclipseLink Repo</id>
        <url>http://www.eclipse.org/downloads/download.php?r=1&amp;nf=1&amp;file=/rt/eclipselink/maven.repo</url>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>

Copy "Test.java" to the generated project:

    mkdir src/main/java/example
    cp -r ~/code/workspaces/org.eclipse.persistence.example.jpa.nosql.mongo/src/example/Test.java ./src/main/java/example/

This file contains the source code to CRUD the JPA entity to MongoDB. This sample is explained in detail on the EclipseLink wiki.

Create a new Servlet in the "example" directory as:

    package example;

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    /**
     * @author Arun Gupta
     */
    @WebServlet(name = "TestServlet", urlPatterns = {"/TestServlet"})
    public class TestServlet extends HttpServlet {

        protected void processRequest(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            response.setContentType("text/html;charset=UTF-8");
            PrintWriter out = response.getWriter();
            try {
                out.println("<html>");
                out.println("<head>");
                out.println("<title>Servlet TestServlet</title>");
                out.println("</head>");
                out.println("<body>");
                out.println("<h1>Servlet TestServlet at " + request.getContextPath() + "</h1>");
                try {
                    Test.main(null);
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
                out.println("</body>");
                out.println("</html>");
            } finally {
                out.close();
            }
        }

        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            processRequest(request, response);
        }

        @Override
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            processRequest(request, response);
        }
    }

Build the project and deploy it as:

    mvn clean package
    glassfish3/bin/asadmin deploy --force=true target/javaee-nosql-1.0-SNAPSHOT.war

Accessing http://localhost:8080/javaee-nosql/TestServlet shows the following messages in the server.log:

    connecting(EISLogin(
      platform=> MongoPlatform
      user name=> ""
      MongoConnectionSpec()))
    . . .Connected: User: Database: 2.7  Version: 2.7
    . . .Executing MappedInteraction()
      spec => null
      properties => {mongo.collection=CUSTOMER, mongo.operation=INSERT}
      input => [DatabaseRecord(
        CUSTOMER._id => 4F848E2BDA0670307E2A8FA4
        CUSTOMER.NAME => AMCE)]
    . . .Data access result: [{TOTALCOST=757.0, ORDERLINES=[{DESCRIPTION=table, LINENUMBER=1, COST=300.0}, {DESCRIPTION=balls, LINENUMBER=2, COST=5.0}, {DESCRIPTION=rackets, LINENUMBER=3, COST=15.0}, {DESCRIPTION=net, LINENUMBER=4, COST=2.0}, {DESCRIPTION=shipping, LINENUMBER=5, COST=80.0}, {DESCRIPTION=handling, LINENUMBER=6, COST=55.0}, {DESCRIPTION=tax, LINENUMBER=7, COST=300.0}], SHIPPINGADDRESS=[{POSTALCODE=L5J1H7, PROVINCE=ON, COUNTRY=Canada, CITY=Ottawa, STREET=17 Jane St.}], VERSION=2, _id=4F848E2BDA0670307E2A8FA8, DESCRIPTION=Pingpong table, CUSTOMER__id=4F848E2BDA0670307E2A8FA7, BILLINGADDRESS=[{POSTALCODE=L5J1H8, PROVINCE=ON, COUNTRY=Canada, CITY=Ottawa, STREET=7 Bank St.}]}]

You won't see any output in the browser, just the output in the console, but the code can easily be modified to do so. Once again, the complete Maven project can be downloaded here. Do you want to try accessing relational and non-relational (aka NoSQL) databases in the same PU?
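For readers who don't want to check out the sample just to see the CRUD path, here is a minimal standalone-style client along the lines of what Test.java exercises. It is a sketch only - it assumes the hypothetical Order entity and "mongo-pu" persistence unit from the sketch earlier in this tip, not the sample's actual classes:

    package example;

    import java.util.List;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import model.Order;

    public class Test {

        public static void main(String[] args) {
            // "mongo-pu" is a hypothetical persistence-unit name
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("mongo-pu");
            EntityManager em = emf.createEntityManager();

            // CREATE: persisting the entity inserts a document into MongoDB
            em.getTransaction().begin();
            Order order = new Order();
            order.setDescription("Pingpong table");
            em.persist(order);
            em.getTransaction().commit();

            // READ: the supported subset of JPQL runs against the document store
            List<Order> orders = em.createQuery("Select o from Order o", Order.class)
                                   .getResultList();
            System.out.println("Found " + orders.size() + " order(s)");

            em.close();
            emf.close();
        }
    }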

    Read the article

  • Creating line graph/chart in vb.net (VS2008)

    - by typoknig
I am reluctant to ask this question because a lot of similar questions have been asked, but after reading through them I am not getting the info I need. I am trying to follow this tutorial and I think it is going to work OK, but I have a lot of data to put in and the tutorial has the reader create the chart data points manually. I want the data points to be generated from an integer which can change while the program is running (so the chart size needs to change with it), and the y coordinate of each data point needs to come from an array. I have attempted to "bind" the data, but I am messing it up somehow, and I don't even think that is the best way to do what I want. I do not have to use the methods suggested in the tutorial; I am looking for the highest-quality, most efficient way to generate a line graph in VB.NET (VS2008) based on the criteria above.

    Read the article

  • Why isn't my assets folder being installed on emulator?

    - by Brad Hein
Where are my assets being installed to? I utilize an assets folder in my new app. I have two files in the folder. When I install my app on the emulator, I cannot access my assets, and furthermore I cannot see them on the emulator filesystem.

I extracted my apk and confirmed the assets folder exists:

    $ ls -ltr assets/
    total 16
    -rw-rw-r--. 1 brad brad 1050 2010-05-20 00:33 schema-DashDB.sql
    -rw-rw-r--. 1 brad brad 9216 2010-05-20 00:33 dash.db

On the emulator, no assets folder:

    # pwd
    /data/data/com.gtosoft.dash
    # ls -l
    drwxr-xr-x system   system            2010-05-20 00:46 lib
    #

I just want to package a pre-built database with my app and then open it to obtain data when needed. I just tried it on my Moto Droid and it was unable to access/open the DB, just like the emulator:

    DBFile=/data/data/com.gtosoft.dash/assets/dash.db

Building the DB on the fly from a schema file is out of the question because it's such a slow process (about 5-10 statements per second is all I get for throughput).
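For context, and not part of the original question: assets are never unpacked onto the device filesystem at install time - they stay inside the APK and are only reachable through AssetManager, which is why no assets directory shows up under /data/data/com.gtosoft.dash. A minimal sketch of the usual workaround - stream the packaged database out to the app's private storage once, then open it from there - using only standard Android APIs:

    import java.io.File;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import android.content.Context;
    import android.database.sqlite.SQLiteDatabase;

    public class PackagedDbHelper {

        // Copies assets/dash.db into the app's private files dir on first
        // use, then opens it read-only. Sketch only; error handling is minimal.
        public static SQLiteDatabase openPackagedDb(Context context) throws IOException {
            File dbFile = context.getFileStreamPath("dash.db");
            if (!dbFile.exists()) {
                InputStream in = context.getAssets().open("dash.db");
                OutputStream out = context.openFileOutput("dash.db", Context.MODE_PRIVATE);
                byte[] buf = new byte[4096];
                int len;
                while ((len = in.read(buf)) > 0) {
                    out.write(buf, 0, len);
                }
                out.close();
                in.close();
            }
            return SQLiteDatabase.openDatabase(dbFile.getAbsolutePath(), null,
                    SQLiteDatabase.OPEN_READONLY);
        }
    }

This also sidesteps rebuilding the schema on the fly, which the poster measured at only 5-10 statements per second.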

    Read the article

  • Solaris 11 Resources

    - by user12618891
Oracle Solaris 11 (November, 2011)
- Oracle Solaris 11 Landing Page
- Download Oracle Solaris 11
- Oracle Solaris 11 Documentation
- Solaris 11 End-of-Life Notices
- What's New in Oracle Solaris 11 (blog)
- Oracle Solaris 11 Feature Demo Videos (blog)

Solaris 11 Developer Resources (November, 2011)
- Oracle Solaris 11 ISV Adoption Guide
- Oracle Solaris 11 Preflight Checker
- The IPS System Repository (blog)
- Packaging and Delivering Software with the Image Packaging System - A Developer's Guide
- How to Create and Publish Packages to an IPS Repository on Solaris 11
- Solaris 10 Branded Zone VM Templates for Solaris 11 (blog)
- Oracle Solaris 11 Security: What's New For Developers
- Optimizing Applications with Oracle Solaris Studio Tools and Compilers

Other Solaris 11 Technology Spotlights (Landing Page)

    Read the article
