Search Results

Search found 56327 results on 2254 pages for 'console application'.


  • Displaying JSON in your Browser

    - by Rick Strahl
    Do you work with AJAX requests a lot and need to quickly check URLs for JSON results? Then you probably know that it's a fairly big hassle to examine JSON results directly in the browser. Yes, you can use FireBug or Fiddler, which work pretty well for actual AJAX requests, but if you just fire off a URL for quick testing in the browser you usually get hit by the Save As dialog and the download manager, followed by having to open the saved document in a text editor. In FireFox, enter JSONView, which allows you to simply display JSON results directly in the browser.

    For example, imagine I have a URL like this: http://localhost/westwindwebtoolkitweb/RestService.ashx?Method=ReturnObject&format=json&Name1=Rick&Name2=John&date=12/30/2010 typed directly into the browser, and that it returns a complex JSON object. With JSONView the result displays right in the browser: no fuss, no muss, it just works. Here the result is an array of Person objects that contain additional address child objects, displayed right in the browser. JSONView basically adds content-type checking for application/json results, and when it finds a JSON result it takes over the rendering and formats the display in the browser. Note that it re-formats the raw JSON for a nicer display view, along with collapsible regions for objects. You can still use View Source to see the raw JSON string returned.

    For me this is a huge time-saver. As I work with AJAX result data using GET and REST-style URLs quite a bit, quickly and easily displaying JSON is a key feature of my development day, and JSONView, for all its simplicity, fits that bill for me. If you're doing AJAX development and you often review URL-based JSON results, do yourself a favor and pick up a copy of JSONView.

    Other Browsers
    JSONView works only with FireFox – what about other browsers?

    Chrome
    Chrome actually displays raw JSON responses as plain text without any plug-ins. There's no plug-in or configuration needed; it just works, although you won't get any fancy formatting. [updated from comments] There's also a port of JSONView available for Chrome from here: https://chrome.google.com/webstore/detail/chklaanhfefbnpoihckbnefhakgolnmc It looks like it works just about the same as the JSONView plug-in for FireFox. Thanks to all who pointed this out…

    Internet Explorer
    Internet Explorer probably has the worst response to JSON-encoded content: it displays an error page as it apparently tries to render JSON as XML. Yeah, that seems real smart – rendering JSON as an XML document. WTF? To get at the actual JSON output, you can use View Source. To get IE to display JSON directly as text you can add a MIME type mapping in the registry:

    Create a new application/json key in: HKEY_CLASSES_ROOT\MIME\Database\ContentType\application/json
    Add a string value of CLSID with a value of {25336920-03F9-11cf-8FD0-00AA00686F13}
    Add a DWORD value of Encoding with a value of 80000

    I can't take credit for this tip – I found it first on Sky Sander's Blog. Note that the CLSID can be used for just about any type of text data you want to display as plain text in IE. It's the in-place display mechanism and it should work for most text content, so it might also be useful for looking at CSS and JS files inside the browser instead of downloading those documents as well.

    © Rick Strahl, West Wind Technologies, 2005-2011
    Posted in ASP.NET, AJAX
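    A quick aside on that registry tip: if you would rather script the three edits than click through regedit, something along these lines should do it in C#. This is a sketch, not from the original post; it assumes the Encoding value given as "80000" is hexadecimal (0x80000), and it must run elevated to write under HKEY_CLASSES_ROOT, so verify against Sky Sander's original tip before relying on it.

    using Microsoft.Win32;

    class RegisterJsonMimeType
    {
        static void Main()
        {
            // Create HKEY_CLASSES_ROOT\MIME\Database\ContentType\application/json
            using (RegistryKey key = Registry.ClassesRoot.CreateSubKey(
                @"MIME\Database\ContentType\application/json"))
            {
                // In-place plain-text viewer CLSID, exactly as given in the post.
                key.SetValue("CLSID", "{25336920-03F9-11cf-8FD0-00AA00686F13}");

                // The post says "80000"; assumed here to be the hex value 0x80000.
                key.SetValue("Encoding", 0x80000, RegistryValueKind.DWord);
            }
        }
    }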


  • How do I fix the Software Center after installing the Linux Mint MATE desktop?

    - by tinuz
    I installed the MATE desktop using this manual (http://www.unixmen.com/linux-tutorials/linux-distributions/linux-distributions4-ubuntu/1975-install-linuxmint12-extensions-mate-a-mgse-in-ubuntu-1110-), but now I can't open my Ubuntu Software Center and can't open the settings from the update manager. I removed the MATE desktop, but it didn't fix the problem. I also reinstalled the Software Center, software-properties-gtk and software-properties-common using:

    sudo apt-get update; sudo apt-get --purge --reinstall install software-center software-properties-common software-properties-gtk

    But when using this line I get the following error:

    Reading package lists... Done
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    0 upgraded, 0 newly installed, 3 reinstalled, 0 to remove and 0 not upgraded.
    Need to get 0 B/735 kB of archives.
    After this operation, 0 B of additional disk space will be used.
    (Reading database ... 304824 files and directories currently installed.)
    Preparing to replace software-center 5.0.2 (using .../software-center_5.0.2_all.deb) ...
    Unpacking replacement software-center ...
    Preparing to replace software-properties-common 0.81.13.1 (using .../software-properties-common_0.81.13.1_all.deb) ...
    Unpacking replacement software-properties-common ...
    Preparing to replace software-properties-gtk 0.81.13.1 (using .../software-properties-gtk_0.81.13.1_all.deb) ...
    Unpacking replacement software-properties-gtk ...
    Processing triggers for desktop-file-utils ...
    Processing triggers for gnome-menus ...
    Processing triggers for bamfdaemon ...
    Rebuilding /usr/share/applications/bamf.index...
    Processing triggers for hicolor-icon-theme ...
    Processing triggers for man-db ...
    Processing triggers for shared-mime-info ...
    Unknown media type in type 'all/all'
    Unknown media type in type 'all/allfiles'
    Unknown media type in type 'uri/mms'
    Unknown media type in type 'uri/mmst'
    Unknown media type in type 'uri/mmsu'
    Unknown media type in type 'uri/pnm'
    Unknown media type in type 'uri/rtspt'
    Unknown media type in type 'uri/rtspu'
    Unknown media type in type 'interface/x-winamp-skin'
    Setting up software-center (5.0.2) ...
    Traceback (most recent call last):
      File "/usr/sbin/update-software-center", line 38, in <module>
        from softwarecenter.db.update import rebuild_database
      File "/usr/share/software-center/softwarecenter/db/update.py", line 59, in <module>
        from softwarecenter.db.database import parse_axi_values_file
      File "/usr/share/software-center/softwarecenter/db/database.py", line 26, in <module>
        from softwarecenter.db.application import Application
      File "/usr/share/software-center/softwarecenter/db/application.py", line 25, in <module>
        from softwarecenter.backend.channel import is_channel_available
      File "/usr/share/software-center/softwarecenter/backend/channel.py", line 25, in <module>
        from softwarecenter.distro import get_distro
      File "/usr/share/software-center/softwarecenter/distro/__init__.py", line 165, in <module>
        distro_instance=_get_distro()
      File "/usr/share/software-center/softwarecenter/distro/__init__.py", line 148, in _get_distro
        module = __import__(distro_id, globals(), locals(), [], -1)
    ImportError: No module named LinuxMint
    Setting up software-properties-common (0.81.13.1) ...
    Setting up software-properties-gtk (0.81.13.1) ...

    Is there a way to fix this problem without having to reinstall Ubuntu 11.10? Thanks in advance, tinuz


  • BizTalk 2009 - BizTalk Benchmark Wizard: Installation

    - by StuartBrierley
    As previously detailed, I have completed a single server installation of BizTalk Server 2009 Standard on my development laptop: a MacBook Pro Core2Duo running at 2.16GHz with 2GB of RAM. Following this I also posted on my use of the BizTalk Server Best Practices Analyser and how to configure the BizTalk SQL Server jobs. All of which means that I should have some confidence that I have a decent working BizTalk Server 2009 environment. Next, I thought that it would be a good idea to get some idea of how this setup performs by carrying out some baseline tests that can then be replicated on the test and live servers. The aim of this would be to allow confident predictions to be made of how any solutions developed on a single "server" installation may be expected to perform when deployed to these multi-server BizTalk Server 2009 Standard installations. The BizTalk Benchmark Wizard would seem to be the perfect tool for the job.

    The BizTalk Benchmark Wizard is a utility that can be used to gain some validation of a BizTalk installation, giving a level of guidance on whether it is performing as might be expected. This utility should be used after BizTalk Server has been installed and before any solutions are deployed to the environment. This will ensure that you are getting consistent and clean results from the BizTalk Benchmark Wizard.

    The BizTalk Benchmark Wizard applies load to the BizTalk Server environment under a choice of specific scenarios. During these scenarios performance counter information is collected and assessed against statistics that are appropriate to the BizTalk Server environment: "The executed scenarios may or may not be relative to any realistic scenario, and is only intended for testing. The BizTalk Benchmark Wizard has been developed in relation to the BizTalk Server 2009 Scale Out Testing Study. More information about the study can be found here: http://msdn.microsoft.com/en-us/library/ee377068(BTS.10).aspx"

    After downloading and installing the wizard you will need to set up the hosts, instances and adapter handlers. This is done by running a script file using cscript, as detailed below. To do this you will need to open a command prompt window and navigate to the script folder; assuming the default installation location this would be C:\Program Files\Blogical\BizTalk Benchmark Wizard\Artefacts\BizTalk. In this folder you should find an InstallHosts.vbs file which can be executed using the following parameters:

    NTGroupName - The name of the Windows NT group.
    UserName - The name of the user account running the service instances.
    Password - The password of the user account running the service instances.
    Receive Host - The name of the server where you want to run the receive host instance.
    Send Host - The name of the server where you want to run the send host instance.
    Processing Host - The name of the server where you want to run the processing host instance.
    By default the script is set up for 64-bit hosts, so if you are running in a 32-bit environment make sure that you change the following line in the script before continuing:

    from: objHS.IsHost32BitOnly = False
    to:   objHS.IsHost32BitOnly = True

    If you have a single box installation, your script command might look like this:

    cscript InstallHosts.vbs "BizTalk Application Users" "\MyUser" "MyPassword" "BtsServer1" "BtsServer1" "BtsServer1"

    If you have a multi server installation, your script command might look like this:

    cscript InstallHosts.vbs "MyDomain\BizTalk Application Users" "MyDomain\MyUser" "MyPassword" "BtsServer1" "BtsServer2" "BtsServer2"

    Running this script will create:

    Three hosts (BBW_RxHost, BBW_TxHost and BBW_PxHost)
    Three host instances
    One send and one receive adapter handler for the WCF NetTcp adapter

    You will then need to import the BizTalk MSI via the BizTalk Administration Console. Open the BizTalk Administration Console, point to the "Applications" node and import the BizTalk Benchmark Wizard.msi found in the same folder as the script above. This will create a "BizTalk Benchmark Wizard" application along with all ports and orchestrations needed. To finish the installation you will need to run the BizTalk Benchmark Wizard.msi on all BizTalk servers to add the assemblies to the Global Assembly Cache (GAC).

    Next I will look at running the BizTalk Benchmark Wizard.


  • Tool to convert blogger.com content to dasBlog

    - by Daniel Moth
    Due to blogger.com dropping FTP support, I've had to move my blog. If you are in a similar situation, this post will help you by showing you the necessary steps to take.

    Goals
    No loss of blog posts or comments, AND all existing permalinks continue to work (redirect to the correct place).

    Steps
    1. Download the XML files corresponding to your blogger.com content and store them in a folder.
    2. Install and configure dasBlog on your local machine.
    3. Configure your web.config file (will need updating once you run step 4).
    4. Use the tool I describe further down to generate the content and place it at the right place.
    5. Test your site locally. Once you are happy, repeat step 2 on your hosting provider of choice. Remember to copy up your dasBlog theme folder if you created one.
    6. Copy up the local web.config file and the XML dasBlog content files generated by the tool of step 4.
    7. Test your site on the server. Once you are happy, go live (following instructions from your hoster). In my case, I gave the nameservers from my new hoster to my existing domain registrar and they made the switch.

    Tool (code)
    At step 4 above I referred to a tool. That is an overstatement: it is simply one 450-line C# code file that you can download here: BloggerToDasBlog.cs. I used this from a .NET 2.0 console app (and I ran it under the Visual Studio debugger, i.e. F5) like this: Program.cs. The console app referenced the dasBlog 2.3 ASP.NET Blogging Engine, i.e. the newtelligence.DasBlog.Runtime.dll assembly. Let me describe what the code does:

    Input:
    A path to a folder where the XML files from the old blogger.com blog reside. It can deal with both types of XML file.
    A full file path to a file where it creates XML redirect input (as required by the rewriteMap mentioned here).
    The blog URL.
    The author's email.
    The blog author name.
    A path to an empty folder where the new XML dasBlog content files will get created.
    The subfolder name used after the domain name in the URL.
    The three regex patterns to use. You can use the same as mine, but you will need to tweak the monthly_archive rule.
    Again, to see what values I passed for all the above, see my Program.cs file.

    Output:
    It creates dasBlog XML files in the folder specified, by parsing the old blogger.com XML files that reside in the folder specified. After that is generated, copy it to the "Content" folder under your dasBlog installation.
    It creates an XML file with a single ignorable root element and a bunch of inner XML elements. You can copy-paste these in the web.config file as discussed in this post.

    Other notes:
    For each blog post, it detects outgoing links to itself (i.e. to the same blog), and rewrites those to point to the new URLs, so internal links do not rely on the web.config redirects. (A simplified sketch of this step appears below.)
    It deals with duplicate post titles; it does not deal with triplicates and higher.
    It removes all references to blogger.com (e.g. references to [email protected], the injected hidden footer for statistics that each blog post has, and others - see the code).
    It creates a lot of diagnostic output (in the Output window), and indeed the documentation for the code is in the Debug.WriteLine statements ;)

    This is not code I will maintain or support - it was a throwaway one-use project that I am sharing here as a starting point for anyone finding themselves in the same boat that I was. Enjoy "as is".

    Comments about this post welcome at the original blog.
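    To give a flavor of the internal-link rewriting step mentioned above, here is a simplified sketch. This is not the actual BloggerToDasBlog.cs code; the method name and the permalink map are hypothetical, and the real tool must first build that map while converting posts, making the rewrite a second pass over the generated content.

    using System.Collections.Generic;
    using System.Text.RegularExpressions;

    static class LinkRewriter
    {
        // Rewrites links that point back at the old blog so they use the new
        // dasBlog permalinks. permalinkMap is a hypothetical old-URL -> new-URL map.
        public static string RewriteInternalLinks(string postHtml,
                                                  IDictionary<string, string> permalinkMap)
        {
            return Regex.Replace(postHtml, "href=\"(?<url>[^\"]+)\"", match =>
            {
                string oldUrl = match.Groups["url"].Value;
                string newUrl;
                // Only touch links we know about; leave external links alone.
                return permalinkMap.TryGetValue(oldUrl, out newUrl)
                    ? "href=\"" + newUrl + "\""
                    : match.Value;
            });
        }
    }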


  • Java Resources for Windows Azure

    - by BuckWoody
    Windows Azure is a Platform as a Service – a PaaS – that runs code you write. That code doesn't just mean the languages on the .NET platform – you can run code from multiple languages, including Java. In fact, you can develop for Windows and SQL Azure using not only Visual Studio but the Eclipse Integrated Development Environment (IDE) as well. Although not an exhaustive list, here are several links that deal with Java and Windows Azure:

    Windows Azure Java Development Center: http://www.windowsazure.com/en-us/develop/java/
    Java Development Guidance: http://msdn.microsoft.com/en-us/library/hh690943(VS.103).aspx
    Running a Java Environment on Windows Azure: http://blogs.technet.com/b/port25/archive/2010/10/28/running-a-java-environment-on-windows-azure.aspx
    Run Java with Jetty in Windows Azure: http://blogs.msdn.com/b/dachou/archive/2010/03/21/run-java-with-jetty-in-windows-azure.aspx
    Using the plugin for Eclipse: http://blogs.msdn.com/b/craig/archive/2011/03/22/new-plugin-for-eclipse-to-get-java-developers-off-the-ground-with-windows-azure.aspx
    Run Java with GlassFish in Windows Azure: http://blogs.msdn.com/b/dachou/archive/2011/01/17/run-java-with-glassfish-in-windows-azure.aspx
    Improving experience for Java developers with Windows Azure: http://blogs.msdn.com/b/interoperability/archive/2011/02/23/improving-experience-for-java-developers-with-windows-azure.aspx
    Java Access to SQL Azure via the JDBC Driver for SQL Server: http://blogs.msdn.com/b/brian_swan/archive/2011/03/29/java-access-to-sql-azure-via-the-jdbc-driver-for-sql-server.aspx
    How to Get Started with Java, Tomcat on Windows Azure: http://blogs.msdn.com/b/usisvde/archive/2011/03/04/how-to-get-started-with-java-tomcat-on-windows-azure.aspx
    Deploying Java Applications in Azure: http://blogs.msdn.com/b/mariok/archive/2011/01/05/deploying-java-applications-in-azure.aspx
    Using the Windows Azure Storage Explorer in Eclipse: http://blogs.msdn.com/b/brian_swan/archive/2011/01/11/using-the-windows-azure-storage-explorer-in-eclipse.aspx
    Windows Azure Tomcat Solution Accelerator: http://archive.msdn.microsoft.com/winazuretomcat
    Deploying a Java application to Windows Azure with Command-line Ant: http://java.interoperabilitybridges.com/articles/deploying-a-java-application-to-windows-azure-with-command-line-ant
    Video: Open in the Cloud: Windows Azure and Java: http://channel9.msdn.com/Events/PDC/PDC10/CS10
    AzureRunMe: http://azurerunme.codeplex.com/
    Windows Azure SDK for Java: http://www.interoperabilitybridges.com/projects/windows-azure-sdk-for-java
    AppFabric SDK for Java: http://www.interoperabilitybridges.com/projects/azure-java-sdk-for-net-services
    Information Cards for Java: http://www.interoperabilitybridges.com/projects/information-card-for-java
    Apache Stonehenge: http://www.interoperabilitybridges.com/projects/apache-stonehenge
    Channel 9 Case Study on Java and Windows Azure: http://www.microsoft.com/casestudies/Windows-Azure/Gigaspaces/Solution-Provider-Streamlines-Java-Application-Deployment-in-the-Cloud/400000000081


  • Uncovering Compiler Errors in ASP.NET MVC Views

    - by Ben Griswold
    ASPX and ASCX files are compiled on the fly when they are requested on the web server. This means it's possible that you aren't catching compile errors associated with your views when you build your ASP.NET MVC project in Visual Studio. Unless you're willing to click through your entire application, rendering each view looking for errors, your application is left a little vulnerable to user issues. Fortunately, there's a workaround. Open up your MVC project file in Notepad or within the Visual Studio IDE by unloading the project and then editing the .csproj file (both actions are available by right-clicking on the project node in Solution Explorer). Notice the MvcBuildViews option. It's probably set to false. Flip the value to true and you'll magically start compiling your views when you build your application.

    <MvcBuildViews>false</MvcBuildViews>

    Taking this action will slow down your builds a bit, but if you're a hack like me, it'll probably save your day in the long run. Now you're probably thinking, "Neat trick – how's it work?" Scroll down toward the bottom of your .csproj file and you will notice the AfterBuild target triggers the AspNetCompiler action if the MvcBuildViews option is set to true.

    <Target Name="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
      <AspNetCompiler VirtualPath="temp"
                      PhysicalPath="$(ProjectDir)\..\$(ProjectName)" />
    </Target>

    Great. One more thing. Let's say you don't want to slow down all of your builds, but you absolutely want to know if there are any compiler issues with your views before you commit your code to version control or deploy or whatever. Here's what you can do: change the AfterBuild condition to run if your configuration is set to Release mode.

    <Target Name="AfterBuild" Condition="'$(Configuration)'=='Release'">
      <!-- Always pre-compile ASPX and ASCX in release mode -->
      <AspNetCompiler VirtualPath="temp"
                      PhysicalPath="$(ProjectDir)\..\$(ProjectName)" />
    </Target>

    Now your debug mode builds will continue to be as fast as ever and you can quickly validate your views by building in release mode when you so choose. There's one little catch: this setup won't consider the MvcBuildViews option whatsoever! So if you decide to go with this configuration, you might want to add a comment near the MvcBuildViews option letting other developers know they can change the MvcBuildViews option as much as they'd like but it's not going to affect the AfterBuild action. Or don't include the comment and let your team members figure it out for themselves…
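    One possible middle ground, offered here as a suggestion rather than something from the original post: MSBuild conditions support boolean operators, so the two checks could be combined into a single condition such as Condition="'$(MvcBuildViews)'=='true' Or '$(Configuration)'=='Release'". That way the opt-in flag still works for debug builds, while release builds always pre-compile the views.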


  • VMware Player - compiling server modules - Ubuntu 13.10

    - by user211976
    While running Ubuntu 13.04, whenever the Linux kernel had been updated this used to make VMware Player happy:

    sudo apt-get install linux-headers-$(uname -r)
    sudo vmware-modconfig --console --install-all

    Yesterday I upgraded to Ubuntu 13.10 and, lo and behold, the above workaround does not work anymore: "Unable to install all modules. See log for details." I assume by "See log" it means the files in /tmp/vmware-root/*log

    root@hugin:/tmp/vmware-root# ls -ltr /tmp/vmware-root/
    totalt 16
    -rw-r--r-- 1 root root 3815 nov  6 13:54 vmware-apploader-17267.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-17693.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-17742.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-18701.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-18750.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-19100.log
    -rw-r--r-- 1 root root    0 nov  6 13:54 vmware-vmis-19149.log
    -rw-r--r-- 1 root root 9250 nov  6 13:54 vmware-modconfig-17267.log

    root@hugin:/tmp/vmware-root# tail /tmp/vmware-root/vmware-modconfig-17267.log
    2013-11-06T13:54:28.950+01:00| modconfig| I120: Copied Module.symvers from "/tmp/modconfig-wpDrtf/vmci-only/Module.symvers" to "/tmp/modconfig-wpDrtf/vsock-only/Module.symvers".
    2013-11-06T13:54:28.950+01:00| modconfig| I120: Building module with command "/usr/bin/make -j8 -C /tmp/modconfig-wpDrtf/vsock-only auto-build HEADER_DIR=/lib/modules/3.11.0-12-generic/build/include CC=/usr/bin/gcc IS_GCC_3=no"
    2013-11-06T13:54:31.048+01:00| modconfig| I120: Successfully built vsock. Module is currently at "/tmp/modconfig-wpDrtf/vsock.o".
    2013-11-06T13:54:31.048+01:00| modconfig| I120: Found the vsock symvers file at "/tmp/modconfig-wpDrtf/vsock-only/Module.symvers".
    2013-11-06T13:54:31.048+01:00| modconfig| I120: Installing vsock from /tmp/modconfig-wpDrtf/vsock.o to /lib/modules/3.11.0-12-generic/misc/vsock.ko.
    2013-11-06T13:54:31.048+01:00| modconfig| I120: Registering file "/lib/modules/3.11.0-12-generic/misc/vsock.ko".
    2013-11-06T13:54:31.400+01:00| modconfig| I120: "/usr/lib/vmware-installer/2.1.0/vmware-installer" exited with status 0.
    2013-11-06T13:54:31.400+01:00| modconfig| I120: Registering file "/usr/lib/vmware/symvers/vsock-3.11.0-12-generic".
    2013-11-06T13:54:31.764+01:00| modconfig| I120: "/usr/lib/vmware-installer/2.1.0vmware-installer" exited with status 0.
    2013-11-06T13:54:31.786+01:00| modconfig| I120: We are now shutdown. Ready to die!

    root@hugin:/tmp/vmware-root# tail /tmp/vmware-root/vmware-apploader-17267.log
    2013-11-06T13:54:20.911+01:00| appLoader| I120: libglib-2.0.so.0 <SYSTEM>
    2013-11-06T13:54:20.911+01:00| appLoader| I120: libz.so.1 <SYSTEM>
    2013-11-06T13:54:20.911+01:00| appLoader| I120: libvmware-modconfig-console.so <SHIPPED>
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Shipped glib version is 2.24
    2013-11-06T13:54:20.912+01:00| appLoader| I120: System glib version is 2.38
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Using system version of glib.
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Loading system version of libgcc_s.so.1.
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Loading system version of libglib-2.0.so.0.
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Loading system version of libz.so.1.
    2013-11-06T13:54:20.912+01:00| appLoader| I120: Loading shipped version of libxml2.so.2.


  • BI and EPM Landscape

    - by frank.buytendijk
    Most of my blog entries are not about Oracle products, and most of the latest entries are about topics such as IT strategy and enterprise architecture. However, given my background at Gartner, and at Hyperion, I still keep a close eye on what's happening in BI and EPM. One important reason is that I believe there is significant competitive value for organizations getting BI and EPM right. Davenport and Harris wrote a great book called "Competing on Analytics", in which they explain this in a very engaging and convincing way. At Oracle we have defined the concept of "management excellence" that outlines what organizations have to do to keep or create a competitive edge. It's not only in the business processes, but also in the management processes.

    Recently, Gartner published its 2009 market share report for BI, Analytics, and Performance Management. Gartner identifies the same three segments that Oracle does: (1) CPM Suites (Oracle refers not to Corporate Performance Management, but Enterprise Performance Management), (2) BI Platform, and (3) Analytic Applications & Performance Management. According to Gartner, Oracle's share is increasing, with revenue growing by more than 5%. Oracle currently holds the #2 market share position in the overall BI software space based on total BI software revenue.

    Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010

    Gartner has ranked Oracle as #1 in the CPM Suites worldwide sub-segment based on total BI software revenue, and Oracle is gaining share, with revenue growing by more than 6% in 2009.

    Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010

    The Analytic Applications & Performance Management sub-segment is more fragmented. It has, for instance, a very large "Other Vendors" category. The largest player traditionally is SAS. Analytic Applications are often meant for very specific analytic needs in very specific industry sectors. According to Gartner, among the large vendors Oracle is again the one gaining the most share, with total BI software revenue growth close to 15% in 2009.

    Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010

    I believe this shows Oracle's integration strategy is working. In fact, integration actually is the innovation. BI and EPM have been silo technology platforms and application suites way too long. Management and measuring performance should be very closely linked to strategy execution, which is the domain of other business application areas such as CRM, ERP, and Supply Chain. BI and EPM are not about "making better decisions" anymore, but are part of a tangible action framework. Furthermore, organizations are getting more serious about ecosystem thinking. They do not evaluate single tools anymore for different application areas, but buy into a complete ecosystem of hardware, software and services. The best ecosystem is the one that offers the most options, in environments where the uncertainty is high and investments are hard to reverse. The key to successfully managing such an environment is middleware, and BI and EPM become increasingly middleware intensive.
In fact, given the horizontal nature of BI and EPM, sitting on top of all business functions and applications, you could call them "upperware". Many are active in the BI and EPM space. Big players can offer a lot, but there are always many areas that are covered by specialty vendors. Oracle openly embraces those technologies within the ecosystem as well. Complete, open and integrated still accurately describes the Oracle product strategy. frank


  • ODI 11g – Insight to the SDK

    - by David Allan
    This post is a useful index into the ODI SDK that cross-references the type names from the user interface with the SDK class, and also the finder for how to get a handle on the object or objects. The volume of content in the SDK might seem a little ominous - there is a lot there - but there is a general pattern to the SDK that I will describe here. Also I will illustrate some basic CRUD operations so you can see how the SDK usage pattern works. The examples are written in Groovy; you can simply run them from the Groovy console in ODI 11.1.1.6.

    Entry to the Platform (object - finder - SDK class):
    odiInstance - odiInstance (groovy variable for console) - OdiInstance

    Topology Objects:
    Technology - IOdiTechnologyFinder - OdiTechnology
    Context - IOdiContextFinder - OdiContext
    Logical Schema - IOdiLogicalSchemaFinder - OdiLogicalSchema
    Data Server - IOdiDataServerFinder - OdiDataServer
    Physical Schema - IOdiPhysicalSchemaFinder - OdiPhysicalSchema
    Logical Schema to Physical Mapping - IOdiContextualSchemaMappingFinder - OdiContextualSchemaMapping
    Logical Agent - IOdiLogicalAgentFinder - OdiLogicalAgent
    Physical Agent - IOdiPhysicalAgentFinder - OdiPhysicalAgent
    Logical Agent to Physical Mapping - IOdiContextualAgentMappingFinder - OdiContextualAgentMapping
    Master Repository - IOdiMasterRepositoryInfoFinder - OdiMasterRepositoryInfo
    Work Repository - IOdiWorkRepositoryInfoFinder - OdiWorkRepositoryInfo

    Project Objects:
    Project - IOdiProjectFinder - OdiProject
    Folder - IOdiFolderFinder - OdiFolder
    Interface - IOdiInterfaceFinder - OdiInterface
    Package - IOdiPackageFinder - OdiPackage
    Procedure - IOdiUserProcedureFinder - OdiUserProcedure
    User Function - IOdiUserFunctionFinder - OdiUserFunction
    Variable - IOdiVariableFinder - OdiVariable
    Sequence - IOdiSequenceFinder - OdiSequence
    KM - IOdiKMFinder - OdiKM

    Load Plans and Scenarios:
    Load Plan - IOdiLoadPlanFinder - OdiLoadPlan
    Load Plan and Scenario Folder - IOdiScenarioFolderFinder - OdiScenarioFolder

    Model Objects:
    Model - IOdiModelFinder - OdiModel
    Sub Model - IOdiSubModel - OdiSubModel
    DataStore - IOdiDataStoreFinder - OdiDataStore
    Column - IOdiColumnFinder - OdiColumn
    Key - IOdiKeyFinder - OdiKey
    Condition - IOdiConditionFinder - OdiCondition

    Operator Objects:
    Session Folder - IOdiSessionFolderFinder - OdiSessionFolder
    Session - IOdiSessionFinder - OdiSession
    Schedule - (no finder listed) - OdiSchedule

    How to Create an Object?
    Here is a simple example to create a project; it uses IOdiEntityManager.persist to persist the object.

    import oracle.odi.domain.project.OdiProject;
    import oracle.odi.core.persistence.transaction.support.DefaultTransactionDefinition;

    txnDef = new DefaultTransactionDefinition();
    tm = odiInstance.getTransactionManager()
    txnStatus = tm.getTransaction(txnDef)
    project = new OdiProject("Project For Demo", "PROJECT_DEMO")
    odiInstance.getTransactionalEntityManager().persist(project)
    tm.commit(txnStatus)

    How to Update an Object?
    This update example uses the methods on the OdiProject object to change the name of the project that was created above; it is then persisted.
    import oracle.odi.domain.project.OdiProject;
    import oracle.odi.domain.project.finder.IOdiProjectFinder;
    import oracle.odi.core.persistence.transaction.support.DefaultTransactionDefinition;

    txnDef = new DefaultTransactionDefinition();
    tm = odiInstance.getTransactionManager()
    txnStatus = tm.getTransaction(txnDef)
    prjFinder = (IOdiProjectFinder)odiInstance.getTransactionalEntityManager().getFinder(OdiProject.class);
    project = prjFinder.findByCode("PROJECT_DEMO");
    project.setName("A Demo Project");
    odiInstance.getTransactionalEntityManager().persist(project)
    tm.commit(txnStatus)

    How to Delete an Object?
    Here is a simple example to delete all of the sessions; it uses IOdiEntityManager.remove to delete each object.

    import oracle.odi.domain.runtime.session.finder.IOdiSessionFinder;
    import oracle.odi.domain.runtime.session.OdiSession;
    import oracle.odi.core.persistence.transaction.support.DefaultTransactionDefinition;

    txnDef = new DefaultTransactionDefinition();
    tm = odiInstance.getTransactionManager()
    txnStatus = tm.getTransaction(txnDef)
    sessFinder = (IOdiSessionFinder)odiInstance.getTransactionalEntityManager().getFinder(OdiSession.class);
    sessc = sessFinder.findAll();
    sessItr = sessc.iterator()
    while (sessItr.hasNext()) {
      sess = (OdiSession) sessItr.next()
      odiInstance.getTransactionalEntityManager().remove(sess)
    }
    tm.commit(txnStatus)

    This isn't an all-encompassing summary of the SDK, but it covers a lot of the content to give you a good handle on the objects and how they work. For details of how specific complex objects are created via the SDK, it's best to look at postings such as the interface builder posting here. Have fun, happy coding!


  • Spending the summer at camp… Web Camp, that is

    - by Jon Galloway
    Microsoft is sponsoring a series of Web Camps this summer. They're a series of free two-day events being held worldwide, and I'm really excited about taking part. The camp is targeted at a broad range of developer backgrounds and experience. Content builds from 101-level introductory material to 200-300 level coverage, but we hit some advanced bits (e.g. MVC 2 features, jQuery templating, IIS 7 features, etc.) that advanced developers may not yet have seen. We start with a lap around ASP.NET & Web Forms, then move on to building an application with ASP.NET MVC 2, jQuery, and Entity Framework 4, and finally deploy to IIS. I got to spend some time working with James before the first Web Camp refining the content, and I think he's packed about as much goodness into the time available as is scientifically possible. The content is really code focused – we start with File/New Project and spend the day building a real, working application.

    The second day of the Web Camp provides attendees an opportunity to get hands-on. There are two options: join a team and build an application of your choice, or work on a lab or tutorial.

    James Senior and I kicked off the fun with the first Web Camp in Toronto a few weeks ago. It was sold out, lots of fun, and by all accounts a great way to spend two days. I'm really enthusiastic about the format. Rather than just listening to speakers and then forgetting everything in a few days, attendees actually build something of their choice. They get an opportunity to pitch projects they're interested in, form teams, and build it – getting experience with "real world" problems, with all the help they need from experienced developers. James got help on the second day's practical part from the good folks that run Startup Weekend. Startup Weekend is a fantastic program that gathers developers together to build cool apps in a weekend, so their input on how to organize successful teams for weekend projects was invaluable. Nick Seguin joined us in Toronto, and in addition to making sure that everything flowed smoothly, he just added a lot of fun and excitement to the event, reminding us all about how much fun it is to come up with a cool idea and just build it.

    In addition to the Toronto camp, I'll be at the Mountain View, London, Munich, and New York camps over the next month. London is sold out, but the rest still have space available, so come join us! Here's the full list, with the ones I'll be at bolded because - you know - it's my blog. The whole speaker list is great, including Scott Guthrie, Scott Hanselman, James Senior, Rachel Appel, Dan Wahlin, and Christian Wenz.

    Toronto May 7-8 (James Senior and I were thrown out on our collective ears)
    Moscow May 19
    Beijing May 21-22
    Shanghai May 24-25
    Mountain View May 27-28 (I'm speaking with Rachel Appel)
    Sydney May 28-29
    Singapore June 04-05
    London June 04-05 (I'm speaking with Christian Wenz – SOLD OUT)
    Munich June 07-08 (I'm speaking with Christian Wenz)
    Chicago June 11-12
    Redmond, WA June 18-19
    New York June 25-26 (I'm speaking with Dan Wahlin)

    Come say hi!


  • SQL SERVER – List of All the Sample Databases Available to Download for FREE

    - by Pinal Dave
    It is pretty common to have a sample database for any database product. Different companies keep on improving their products and keep on coming up with innovations, and to demonstrate the capability of their new enhancements they need sample databases. Microsoft has various sample databases available for free download for its SQL Server product. I have collected them here in a single blog post.

    Download an AdventureWorks Database
    The AdventureWorks OLTP database supports standard online transaction processing scenarios for a fictitious bicycle manufacturer (Adventure Works Cycles). Scenarios include Manufacturing, Sales, Purchasing, Product Management, Contact Management, and Human Resources.

    Coconut Dal
    Coconut Dal is a lightweight data access layer, for use in projects where the Entity Framework cannot be used or Microsoft's Enterprise Library Data Block is unsuitable. Anyone who is handwriting ADO.NET should use a library instead, and Coconut Dal might be the answer.

    DataBooster – Extension to ADO.NET Data Provider
    The dbParallel DataBooster library is a high-performance extension to ADO.NET Data Provider, and includes two aspects: 1) a slimmed-down API encapsulation which simplifies the most common data access operations (DbConnection -> DbCommand -> DbParameter -> DbDataReader) into a single class, DbAccess, to help applications with a clean DAL and avoid over-packing and redundant copying of transferred data; 2) a booster for writing mass data onto a database, based on a rational utilization of database concurrency and an effective utilization of network bandwidth.

    Tabular AMO 2012
    The sample is made of two project parts. The first part is a library of functions to manage tabular models - AMO2Tabular V2. The second part is a sample to build a tabular model - AdventureWorks Tabular AMO 2012 - using the AMO2Tabular library; the created model is similar to the AdventureWorks Tabular Model 2012.

    SQL Server Analysis Services Product Samples
    SQL Server Analysis Services provides a unified and integrated view of all your business data as the foundation for all of your traditional reporting, online analytical processing (OLAP) analysis, Key Performance Indicator (KPI) scorecards, and data mining.

    Analysis Services Samples for SQL Server 2008 R2
    This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2. For many of these samples you will also need to download the AdventureWorks family of databases.

    SQL Server Reporting Services Product Samples
    This project contains Reporting Services samples released with the Microsoft SQL Server product. These samples are in the following five categories: Application Samples, Extension Samples, Model Samples, Report Samples, and Script Samples. If you are interested in contributing Reporting Services samples, please let us know by posting in the developers' forum.

    Reporting Services Samples for SQL Server 2008 R2
    This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2 PCU1. For many of these samples you will also need to download the AdventureWorks family of databases.

    SQL Server Integration Services Product Samples
    This project contains Integration Services samples released with the Microsoft SQL Server product. These samples are in the following two categories: Package Samples and Programming Samples. If you are interested in contributing Integration Services samples, please let us know by posting in the developers' forum.
    Integration Services Samples for SQL Server 2008 R2
    This release is dedicated to the samples that ship for Microsoft SQL Server 2008 R2. For many of these samples you will also need to download the AdventureWorks family of databases.

    Windows Azure SQL Reporting Admin Sample
    The SQLReportingAdmin sample for Windows Azure SQL Reporting demonstrates the usage of SQL Reporting APIs, and manages (add/update/delete) permissions of SQL Reporting users.

    Windows Azure SQL Reporting ReportViewer-SOAP API usage sample
    These sample projects demonstrate how to embed a Microsoft ReportViewer control that points to reports hosted on SQL Reporting report servers and how to use SQL Reporting SOAP APIs in your Windows Azure Web application.

    Enterprise Library 5.0 – Integration Pack for Windows Azure
    This NuGet package contains a zip file with the source code for the Enterprise Library Integration Pack for Windows Azure.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
    Tagged: SQL Sample Database


  • Oracle Forms Migration to ADF - Webinar from ORACLE Partner PITSS

    - by Thomas Leopold
    Tuesday, February 22, 2011, 5:00 PM - 6:00 PM CET
    Free Webinar: Re-Engineering Legacy Oracle Forms - Migration from Forms to ADF, a Case Study

    Join Oracle's Grant Ronald and PITSS to see a software architecture comparison of Oracle Forms and ADF and a live step-by-step presentation on how to achieve a successful migration. Learn about various migration options, challenges and best practices to protect your current investment in Oracle Forms.

    "PL/SQL is without match for what it does: manipulating data in the database. If you blindly migrate all your PL/SQL to Java you will, in all probability, end up with less maintainable and less efficient code. Instead you should consider which code is best left as PL/SQL..." - Grant Ronald, "Migrating Oracle Forms to Fusion: Myth or Magic Bullet?"

    Re-engineering existing business logic is mandatory for your legacy Forms application to take advantage of new software architectures like ADF. The PITSS.CON solution combines a deep understanding of Oracle Forms and Reports application architecture with powerful re-engineering capabilities that allow the developer community to protect the investment in existing Forms applications and to concentrate on fine-tuning and customization of the modernized functionality rather than manually recreating every module and business logic from the bottom up.

    Registration: https://www2.gotomeeting.com/register/971702250

    PITSS GmbH, Königsdorferstrasse 25, D-82515 Wolfratshausen

    Do not forget to check out these free webinars in March!
    Thursday, March 3, 2011: Upgrade and Modernize Your Application to Forms 11g (Registration/Information)
    Tuesday, March 15, 2011: Shaping the Future for your Oracle Forms Application: Forms 11g, ADF, APEX (Registration/Information)
    Tuesday, March 29, 2011: Oracle Forms Modernization to APEX (Registration/Information)
    Registration is limited, so sign up today!

    Presented By: Grant Ronald, Senior Group Product Manager, Oracle; Magdalena Serban, Product Manager, PITSS

    Contact Us:
    PITSS in Americas: +1 248.740.0935, [email protected], www.pitssamerica.com
    PITSS in Europe: +49 (0) 717287 5200, [email protected], www.pitss.com

    White Paper: From Oracle Forms to Oracle ADF and JEE

    © Copyright 2010 PITSS GmbH, Wolfratshausen, Stuttgart, München; Managing Directors: Dipl.-Ing. Andreas Gaede, Michael Kilimann, Dipl.-Ing. Dirk Fleischmann. Commercial Register: HRB 125471 at District Court Munich. All rights reserved. Any duplication or further treatment in any medium, in parts or as a whole, requires a written agreement.


  • Oracle Fusion Procurement Designed for User Productivity

    - by Applications User Experience
    Sean Rice, Manager, Applications User Experience

    Oracle Fusion Procurement Design Goals
    In Oracle Fusion Procurement, we set out to create a streamlined user experience based on the way users do their jobs. Oracle has spent hundreds of hours with customers to get to the heart of what users need to do their jobs. By designing a procurement application around user needs, Oracle has crafted a user experience that puts the tools that people need at their fingertips. In Oracle Fusion Procurement, the user experience is designed to provide the user with information that will drive navigation rather than requiring the user to find information.

    One of our design goals for Oracle Fusion Procurement was to reduce the number of screens and clicks that a user must go through to complete frequently performed tasks. The requisition process in Oracle Fusion Procurement (Figure 1) illustrates how we have streamlined workflows. Oracle Fusion Self-Service Procurement brings together billing metrics, descriptions of the order, justification for the order, a breakdown of the components of the order, and the amount—all in one place. Previous generations of procurement software required the user to navigate to several different pages to gather all of this information. With Oracle Fusion, everything is presented on one page. The result is that users can complete their tasks in less time. The focus is on completing the work, not finding the work.

    Figure 1. Creating a requisition in Oracle Fusion Self-Service Procurement is a consumer-like shopping experience.

    Will Oracle Fusion Procurement Increase Productivity?
    To answer this question, Oracle sought to model how two experts working head to head—one in an existing enterprise application and another in Oracle Fusion Procurement—would perform the same task. We compared Oracle Fusion designs to corresponding existing applications using the keystroke-level modeling (KLM) method. This method is based on years of research at universities such as Carnegie Mellon and research labs like Xerox Palo Alto Research Center. The KLM method breaks tasks into a sequence of operations and uses standardized models to evaluate all of the physical and cognitive actions that a person must take to complete a task: what a user would have to click, how long each click would take (not only the physical action of the click or typing of a letter, but also how long someone would have to think about the page when taking the action), and user interface changes that result from the click. By applying standard time estimates for all of the operators in the task, an estimate of the overall task time is calculated. Task times from the model enable researchers to predict end-user productivity.

    For the study, we focused on modeling procurement business process task flows that were considered business or mission critical: high-frequency tasks and high-value tasks. The designs evaluated encompassed tasks that are currently performed by employees, professional buyers, suppliers, and sourcing professionals in advanced procurement applications. For each of these flows, we created detailed task scenarios that provided the context for each task, conducted task walk-throughs in both the Oracle Fusion design and the existing application, analyzed and documented the steps and actions required to complete each task, and applied standard time estimates to the operators in each task to estimate overall task completion times.
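    As a toy illustration of how KLM arithmetic works (an editor's sketch, not Oracle's actual model), the commonly published operator estimates - roughly K ≈ 0.28 s per keystroke for an average typist, P ≈ 1.1 s per pointing action, H ≈ 0.4 s per hand switch between keyboard and mouse, and M ≈ 1.35 s per mental preparation - can simply be summed over a task's operator sequence:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class KlmSketch
    {
        // Rough operator estimates (seconds) from the KLM literature; treat these
        // as illustrative assumptions, not the values used in Oracle's study.
        static readonly Dictionary<char, double> OperatorSeconds = new Dictionary<char, double>
        {
            { 'K', 0.28 }, // keystroke or button press (average typist)
            { 'P', 1.10 }, // point with the mouse
            { 'H', 0.40 }, // move hands between keyboard and mouse
            { 'M', 1.35 }  // mental preparation
        };

        static double EstimateSeconds(string operatorSequence) =>
            operatorSequence.Sum(op => OperatorSeconds[op]);

        static void Main()
        {
            // Hypothetical sequence: think, point, click, switch hands, type 5 chars.
            string task = "MPKH" + new string('K', 5);
            Console.WriteLine($"Estimated task time: {EstimateSeconds(task):F2} s");
        }
    }

    Comparing such totals for the same task in two different designs yields the kind of percentage productivity deltas reported below.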
    The Results
    The KLM method predicted that the Oracle Fusion Procurement designs would result in productivity gains in each task, ranging from 13 percent to 38 percent, with an overall productivity gain of 22.5 percent. These performance gains can be attributed to a reduction in the number of clicks and screens needed to complete the tasks. For example, creating a requisition in Oracle Fusion Procurement takes a user through only two screens, while ordering the same item in a previous version requires six screens to complete the task. Modeling user productivity has resulted not only in advances in Oracle Fusion applications, but also in advances in other areas. We leveraged lessons learned from the KLM studies to establish products like Oracle E-Business Suite (EBS). New user experience features in EBS 12.1.3, such as navigational improvements to the main menu, a Google-type search using auto-suggest, embedded analytics, and an in-context list of values tool, help to reduce clicks and improve efficiency. For more information about KLM, refer to the Measuring User Productivity blog.


  • The new Auto Scaling Service in Windows Azure

    - by shiju
    One of the key features of the Cloud is on-demand scalability, which lets cloud application developers scale up or scale down the number of compute resources hosted in the Cloud. Auto Scaling provides the capability to dynamically scale up and scale down your compute resources based on user-defined policies, Key Performance Indicators (KPIs), health status checks, and schedules, without any manual intervention. Auto Scaling is an important feature to consider when designing and architecting cloud-based solutions: it can unleash the real power of the Cloud for apps that need truly on-demand scalability, and it can also guard the organizational budget for cloud-based application deployment. In the past, you had to leverage the Microsoft Enterprise Library Autoscaling Application Block (WASABi) or a service like MetricsHub to implement automatic scaling for your cloud apps hosted on Windows Azure. WASABi required you to host the auto scaling block in a Windows Azure Worker Role to effectively apply the auto scaling behaviour to your Windows Azure apps. The newly announced Auto Scaling service in Windows Azure lets you add automatic scaling capability to your Windows Azure compute services such as Cloud Services, Web Sites and Virtual Machines. Unlike WASABi hosted in a Worker Role, you don't need to host any monitoring service to use the new Auto Scaling service, and the Auto Scaling service is available to individual Windows Azure compute services as part of scaling.

    Configure Auto Scaling for a Windows Azure Cloud Service
    Currently the Auto Scaling service supports Cloud Services, Web Sites and Virtual Machines. In this demo, I will use a Cloud Services app with a Web Role and a Worker Role. To enable Auto Scaling, select your Windows Azure app in the Windows Azure management portal, and choose the "SCALE" tab. The Scale tab will show all the information regarding Auto Scaling. The image below shows that we have currently disabled the AutoScale service.

    To enable Auto Scaling, you need to choose either CPU or QUEUE. The QUEUE option is not available for Web Sites. The image below demonstrates how to configure Auto Scaling for a Web Role based on the utilization of CPU. We have configured the web role app to run with 1 to 5 virtual machine instances based on CPU utilization, with a range of 50 to 80%. If the aggregate utilization rises above 80%, it will scale up instances, and it will scale down instances when utilization falls below 50%.

    The image below demonstrates how to configure Auto Scaling for a Worker Role app based on the messages added to the Windows Azure Storage queue. We configured the worker role app to run with 1 to 3 virtual machine instances based on the queue messages added to the Windows Azure Storage queue. Here we have specified a target of 2000 messages per machine. The image below shows the summary of Auto Scaling for the Cloud Service after configuring the auto scaling service.

    Summary
    Auto Scaling is an extremely important behaviour of Cloud applications, providing on-demand scalability without any manual intervention. Windows Azure provides strong support for enabling Auto Scaling for apps deployed on the Windows Azure cloud platform. The new Auto Scaling service in Windows Azure lets you add automatic scaling capability to your Windows Azure compute services such as Cloud Services, Web Sites and Virtual Machines.
    With the new Auto Scaling service, you don't have to host any monitoring service as you did with the WASABi block. The Auto Scaling service is an excellent alternative to manually hosting the WASABi block in a Worker Role app.
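    To make the CPU rule above concrete, here is a toy sketch of the decision it encodes. This is an editor's illustration of the rule's logic only, not the Azure service's implementation or API; the instance range and thresholds are the ones configured in the demo.

    using System;

    class AutoScaleRuleSketch
    {
        // Toy decision logic for a CPU rule like the one configured above:
        // 1 to 5 instances, scale up above 80% average CPU, down below 50%.
        static int NextInstanceCount(int current, double avgCpuPercent)
        {
            const int MinInstances = 1, MaxInstances = 5;
            const double LowWatermark = 50.0, HighWatermark = 80.0;

            if (avgCpuPercent > HighWatermark && current < MaxInstances)
                return current + 1; // scale up one instance at a time
            if (avgCpuPercent < LowWatermark && current > MinInstances)
                return current - 1; // scale down one instance at a time
            return current;         // inside the band: leave the role alone
        }

        static void Main()
        {
            Console.WriteLine(NextInstanceCount(2, 85.0)); // 3 (scale up)
            Console.WriteLine(NextInstanceCount(2, 65.0)); // 2 (no change)
            Console.WriteLine(NextInstanceCount(2, 40.0)); // 1 (scale down)
        }
    }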


  • Using the Stopwatch class to calculate the execution time of a block of code

    - by vik20000in
    Often, while performance tuning a class, web page, component, or control, we first measure the time currently taken to execute the code. This helps in locating the code that is actually causing the performance issue, and it also helps in measuring the amount of improvement gained by making changes. This measurement is very important, as it helps us understand the problem in the code and helps us write better code next time (as we have already learnt what kind of improvement can be made with different code).

    Normally developers create two objects of the DateTime class. The exact time is collected before and after the code whose performance needs to be measured. Then the difference between the two objects is used to know the time spent in the code being measured. Below is an example:

    DateTime dt1, dt2;
    dt1 = DateTime.Now;
    for (int i = 0; i < 1000000; i++)
    {
        string str = "string";
    }
    dt2 = DateTime.Now;
    TimeSpan ts = dt2.Subtract(dt1);
    Console.WriteLine("Time Spent : " + ts.TotalMilliseconds.ToString());

    The above code works great. But the .NET Framework also provides another way to capture the time spent in the code, without much effort (no need to create two DateTime objects, a TimeSpan object, etc.): the built-in Stopwatch class. Below is an example that does the same work with the help of the Stopwatch class:

    Stopwatch sw = Stopwatch.StartNew();
    for (int i = 0; i < 1000000; i++)
    {
        string str = "string";
    }
    sw.Stop();
    Console.WriteLine("Time Spent : " + sw.Elapsed.TotalMilliseconds.ToString());

    [Note: the Stopwatch class resides in the System.Diagnostics namespace]

    If you use the Stopwatch class, measuring the time taken by your code takes much less effort.

    Vikram
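    As an aside to the example above: a single Stopwatch instance can also be reused to time several blocks in a row. Restart (available since .NET 4) resets the elapsed time and starts timing again in one call. A minimal sketch:

    using System;
    using System.Diagnostics;

    class StopwatchReuse
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < 1000000; i++) { string s = "first"; }
            sw.Stop();
            Console.WriteLine("First block : " + sw.Elapsed.TotalMilliseconds + " ms");

            sw.Restart(); // resets the elapsed time and starts timing again
            for (int i = 0; i < 1000000; i++) { string s = "second"; }
            sw.Stop();
            Console.WriteLine("Second block: " + sw.Elapsed.TotalMilliseconds + " ms");
        }
    }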

    Read the article

  • Calling XAI Inbound Services from Oracle BI Publisher

    - by ACShorten
    Note: This technique requires Oracle BI Publisher 1.1.3.4.1, which supports Service Complex Types. Web Services require credentials for authentication. Note: The defaults for the product installation are used in this article; if your site uses alternative values, substitute those where applicable. Note: Examples shown in this article are for illustrative purposes only.

When building a report in Oracle BI Publisher, it may be necessary to call an XAI Inbound Service to get information via the object rather than reading the database tables directly, for various reasons: the CLOB fields used in the object become accessible to a report (note: CLOB fields cannot be used as criteria in the current release), and objects can take advantage of algorithms to format or calculate additional data that is not stored directly in the database. For example, information format strings can be generated automatically by the object, which gives consistent information between a report and the online screens.

To use this facility, the following process must be performed:

Ensure that the product group (cisusers by default) is enabled for the SPLServiceBean in the console; this allows BI Publisher to call Web Services directly. To do this: log on to the Oracle WebLogic server console using an appropriate administrator account (by default, the user system or weblogic is provided for this purpose). Navigate to the Security Realms section and select your configured realm (myrealm by default). In the Roles and Policies section, expand the SPLService section of the Deployments option to reveal the SPLServiceBean roles. If there is no role associated with the SPLServiceBean, create a new EJB role and specify the cisusers role (by default). Add a Role Condition to the role just created, with a Predicate List of Group, and specify cisusers as the Group Argument Name. Save all your changes.

The XAI Inbound Services to be used by BI Publisher must be defined prior to using the interface. Refer to XAI Best Practices (Doc Id: 942074.1) on My Oracle Support, or the online help, for more information about this process.

Inside BI Publisher, create your report according to the BI Publisher documentation. When specifying the dataset, under the Data Model Report option, specify the following to use an XAI Inbound Service as a data source:

Type: Web Service
Complex Type: true
Username: Any valid user name within the product. This user MUST have security access to the objects referenced in the XAI Inbound Service.
Password: Authentication password for Username.
Timeout: Timeout, in seconds, for the Web Service call. For example, 60 seconds.
WSDL URL: Use the WSDL URL on the XAI Inbound Service definition. By default it has the format http://<host>:<port>/<server>/XAIApp/xaiserver/<service>?WSDL, where <host> is the host name of the Web Application Server, <port> is the port allocated to the Web Application Server for product access, <server> is the server context, and <service> is the XAI Inbound Service name. Note: Customers using secure transmission should substitute https for http and use the HTTPS port allocated to the product at installation time.
Web Service: Select the name of the service from the drop-down menu. If no service name shows up, Publisher could not establish a connection with the server or the WSDL named in the above URL; see the BI Publisher server log for more information.
Method: Select the name of the method from the drop-down menu. A method name should appear in the Method drop-down once the Web Service name is selected.

Additionally, filters (required or optional) can be generated from the WSDL in the Parameter List.
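Before pointing BI Publisher at the service, it can help to smoke-test the XAI Inbound Service endpoint directly. The following is a minimal, hypothetical C# sketch: the service name CMGetEmployee, host, port, server context, credentials, and request payload are all made-up placeholders; substitute the values and schema from your own XAI Inbound Service definition.

            using System;
            using System.IO;
            using System.Net;
            using System.Text;

            class XaiServiceSmokeTest
            {
                static void Main()
                {
                    // Hypothetical host, port, server context, and service name -
                    // replace with the values from your XAI Inbound Service definition.
                    string url = "http://myhost:6500/spl/XAIApp/xaiserver/CMGetEmployee";

                    // Minimal SOAP envelope; the real payload depends on the service schema.
                    string envelope =
                        "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
                        "<soapenv:Body><CMGetEmployee><employeeId>1234</employeeId></CMGetEmployee></soapenv:Body>" +
                        "</soapenv:Envelope>";

                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "POST";
                    request.ContentType = "text/xml; charset=utf-8";
                    // Same product credentials you would give BI Publisher.
                    request.Credentials = new NetworkCredential("SYSUSER", "password");

                    byte[] body = Encoding.UTF8.GetBytes(envelope);
                    using (Stream s = request.GetRequestStream())
                    {
                        s.Write(body, 0, body.Length);
                    }

                    using (WebResponse response = request.GetResponse())
                    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
                    {
                        Console.WriteLine(reader.ReadToEnd());
                    }
                }
            }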

    Read the article

  • OSB and Coherence Integration

    - by mark.ms.smith
    Anyone who has tried to manage Coherence nodes or cache results in OSB will appreciate the new functionality now available. As of WebLogic Server 10.3.4, you can use the WebLogic Administration Server (via the Administration Console or WLST) and the Java-based Node Manager to manage and monitor the life cycle of stand-alone Coherence cache servers. This is a great step forward, as the previous options mainly involved writing your own scripts. You can find an excellent description of how this works at James Bayer's blog, and the WebLogic documentation here.

As of Oracle Service Bus 11gR1 (11.1.1.3.0), OSB supports service result caching for Business Services with Coherence. If you use Business Services that return fairly static results that do not change often, you can configure those Business Services to cache results, and you control the time-to-live for each cached result. After the cached result expires, the next Business Service call invokes the back-end service to get the result, which is then stored in the cache for future requests. I'm thinking this caching functionality would be perfect for cross-reference data that is refreshed nightly by batch. You can find the OSB Business Service documentation here.

Result Caching in a dedicated JVM

This example demonstrates these new features by configuring an OSB Business Service to cache results in a separate Coherence JVM managed by WebLogic. The reason you may want to use a separate, dedicated JVM is that the result cache data could potentially be quite large, and you may want to protect the OSB Java heap. In this example, the client calls an OSB Proxy Service to get employee data based on an employee ID. Using a Business Service, OSB calls an external system. The results are automatically cached, and when the same request is made again, the respective results are retrieved from the cache rather than the external system.

Step 1 – Set up your Coherence Server

Via the OSB Administration Server Console, create the Coherence Server to be used as the results cache. Here are the configured Coherence Server arguments from the Server Start tab (note that I'm using the default Cache Config and Override files in the domain):

-Xms256m -Xmx512m -XX:PermSize=128m -XX:MaxPermSize=256m -Dtangosol.coherence.override=/app/middleware/jdev_11.1.1.4/user_projects/domains/osb_domain2/config/osb/coherence/osb-coherence-override.xml -Dtangosol.coherence.cluster=OSB-cluster -Dtangosol.coherence.cacheconfig=/app/middleware/jdev_11.1.1.4/user_projects/domains/osb_domain2/config/osb/coherence/osb-coherence-cache-config.xml -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dcom.sun.management.jmxremote

Just in case you need it, here is my Coherence Server classpath:

/app/middleware/jdev_11.1.1.4/oracle_common/modules/oracle.coherence_3.6/coherence.jar:/app/middleware/jdev_11.1.1.4/modules/features/weblogic.server.modules.coherence.server_10.3.4.0.jar:/app/middleware/jdev_11.1.1.4/oracle_osb/lib/osb-coherence-client.jar

By default, OSB will try to create a local result cache instance. You need to disable this by adding the following JVM parameters to each of the OSB Managed Servers:

-Dtangosol.coherence.distributed.localstorage=false -DOSB.coherence.cluster=OSB-cluster

If you need more information on configuring a remote result cache, have a look at the configuration documentation under the heading "Using an Out-of-Process Coherence Cache Server".

Step 2 – Configure your Business Service

Under the respective Business Service Message Handling Configuration (Advanced Properties), enable "Result Caching". Additionally, you need to determine what the cached data will be keyed on; in the example below, I'm keying it on the unique employee ID.

The Results

As this test was on my laptop, the actual timings are just an indication that there is a benefit to caching results. Using my test harness, I sent 10,000 requests to OSB, all with the same employee ID, first with result caching disabled. You can see that this caused the back-end Business Service (BS_GetEmployeeData) to be called for each request. Then, after enabling result caching, I sent the same number of identical requests. You can now see the Business Service was only invoked once, on the first request; all subsequent requests used the result cache.
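To recap the mechanics conceptually: result caching is a keyed lookup with a time-to-live sitting in front of the back-end call. The sketch below is only an illustration of that behaviour in C# (OSB's actual result cache lives in Coherence, not in your code, and CallGetEmployeeData is a hypothetical stand-in for the back-end service):

            using System;
            using System.Collections.Generic;

            // Illustration only: the TTL-keyed lookup OSB performs on your behalf.
            class ResultCache<TKey, TValue>
            {
                private readonly Dictionary<TKey, Tuple<TValue, DateTime>> cache =
                    new Dictionary<TKey, Tuple<TValue, DateTime>>();
                private readonly TimeSpan ttl;

                public ResultCache(TimeSpan ttl) { this.ttl = ttl; }

                public TValue GetOrInvoke(TKey key, Func<TKey, TValue> backend)
                {
                    Tuple<TValue, DateTime> entry;
                    if (cache.TryGetValue(key, out entry) && DateTime.UtcNow < entry.Item2)
                    {
                        return entry.Item1; // served from cache
                    }
                    // Expired or missing: call the back-end service and re-cache.
                    TValue result = backend(key);
                    cache[key] = Tuple.Create(result, DateTime.UtcNow + ttl);
                    return result;
                }
            }

A Business Service keyed on employee ID with a 30-minute expiry behaves like new ResultCache<string, string>(TimeSpan.FromMinutes(30)) wrapped around the back-end call: identical keys within the TTL never reach the external system.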

    Read the article

  • How can I back up my ubuntu system?

    - by Eloff
    I'm sure there's a lot of questions on here similar to this, and I've been reading them, but I still feel this warrants a new question. I want nightly, incremental backups (full disk images would waste a lot of space - unless compressed somehow.) Preferably rotating or deleting old backups when running out of space or after a fixed number of backups. I want to be able to quickly and painlessly restore my system from these backups. This is my first time running ubuntu as my main development machine and I know from my experience with it as a server and in virtual machines that I regularly manage to make it unbootable or damage it to the point of being unable to rescue it. So how would you recommend I do this? There are so many options out there I really don't know where to start. There seems to be a vocal school of thought that it's sufficient to backup your home directory and the list of installed packages from the package manager. I've already installed lots of things from source, or outside of the package manager (development tools, ides, compilers, graphics drivers, etc.) So at the very least, if I do not back up the operating system itself I need to grab all config files, all program binaries, all created but required files, etc. I'd rather backup too much than too little - an ubuntu install is tiny anyway. Also this drastically reduces the restore time, which would cost me more in my time than the extra storage space. I tried using Deja Dup to backup the root partition, excluding some things like /mnt /media /dev /proc etc. Although many websites assured me you can backup a running linux system this way - that seems to be false as it complained that it could not backup the following files: /boot/System.map-3.0.0-17-generic /boot/System.map-3.2.0-22-generic /boot/vmcoreinfo-3.0.0-17-generic /boot/vmlinuz-3.0.0-17-generic /boot/vmlinuz-3.2.0-22-generic /etc/.pwd.lock /etc/NetworkManager/system-connections/LAN Connection /etc/apparmor.d/cache/lightdm-guest-session /etc/apparmor.d/cache/sbin.dhclient /etc/apparmor.d/cache/usr.bin.evince /etc/apparmor.d/cache/usr.lib.telepathy /etc/apparmor.d/cache/usr.sbin.cupsd /etc/apparmor.d/cache/usr.sbin.tcpdump /etc/apt/trustdb.gpg /etc/at.deny /etc/ati/inst_path_default /etc/ati/inst_path_override /etc/chatscripts /etc/cups/ssl /etc/cups/subscriptions.conf /etc/cups/subscriptions.conf.O /etc/default/cacerts /etc/fuse.conf /etc/group- /etc/gshadow /etc/gshadow- /etc/mtab.fuselock /etc/passwd- /etc/ppp/chap-secrets /etc/ppp/pap-secrets /etc/ppp/peers /etc/security/opasswd /etc/shadow /etc/shadow- /etc/ssl/private /etc/sudoers /etc/sudoers.d/README /etc/ufw/after.rules /etc/ufw/after6.rules /etc/ufw/before.rules /etc/ufw/before6.rules /lib/ufw/user.rules /lib/ufw/user6.rules /lost+found /root /run/crond.reboot /run/cups/certs /run/lightdm /run/lock/whoopsie/lock /run/udisks /var/backups/group.bak /var/backups/gshadow.bak /var/backups/passwd.bak /var/backups/shadow.bak /var/cache/apt/archives/lock /var/cache/cups/job.cache /var/cache/cups/job.cache.O /var/cache/cups/ppds.dat /var/cache/debconf/passwords.dat /var/cache/ldconfig /var/cache/lightdm/dmrc /var/crash/_usr_lib_x86_64-linux-gnu_colord_colord.102.crash /var/lib/apt/lists/lock /var/lib/dpkg/lock /var/lib/dpkg/triggers/Lock /var/lib/lightdm /var/lib/mlocate/mlocate.db /var/lib/polkit-1 /var/lib/sudo /var/lib/urandom/random-seed /var/lib/ureadahead/pack /var/lib/ureadahead/run.pack /var/log/btmp /var/log/installer/casper.log /var/log/installer/debug /var/log/installer/partman 
/var/log/installer/syslog /var/log/installer/version /var/log/lightdm/lightdm.log /var/log/lightdm/x-0-greeter.log /var/log/lightdm/x-0.log /var/log/speech-dispatcher /var/log/upstart/alsa-restore.log /var/log/upstart/alsa-restore.log.1.gz /var/log/upstart/console-setup.log /var/log/upstart/console-setup.log.1.gz /var/log/upstart/container-detect.log /var/log/upstart/container-detect.log.1.gz /var/log/upstart/hybrid-gfx.log /var/log/upstart/hybrid-gfx.log.1.gz /var/log/upstart/modemmanager.log /var/log/upstart/modemmanager.log.1.gz /var/log/upstart/module-init-tools.log /var/log/upstart/module-init-tools.log.1.gz /var/log/upstart/procps-static-network-up.log /var/log/upstart/procps-static-network-up.log.1.gz /var/log/upstart/procps-virtual-filesystems.log /var/log/upstart/procps-virtual-filesystems.log.1.gz /var/log/upstart/rsyslog.log /var/log/upstart/rsyslog.log.1.gz /var/log/upstart/ureadahead.log /var/log/upstart/ureadahead.log.1.gz /var/spool/anacron/cron.daily /var/spool/anacron/cron.monthly /var/spool/anacron/cron.weekly /var/spool/cron/atjobs /var/spool/cron/atspool /var/spool/cron/crontabs /var/spool/cups

    Read the article

  • Enable Full Screen Mode in Media Center Without Trapping the Mouse

    - by DigitalGeekery
    If you have a dual-monitor setup and use Windows Media Center, you're probably aware that when WMC is in full-screen mode, it traps the mouse so you can't work on a second monitor. Here we look at how to solve the annoyance. The Maxifier is an application that allows you to open Media Center in full-screen mode without restricting the mouse, relieving the annoyance of WMC capturing your mouse on a dual-monitor setup. Note: If you don't have two monitors attached, most of The Maxifier's functions won't work.

Installation and Use

Download, extract, and install The Maxifier (see the download link below). The Maxifier runs minimized in the system tray, and you access the options by right-clicking its icon. If Media Center is not already open, you can choose Start Media Center to open WMC on the main start screen, or choose one of the other selections to open another area of Media Center. By default, Maxifier opens Media Center in full-screen mode on the secondary monitor. When Media Center is open in full-screen mode, you'll notice you can now freely move your mouse around your multi-monitor setup. When Media Center is open, you'll see five additional options. The Fit Screen options simply fit Media Center to the full screen but still show the window borders, while the Full Screen options put WMC in true full-screen mode. The Maxifier Options let you choose from the various startup options. Selecting "Watch for Media Center starting" prompts Maxifier to open WMC to the main start page in full-screen mode on the secondary monitor automatically, even if you open Media Center without using The Maxifier. (You may need to restart for this to take effect.) If you have more than two monitors, you can define on which monitor to open Media Center and which monitor you consider the main screen. You can also define a number of hotkeys in The Maxifier settings. First, select the Enable Hotkeys checkbox. To create a hotkey, click in the text field and press the keys to use as the hotkey; to remove one, click in the field and press the Delete key.

Conclusion

The Maxifier is a simple program that enables Media Center users to take full advantage of a multi-monitor workspace. It works with both Vista and Windows 7: version 1.4 is the stable application for Vista, and version 1.5b is a beta for Windows 7. Looking for more Media Center tips and tweaks? Check out some startup customizations for Windows 7 Media Center, how to automatically mount and view ISOs in WMC, and how to add background images and themes to Windows 7 Media Center. Link: Download the Maxifier

    Read the article

  • MVC 2 jQuery Client-side Validation

    - by nmarun
    Well, I watched Phil Haack's show What's New in Microsoft ASP.NET MVC 2 and was impressed by the client-side validation (starting at 17:45) that MVC 2 offers. I tried recreating it, but Phil does not show which .js files need to be included, and I was not able to find the source code for the application he used. To find out the required JavaScript references, I added all of the script files in my application to the page and ran it. Of course it worked, but that is hardly an optimal solution. By removing one at a time and retesting the app, I short-listed the following:

<script src="../../Scripts/jquery-1.4.1.min.js" type="text/javascript"></script>
<script src="../../Scripts/MicrosoftAjax.js" type="text/javascript"></script>
<script src="../../Scripts/MicrosoftMvcValidation.js" type="text/javascript"></script>

Now, a little about the feature itself. Say I'm working with a Book application, so my model will look something like:

public class Book
{
    [HiddenInput(DisplayValue = false)]
    public int BookId { get; set; }

    [DisplayName("Book Title")]
    [Required(ErrorMessage = "Book title is required")]
    [StringLength(20, ErrorMessage = "Must be under 20 characters")]
    public string Title { get; set; }

    [Required(ErrorMessage = "Author is required")]
    [StringLength(40, ErrorMessage = "Must be under 40 characters")]
    public string Author { get; set; }

    public decimal Price { get; set; }

    [DisplayName("ISBN")]
    [StringLength(13, ErrorMessage = "Must be 13 characters")]
    public string Isbn { get; set; }
}

This ensures that the posted data is validated on the server. But what happens if you also add the following line (along with the .js files mentioned above)?

<% Html.EnableClientValidation(); %>

This turns on "on-the-fly", real-time validation. Now, when the user types 20 characters for the Title, the error shows up right on the 21st character. Beautiful... and you do not have to write the JavaScript function(s) for this; they're auto-magically created for you. (Doing a "View Source" on the browser page shows you the JavaScript logic that runs behind the scenes.) I bumped into another post that shows how .NET 4 allows us to create custom validation attributes: Dynamic Range validation in MVC 2. This will help us attach virtually any business logic to the model itself. Please see the source code I've worked with.
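For a flavour of what such a custom attribute looks like, here is a minimal sketch. The PriceRange attribute and its messages are my own illustration, not from the linked post, and note that out of the box this validates on the server only; hooking a custom attribute into the client-side script takes additional wiring.

using System;
using System.ComponentModel.DataAnnotations;

// Illustrative custom attribute: rejects prices outside a configurable range.
// double is used for the bounds because decimal is not a legal attribute
// parameter type in C#.
public class PriceRangeAttribute : ValidationAttribute
{
    public double Minimum { get; set; }
    public double Maximum { get; set; }

    public override bool IsValid(object value)
    {
        if (value == null) return true; // let [Required] handle missing values
        decimal price = Convert.ToDecimal(value);
        return price >= (decimal)Minimum && price <= (decimal)Maximum;
    }
}

Applied to the Book model above, it would decorate the Price property like any built-in attribute:

[PriceRange(Minimum = 1, Maximum = 500, ErrorMessage = "Price must be between 1 and 500")]
public decimal Price { get; set; }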

    Read the article

  • ASP.NET Routing not working on IIS 7.0

    - by Rick Strahl
    I ran into a nasty little problem today when deploying an application that uses ASP.NET 4.0 Routing to my live server. The application and its routing were working just fine on my dev machine (Windows 7 and IIS 7.5), but when I deployed (Windows 2008 R1 and IIS 7.0), routing would just not work. Every time I hit a routed URL, IIS would throw up a 404 error. This is an IIS error, not an ASP.NET error, so it doesn't actually come from ASP.NET's routing engine but from IIS's handling of extensionless URLs. Note that the request is clearly falling all the way through to the StaticFile handler, which is the last handler to fire in the typical IIS handler list. In other words, IIS is trying to process the extensionless URL itself and failing, instead of firing it into ASP.NET. As I mentioned, on my local machine this all worked fine, and to make sure local and live setups matched I re-copied my web.config, double-checked handler mappings in IIS, and re-copied the actual application assemblies to the server. Everything looked exactly matched. However, no workey on the server with IIS 7.0!!! Finally, totally by chance, I remembered the runAllManagedModulesForAllRequests attribute on the modules key in web.config and set it to true:

<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <add name="ScriptCompressionModule" type="Westwind.Web.ScriptCompressionModule,Westwind.Web" />
  </modules>
</system.webServer>

And lo and behold, routing started working on the live server and IIS 7.0! This seems really obvious now, of course, but the really tricky thing is that on IIS 7.5 this key is not necessary. On my Windows 7 machine ASP.NET routing was working just fine without the key set, yet on IIS 7.0 on my live server the same missing setting broke routing. On IIS 7.0 this key must be present or routing will not work. Oddly, on IIS 7.5 it appears you can't even turn the behavior off: setting runAllManagedModulesForAllRequests="false" had no effect at all, and routing continued to work just fine even with the flag set to false, which is NOT what I would have expected. Kind of disappointing, too, that Windows Server 2008 (R1) can't be upgraded to IIS 7.5. It sure seems like that should have been possible, since the OS server core changes in R2 are pretty minor. For the future I really hope Microsoft will allow updating IIS versions without tying them explicitly to the OS. It looks like with the release of IIS Express Microsoft has taken some steps to untie some of those tight OS links from IIS. Let's hope that's the case going forward; it sure is nice to run the same IIS version on dev and live boxes, but upgrading live servers is too big a deal to do just because an updated OS release came out. Moral of the story: never assume that your dev setup will work as-is on the live setup. It took me forever to figure this out because I assumed that since my local web.config was fine and working, and I had copied all the relevant web.config data to the server, it couldn't be the configuration settings. I was looking everywhere but in the .config file before getting desperate and remembering the flag when I accidentally checked the IntelliSense settings on the modules key. Never assume anything. The other moral is: try to keep your dev machine and server OSs in sync whenever possible. Maybe it's time to upgrade to Windows Server 2008 R2 after all.
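For reference, an extensionless routed URL like the ones failing above is typically registered in Global.asax. This is a minimal sketch of what such a registration looks like for ASP.NET 4.0 Web Forms routing; the route name, URL pattern, and target page are illustrative only and are not taken from this application:

using System;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Maps an extensionless URL like /products/42 to a physical page.
        // It is this kind of URL that IIS 7.0 refuses to hand to ASP.NET
        // unless runAllManagedModulesForAllRequests is set to true.
        RouteTable.Routes.MapPageRoute(
            "ProductRoute",             // route name (illustrative)
            "products/{id}",            // extensionless URL pattern
            "~/ProductDetail.aspx");    // physical file that handles it
    }
}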
More info on Extensionless URLs in IIS: Want to find out more about exactly how extensionless URLs work on IIS 7? Check out How ASP.NET MVC Routing Works and its Impact on the Performance of Static Requests, which goes into great detail on the complexities of the process. Thanks to Jeff Graves for pointing me at this article – a great linked reference for this topic! © Rick Strahl, West Wind Technologies, 2005-2011 Posted in IIS7  Windows

    Read the article

  • PDF Converter Elite Giveaway – Lets you create, convert and edit any type of PDF with ease

    - by Gopinath
    Are you looking for PDF editing software that lets you create, edit, and convert any type of PDF with ease? Then here is a chance for you to win a lifetime free license of the PDF Converter Elite software. Tech Dreams, in partnership with pdfconverter.com, brings a giveaway contest exclusively for our readers. Continue reading to learn the features of the application and the giveaway contest details. Adobe Acrobat is the best-known software for creating, editing, and converting PDF files, but you need to spend a lot of money to buy it. PDF Converter Elite, priced at $100, has a rich set of features that satisfies most of your PDF management needs. Here is a quick rundown of the application's features: Create PDF files from almost every popular Windows file format – you can create a PDF from nearly 300 popular file formats supported by Windows. Want to convert a Word document to PDF? It's just a click away. How about Excel files, PowerPoint presentations, text files, images, etc.? Yes, with a single click you can turn them into PDF files. Convert PDF to Word, Excel, PowerPoint, Publisher, HTML – this is one of the features I liked best in this software. You can convert a PDF to any MS Office file format without losing the alignment and quality of the document. The converted documents look exactly like your PDF documents, and you would be surprised to see near-100% layout replication in the converted document. I fell in love with the precision at which the files are converted. Edit PDF files easily – you can rework your PDF documents by inserting watermarks, numbers, headers, footers, and more. You can also merge two PDF files, overlay pages, remove unwanted pages, and split a single PDF into multiple files. Secure PDF files by setting a password – you can secure PDF files by limiting how others can use them: set a password to open the documents and restrict activities like printing, copy and paste, screen reading, form filling, etc. If you are looking for an affordable PDF editing application, PDF Converter Elite is there for you. 10 x PDF Converter Elite Licenses Giveaway. Here are the details on winning a free single-user license for our readers – we have 10 PDF Converter Elite single-user licenses worth $100 each. To win a license, all you need to do is: Like the Tech Dreams fan page on Facebook; Tweet or Like this post – buttons are available just below the post heading in the top section of this page; and finally, drop a comment on how you would like to use PDF Converter Elite. We will choose 10 winners through a lucky draw, and the licenses will be sent to them by personal email. Names of the winners will also be announced on Tech Dreams. So are you ready to grab a free copy of PDF Converter Elite worth $100?

    Read the article

  • Java Resources for Windows Azure

    - by BuckWoody
    Windows Azure is a Platform as a Service – a PaaS – that runs code you write. That code doesn't just mean the languages on the .NET platform – you can run code from multiple languages, including Java. In fact, you can develop for Windows and SQL Azure using not only Visual Studio but the Eclipse Integrated Development Environment (IDE) as well. Although not an exhaustive list, here are several links that deal with Java and Windows Azure:

Windows Azure Java Development Center: http://www.windowsazure.com/en-us/develop/java/
Java Development Guidance: http://msdn.microsoft.com/en-us/library/hh690943(VS.103).aspx
Running a Java Environment on Windows Azure: http://blogs.technet.com/b/port25/archive/2010/10/28/running-a-java-environment-on-windows-azure.aspx
Run Java with Jetty in Windows Azure: http://blogs.msdn.com/b/dachou/archive/2010/03/21/run-java-with-jetty-in-windows-azure.aspx
Using the plugin for Eclipse: http://blogs.msdn.com/b/craig/archive/2011/03/22/new-plugin-for-eclipse-to-get-java-developers-off-the-ground-with-windows-azure.aspx
Run Java with GlassFish in Windows Azure: http://blogs.msdn.com/b/dachou/archive/2011/01/17/run-java-with-glassfish-in-windows-azure.aspx
Improving experience for Java developers with Windows Azure: http://blogs.msdn.com/b/interoperability/archive/2011/02/23/improving-experience-for-java-developers-with-windows-azure.aspx
Java Access to SQL Azure via the JDBC Driver for SQL Server: http://blogs.msdn.com/b/brian_swan/archive/2011/03/29/java-access-to-sql-azure-via-the-jdbc-driver-for-sql-server.aspx
How to Get Started with Java, Tomcat on Windows Azure: http://blogs.msdn.com/b/usisvde/archive/2011/03/04/how-to-get-started-with-java-tomcat-on-windows-azure.aspx
Deploying Java Applications in Azure: http://blogs.msdn.com/b/mariok/archive/2011/01/05/deploying-java-applications-in-azure.aspx
Using the Windows Azure Storage Explorer in Eclipse: http://blogs.msdn.com/b/brian_swan/archive/2011/01/11/using-the-windows-azure-storage-explorer-in-eclipse.aspx
Windows Azure Tomcat Solution Accelerator: http://archive.msdn.microsoft.com/winazuretomcat
Deploying a Java application to Windows Azure with Command-line Ant: http://java.interoperabilitybridges.com/articles/deploying-a-java-application-to-windows-azure-with-command-line-ant
Video: Open in the Cloud: Windows Azure and Java: http://channel9.msdn.com/Events/PDC/PDC10/CS10
AzureRunMe: http://azurerunme.codeplex.com/
Windows Azure SDK for Java: http://www.interoperabilitybridges.com/projects/windows-azure-sdk-for-java
AppFabric SDK for Java: http://www.interoperabilitybridges.com/projects/azure-java-sdk-for-net-services
Information Cards for Java: http://www.interoperabilitybridges.com/projects/information-card-for-java
Apache Stonehenge: http://www.interoperabilitybridges.com/projects/apache-stonehenge
Channel 9 Case Study on Java and Windows Azure: http://www.microsoft.com/casestudies/Windows-Azure/Gigaspaces/Solution-Provider-Streamlines-Java-Application-Deployment-in-the-Cloud/400000000081

    Read the article

  • Q&A: Oracle's Paul Needham on How to Defend Against Insider Attacks

    - by Troy Kitch
    Source: Database Insider Newsletter. The threat from insider attacks continues to grow. In fact, just since January 1, 2014, insider breaches have been reported by a major consumer bank, a major healthcare organization, and a range of state and local agencies, according to the Privacy Rights Clearinghouse. We asked Paul Needham, Oracle senior director, product management, to shed light on the nature of these pernicious risks, and on how organizations can best defend themselves against the threat.

Q. First, can you please define the term "insider" in this context?
A. According to the CERT Insider Threat Center, a malicious insider is a current or former employee, contractor, or business partner who "has or had authorized access to an organization's network, system, or data and intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization's information or information systems."

Q. What has changed with regard to insider risks?
A. We are actually seeing the risk from privileged insiders growing. In the latest Independent Oracle Users Group Data Security Survey, the share of organizations that had not taken steps to prevent privileged user access to sensitive information grew from 37 percent to 42 percent. Additionally, 63 percent of respondents say that insider attacks represent a medium-to-high risk, higher than any other category except human error (by an insider, I might add).

Q. What are the dangers of this type of risk?
A. Insiders tend to have special insight into, and access to, the kinds of data that are especially sensitive. Breaches can result in long-term legal issues and financial penalties. They can also damage an organization's brand in a way that directly impacts its bottom line. Finally, there is the potential loss of intellectual property, which can have serious long-term consequences because of the loss of market advantage.

Q. How can organizations protect themselves against abuse of privileged access?
A. Every organization has privileged users, and that will always be the case. The questions are how much access those users should have to application data stored in the database, and how that default access can be controlled. Oracle Database Vault was designed specifically for this purpose and helps protect application data against unauthorized access. Oracle Database Vault can be used to block default privileged user access from inside the database, as well as to increase security controls on the application itself. Attacks can and do come from inside the organization, and they are just as likely to come from outside as attempts to exploit a privileged account. Using Oracle Database Vault protection, boundaries can be placed around database schemas, objects, and roles, preventing privileged account access from being exploited by hackers and insiders. A new Oracle Database Vault capability called privilege analysis identifies privileges and roles used at runtime, which can then be audited or revoked by security administrators to reduce the attack surface and increase the security of applications overall.

For a more comprehensive look at controlling data access and restricting privileged data in Oracle Database, download Needham's new e-book, Securing Oracle Database 12c: A Technical Primer.

    Read the article

< Previous Page | 706 707 708 709 710 711 712 713 714 715 716 717  | Next Page >