Search Results

Search found 27890 results on 1116 pages for 'oracle retail documentation team'.


  • Sorting: TransientVO Vs Query/EO based VO

    - by Vijay Mohan
    In ADF you can sort VO rows by invoking the setSortBy("VOAttrName") API, but the tricky part is that this API actually appends a sort clause to the VO query at runtime, and the actual sorting is performed on the next VO.executeQuery(). That works fine for a query/EO-based VO, but what about a transient VO, where the rows are populated programmatically? There is a way: you can specify the query mode on your transient VO so that the sorting happens on the already-populated VO rows. Here are the steps:

        // Populate your transient VO rows.
        VO.setSortBy("YourVOAttrName");
        VO.setQueryMode(ViewObject.QUERY_MODE_SCAN_VIEW_ROWS);
        VO.executeQuery();

    Here executeQuery() is the trigger that actually sorts the VO rows, and the QUERY_MODE_SCAN_VIEW_ROWS flag makes sure the sorting is performed on the already-populated VO cache.
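    As a minimal sketch of how this might look from an application module, assuming a transient view object instance named "MyTransientVO" and an attribute "YourVOAttrName" (both placeholder names, not from the original post):

        import oracle.jbo.ApplicationModule;
        import oracle.jbo.ViewObject;

        public class TransientVoSortSketch {
            // Sorts an already-populated transient VO in memory.
            public static void sortTransientRows(ApplicationModule am) {
                ViewObject vo = am.findViewObject("MyTransientVO");
                // Rows are assumed to have been inserted programmatically by this point.
                vo.setSortBy("YourVOAttrName");
                vo.setQueryMode(ViewObject.QUERY_MODE_SCAN_VIEW_ROWS);
                // executeQuery() now scans the existing row cache instead of re-querying the database.
                vo.executeQuery();
            }
        }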

    Read the article

  • Portal And Content - Content Integration - Best Practices

    - by Stefan Krantz
    Lately we have seen an increase in projects that fail to deliver either user-friendly content integration or satisfactory performance. Our intention is to close any knowledge gap that our previous posts might have left you with, so this post repeats some recommendations and references back to older but still useful posts. It should also help you understand, from the ground up, how to design, architect and implement business-enabled, responsive and well-performing portals with complex requirements around business-centric information publishing.

    Design the Information Model

    The key to successful portal deployments is information modeling. It is essential to understand the use case you are designing for, so here is a set of questions to ask yourself or your customer:

      - Question: Who will own the content, IT or Business? Answer: Business
      - Question: Who will publish the content, IT or Business? Answer: Business
      - Question: Will there be multiple publishers? Answer: Yes
      - Question: Are the publishers computer scientists? Answer: No
      - Question: How often does the information change - daily, weekly, monthly? Answer: Daily, weekly

    If your answers match at least two of these, we strongly recommend you design your content with the following principles:

      - Divide your pages into logical sections, where each section is marked with its purpose
      - Assign capabilities to each section: does it contain text, images, formatting, and/or is it static and populated through other contextual information
      - Select the editor/design element type:
          - WYSIWYG - rich text
          - Plain Text - non-formatted text
          - Image - image object
          - Static List - static list of formatted information
          - Dynamic Data List - assembled information from multiple data files through a CMIS query

    The result of such a design map could look like the examples below. Based on the required elements in the third design column from the left, you then simply design a data model in WebCenter Content - Site Studio by creating a Region Definition structure matching your design requirements. For more information on how to create a Region Definition, see the Region Definition post (see instruction 7 for details). Each Region Definition can then be used to instantiate data files; a data file holds the actual data for each element in the Region Definition. Another way to see this is to treat the Region Definition as an extension of the WebCenter Content metadata model for each data file item.

    Design content templates

    With a solid, dependable information model in place we can proceed to template creation and page design. This phase focuses on how to place the content sections from the Region Definition on the page via a Content Presenter template. Remember that by creating Content Presenter templates you leverage the latest and most integrated technology WebCenter has to offer. This phase is much easier since you already have the information model and design wireframes to base the logic on; however, there are still a few considerations to pay attention to:

      - Base the template on ADF and make only the necessary exceptions to markup when required
      - Leverage ADF design components for tabs, accordions and other similar components; this way the design in the content-published areas will comply with other design areas based on custom ADF task flows
      - There is no performance impact when using metadata or Region Definition based data
      - All data, regardless of type (metadata or XML data), can be accessed via the Content Presenter node.
    See below for applied examples of how to access data:

      - Access a metadata property from a document: #{node.propertyMap['myProp'].value} - myProp in this example can be, for instance, dDocName, dDocTitle, xComments or any other available metadata field
      - Access element data from the data file XML: #{node.propertyMap['[Region Definition Name]:[Element name]'].asTextHtml} - Region Definition Name is the region definition that the current data file instantiates; Element name is the element value you want to grab from the data file

    I recommend reading the following useful posts on the content template topic: CMIS queries and template creation (see instruction 9 for details) and Static List template rendering. For more information on templates: Single Item Content Template, Multi Item Content Template, Expression Language.

    Internationalization Considerations

    When integrating content assets via Content Presenter, you by now probably understand that the content item/data file is wired to the page. What is also pretty common at this stage is that the content item/data file supports only one language, since it is not practical or business friendly to mix languages in one complex structure. That leaves you with a very common dilemma: you would have to build a completely new portal for each locale, which is not a good option! However, with a little information modeling and a clear naming convention this can be addressed. Basically, make sure that all content items/data files are named with a predictable naming convention, like "Content1_EN" for the English rendition and "Content1_ES" for the Spanish rendition. This way, through a simple, non-complex customization, you can dynamically switch the actual content item/data file just before rendering (a small illustrative sketch appears at the end of this post). By following the proposed approach you not only enable a simple mechanism for internationalized content, you also preserve the Content Presenter functionality that supports business-accessible, run-time publishing of information on existing and new pages. I recommend reading the following useful post on internationalization topics: Internationalize with Content Presenter.

    Integrate with Review & Approval processes

    Today the review and approval functionality and configuration is based on WebCenter Content - Criteria Workflows. Criteria Workflows use the metadata of the checked-in document to evaluate whether the document is under any review/approval process. So, for instance, if a Criteria Workflow is configured to catch any document with Version = "2" or higher and Content Type = "Instructions", any matching content item version will, on check-in, enter the workflow before being released for general access. A few things to consider when configuring Criteria Workflows:

      - Make sure not to trigger on version 1 for content items that are data files - if you trigger on version 1 you will not only approve an empty document, you will also have a Content Presenter pointing to a non-existing document, since the document only becomes available after successful completion of the workflow
      - Approval workflows sometimes require more complex criteria; in that case the recommendation is that the metadata triggering such criteria is populated automatically, which can be achieved through many approaches, including Content Profiles
      - Criteria Workflows are configured and managed in the WebCenter Content Administration Applets, where you can configure one or more workflows.
    Once Criteria Workflows are configured, the Content Presenter supports the editors with the approval process directly inline in the "Contribution mode" of the portal. In addition to approve/reject and the details of the task, the Content Presenter natively lets the user view the current and future version of the change he/she is approving. See below for an example.

    Architectural recommendations

      - To support review and approval processes, minimize the number of data files per page
      - Each CMIS query can consume significant time depending on the complexity of the query, so minimize the number of CMIS queries per page
      - Use Content Presenter templates based on ADF - this way you minimize the design considerations and optimize the use of caching
      - Implement the page with as few data files as possible - this simplifies the publishing process, increases performance and simplifies the release process
      - Integrating a named data file (node) or a list of named nodes into pages increases performance versus querying for data
      - A named data file (node) or list of named nodes integrated into pages also enables business-centric page creation and publishing and reduces the need for IT department interaction

    Summary

    Just because an architectural decision solves a business problem does not mean it is the right one; when designing portals, all parts of the architecture have to be in harmony and not impact each other. For instance, the most technically complex solution is not always the best, since it will most likely defeat business accessibility, performance or both. The best approach is to first design for a simplicity that even a non-technical user can operate, then consider the performance impact, and finally look at the technology challenges this brings: work around them first with out-of-the-box features, and after that design and develop functions to complement the shortcomings.
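    As a purely illustrative sketch of the naming-convention switch described above (the helper class, method and locale handling are hypothetical, not part of the WebCenter API):

        import java.util.Locale;

        public class LocalizedContentResolver {
            // Illustrative only: maps a base data file name plus the user's locale to the
            // localized rendition, e.g. "Content1" + Spanish -> "Content1_ES".
            // Unsupported locales fall back to the English rendition.
            public static String resolveContentId(String baseId, Locale locale) {
                String suffix = "es".equalsIgnoreCase(locale.getLanguage()) ? "_ES" : "_EN";
                return baseId + suffix;
            }
        }

    The resolved ID could then be fed into whatever mechanism selects the data file just before rendering, so the page itself stays locale-neutral while the Content Presenter continues to serve the business-publishable renditions.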

    Read the article

  • Build Open JDK 7 on Mac OSX (TOTD #172)

    - by arungupta
    The complete requirements, prerequisites, and steps to build the OpenJDK 7 port on Mac OS X are described here. The steps are very clearly explained, and here are the exact ones I followed on my MacBook Pro running 10.7.2:

    Confirm the version of the pre-installed Java:

        > java -version
        java version "1.6.0_26"
        Java(TM) SE Runtime Environment (build 1.6.0_26-b03-383-11A511c)
        Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02-383, mixed mode)

    Download and install Mercurial from mercurial.berkwood.com (the zip bundle for 10.7 is here). It gets installed in the /usr/local/bin directory.

    Get the source code (commands highlighted in bold):

        hg clone http://hg.openjdk.java.net/macosx-port/macosx-port
        destination directory: macosx-port
        requesting all changes
        adding changesets
        adding manifests
        adding file changes
        added 437 changesets with 364 changes to 33 files
        updating to branch default
        31 files updated, 0 files merged, 0 files removed, 0 files unresolved

        cd macosx-port
        chmod 7555 get_source.sh
        ./get_source.sh
        # Repos:  corba jaxp jaxws langtools jdk hotspot
        Starting on corba
        Starting on jaxp
        Starting on jaxws
        Starting on langtools
        Starting on jdk
        Starting on hotspot
        # hg clone http://hg.openjdk.java.net/macosx-port/macosx-port/corba corba
        requesting all changes
        adding changesets
        adding manifests
        adding file changes
        added 396 changesets with 3275 changes to 1379 files
        . . .
        # exit code 0
        # cd ./corba && hg pull -u
        pulling from http://hg.openjdk.java.net/macosx-port/macosx-port/corba
        searching for changes
        no changes found
        # exit code 0
        # cd ./jaxp && hg pull -u
        pulling from http://hg.openjdk.java.net/macosx-port/macosx-port/jaxp
        searching for changes
        no changes found
        # exit code 0

    Install Xcode from the App Store. Include /Developer/usr/bin in PATH. Note: JDK 1.6.0_26 came pre-installed on my laptop and I installed Xcode after that. The compilation went fine and there was no need to re-install Java for Mac OS X as mentioned in the original steps.

    Build the code as:

        make ALLOW_DOWNLOADS=true SA_APPLE_BOOT_JAVA=true ALWAYS_PASS_TEST_GAMMA=true ALT_BOOTDIR=`/usr/libexec/java_home -v 1.6` HOTSPOT_BUILD_JOBS=`sysctl -n hw.ncpu`

    The final output is shown as:

        >>>Finished making images @ Sat Nov 19 00:59:04 WET 2011 ...
        >>>Finished making images @ Sat Nov 19 00:59:04 WET 2011 ...
        #################################################################
        # Leaving jdk for target(s) sanity all docs images
        #################################################################
        # Build time 00:17:42 jdk for target(s) sanity all docs images
        #################################################################

        -- Build times ----------
        Target all_product_build
        Start 2011-11-19 00:32:40
        End   2011-11-19 00:59:04
        00:01:46 corba
        00:04:07 hotspot
        00:00:51 jaxp
        00:01:21 jaxws
        00:17:42 jdk
        00:00:37 langtools
        00:26:24 TOTAL
        -------------------------

    Change the directory and verify the version:

        > cd build/macosx-universal/j2sdk-image/1.7.0.jdk/Contents/Home/bin
        > ./java -version
        openjdk version "1.7.0-internal"
        OpenJDK Runtime Environment (build 1.7.0-internal-arungup_2011_11_19_00_32-b00)
        OpenJDK 64-Bit Server VM (build 21.0-b17, mixed mode)

    Now go fix some bugs, file new bugs, or discuss at the macosx-port-dev mailing list.

    Read the article

  • Solaris 11 pkg fix is my new friend

    - by user12611829
    While putting together some examples of the Solaris 11 Automated Installer (AI), I managed to really mess up my system, to the point where AI was completely unusable. This was my fault as a combination of unfortunate incidents left some remnants that were causing problems, so I tried to clean things up. Unsuccessfully. Perhaps that was a bad idea (OK, it was a terrible idea), but this is Solaris 11 and there are a few more tricks in the sysadmin toolbox. Here's what I did.

        # rm -rf /install/*
        # rm -rf /var/ai
        # installadm create-service -n solaris11-x86 --imagepath /install/solaris11-x86 \
            -s [email protected]
        Warning: Service svc:/network/dns/multicast:default is not online.
           Installation services will not be advertised via multicast DNS.
        Creating service from: [email protected]
        DOWNLOAD      PKGS      FILES     XFER (MB)   SPEED
        Completed      1/1    130/130   264.4/264.4    0B/s
        PHASE                                   ITEMS
        Installing new actions                284/284
        Updating package state database          Done
        Updating image state                     Done
        Creating fast lookup database            Done
        Reading search index                     Done
        Updating search index                     1/1
        Creating i386 service: solaris11-x86
        Image path: /install/solaris11-x86

    So far so good. Then comes an oops.....

        setup-service[168]: cd: /var/ai//service/.conf-templ: [No such file or directory]
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

    This is where you generally say a few things to yourself, and then promise to quit deleting configuration files and directories when you don't know what you are doing. Then you recall that the new Solaris 11 packaging system has some ability to correct common mistakes (like the one I just made). Let's give it a try.

        # pkg fix installadm
        Verifying: pkg://solaris/install/installadm                 ERROR
        dir: var/ai
                Group: 'root (0)' should be 'sys (3)'
        dir: var/ai/ai-webserver
                Missing: directory does not exist
        dir: var/ai/ai-webserver/compatibility-configuration
                Missing: directory does not exist
        dir: var/ai/ai-webserver/conf.d
                Missing: directory does not exist
        dir: var/ai/image-server
                Group: 'root (0)' should be 'sys (3)'
        dir: var/ai/image-server/cgi-bin
                Missing: directory does not exist
        dir: var/ai/image-server/images
                Group: 'root (0)' should be 'sys (3)'
        dir: var/ai/image-server/logs
                Missing: directory does not exist
        dir: var/ai/profile
                Missing: directory does not exist
        dir: var/ai/service
                Group: 'root (0)' should be 'sys (3)'
        dir: var/ai/service/.conf-templ
                Missing: directory does not exist
        dir: var/ai/service/.conf-templ/AI_data
                Missing: directory does not exist
        dir: var/ai/service/.conf-templ/AI_files
                Missing: directory does not exist
        file: var/ai/ai-webserver/ai-httpd-templ.conf
                Missing: regular file does not exist
        file: var/ai/service/.conf-templ/AI.db
                Missing: regular file does not exist
        file: var/ai/image-server/cgi-bin/cgi_get_manifest.py
                Missing: regular file does not exist
        Created ZFS snapshot: 2012-12-11-21:09:53
        Repairing: pkg://solaris/install/installadm
        Creating Plan (Evaluating mediators): |
        DOWNLOAD      PKGS      FILES     XFER (MB)   SPEED
        Completed      1/1        3/3       0.0/0.0    0B/s
        PHASE                                   ITEMS
        Updating modified actions               16/16
        Updating image state                     Done
        Creating fast lookup database            Done

    In just a few moments, IPS found the missing files and incorrect ownerships/permissions. Instead of reinstalling the system, or falling back to an earlier Live Upgrade boot environment, I was able to create my AI services and now all is well.

        # installadm create-service -n solaris11-x86 --imagepath /install/solaris11-x86 \
            -s [email protected]
        Warning: Service svc:/network/dns/multicast:default is not online.
           Installation services will not be advertised via multicast DNS.
        Creating service from: [email protected]
        DOWNLOAD      PKGS      FILES     XFER (MB)   SPEED
        Completed      1/1    130/130   264.4/264.4    0B/s
        PHASE                                   ITEMS
        Installing new actions                284/284
        Updating package state database          Done
        Updating image state                     Done
        Creating fast lookup database            Done
        Reading search index                     Done
        Updating search index                     1/1
        Creating i386 service: solaris11-x86
        Image path: /install/solaris11-x86
        Refreshing install services
        Warning: mDNS registry of service solaris11-x86 could not be verified.
        Creating default-i386 alias
        Setting the default PXE bootfile(s) in the local DHCP configuration to:
        bios clients (arch 00:00): default-i386/boot/grub/pxegrub
        Refreshing install services
        Warning: mDNS registry of service default-i386 could not be verified.

        # installadm create-service -n solaris11u1-x86 --imagepath /install/solaris11u1-x86 \
            -s [email protected]
        Warning: Service svc:/network/dns/multicast:default is not online.
           Installation services will not be advertised via multicast DNS.
        Creating service from: [email protected]
        DOWNLOAD      PKGS      FILES     XFER (MB)   SPEED
        Completed      1/1    514/514   292.3/292.3    0B/s
        PHASE                                   ITEMS
        Installing new actions                661/661
        Updating package state database          Done
        Updating image state                     Done
        Creating fast lookup database            Done
        Reading search index                     Done
        Updating search index                     1/1
        Creating i386 service: solaris11u1-x86
        Image path: /install/solaris11u1-x86
        Refreshing install services
        Warning: mDNS registry of service solaris11u1-x86 could not be verified.

        # installadm list
        Service Name      Alias Of        Status  Arch   Image Path
        ------------      --------        ------  ----   ----------
        default-i386      solaris11-x86   on      i386   /install/solaris11-x86
        solaris11-x86     -               on      i386   /install/solaris11-x86
        solaris11u1-x86   -               on      i386   /install/solaris11u1-x86

    This is way, way better than pkgchk -f in Solaris 10. I'm really beginning to like this new IPS packaging system.

    Read the article

  • The quest for efficiency as the Holy Grail of healthcare IT

    - by Eloy M. Rodríguez
    The XVIII Jornadas de Informática Sanitaria en Andalucía closed last Friday with 11,500 hours of collective intelligence. Although I suppose that figure comes from multiplying the hours of sessions and workshops by the number of registrants - which would not be entirely accurate, since I estimate average attendance was around ninety people - I suppose it reflects the overall total if we include the amount of informal interaction that the format and venue encourage. My subjective summary is that we are all aware that we must achieve more efficiency in, and thanks to, healthcare IT, and that to this end we identified some guidelines which the attendees, in our different roles, should apply and help spread. Along those lines, I think what stands out is the need to be very clear about where you are starting from and what you want to achieve, for which it is essential to measure, and for those measurements to feed back into the system so that it can meet its objectives. On that note, as an aside, I would like to mention a paradox about efficiency that was presented: given that the cost per day of hospitalization is higher at the beginning of a stay than in its final days, if you manage to be more efficient and reduce the average stay, you free up those final days, which will be used for new admissions, and the resulting increase in the number of first days of stay will raise the total cost. In this case we would improve the service to citizens but increase the cost, unless action were taken to resize hospital capacity, lowering the cost without compromising quality. Another prominent topic was the possibility, and the need, to take advantage of IT capabilities to make structural changes and move medicine from reactive to proactive, using alerts that make it easier to act before a serious problem occurs. Another topic discussed was the real need to make citizens genuinely co-responsible, thanks to the enormous low-cost possibilities that IT offers, embracing a move toward collaborative health that has many challenges ahead but also many more opportunities. The citizen's health folder, emerging in several projects and ideas, is a step in that direction. One topic that stirred passions was the Managing Director of Sergas complaining that IT projects were extremely slow. Unfortunately her schedule did not allow her to stay for the debate, which was fairly intense and in which issues such as the extremely long administrative process, changing specifications, custom designs, etc. came up as factors beyond the specific efficiency of the IT professionals involved in the projects. Finally, I want to mention a very interesting topic in line with what was said at the conference about the need to measure: the Índice SEIS (SEIS Index). The idea is to define a series of criteria, grouped into broad areas and with a fine-grained breakdown, to monitor the contribution of IT to improving health and healthcare. We were shown some preliminary versions, with the debate still open between two broad approaches: working from the big objectives down to the processes, or from the processes up to the objectives. The discussion is not merely academic, since it influences the parameters to be established. The good news is that the work is quite advanced, and soon the health services will have a benchmarking tool based on the national reality.

    For those interested, several of us attendees tweeted throughout the conference, so anyone who wants a bit more detail can go to Twitter, search for the hashtag #jisa18 and, reading from oldest to newest, follow subjective points of view on what happened there. I cannot help but offer a couple of self-criticisms, since I am a member of the SEIS. The first concerns the SEIS portal, which did not have the interactivity that a conference like this needed. It will soon start to carry documents and analysis of what took place, and later the more polished reports and analysis will appear in the I+S magazine, but in the second decade of the 21st century quite a bit more is needed. The other concerns the undesirably low presence of users of healthcare IT - in the roles of healthcare professionals and of citizens who use healthcare information systems. We have to be proactive so that they attend in significant numbers; otherwise we risk becoming healthcare-IT absolutists: everything for the users, but without the users.

    Read the article

  • Getit serves up local search in India with Java ME tech

    - by hinkmond
    Did you ever wonder where to get a good lamb vindaloo while visiting Mumbai? Well, then you need to get Getit. See: Getit gets it on Java ME. Here's a quote: Getit, the company which provides a local search facility and free classifieds services in India, has announced the official release of the Getit Local Search Mobile app for Indian users. The app can be downloaded from the Mobango app store, ... [and]... is available for all platforms like [blah-blah-blah], [yadda-yadda-yadda], Java, Blackberry, Symbian etc... Getit gets it because they ported to the Java ME platform, the most ubiquitous mobile platform out there, and because they know that when you want to find a good vindaloo, you want to find a good vindaloo! Hinkmond

    Read the article

  • SOA Checklist

    - by pat.shepherd
    In a recent meeting, the customer brought up a valid question: "How do I know if a problem/system is a good candidate for using SOA (vs. using old but trusted techniques)?" I put this checklist together. If you can answer yes to two or more of these, it might well be a good candidate. This is V1, and I will likely update it over time. Comments (that are not spam or sales pitches) are appreciated. Part of the conversation was also around the fact that SOA has two faces to it: one is the obvious reuse possibilities. The other, which often gets forgotten, is that SOA provides goodness in terms of simplifying integration even where the opportunities to reuse those integrations are small; at least the integrations are standards-based and more flexible. I did not write a lot of verbiage about each item; for example, "Business Process" implies that there is a set of step-wise actions that need to take place in a coordinated fashion and that include integrating with systems (and sometimes people, for approvals and other human-only actions) in the process.

    Read the article

  • Review of ComponentOne Silverlight Controls (Free License Giveaway).

    - by mbcrump
    ComponentOne has several great products that target Silverlight developers. One of them is their Silverlight Controls and the other is the XAP Optimizer. I decided to check out the controls and the XAP Optimizer and feature them on my blog. After talking with ComponentOne, they agreed to take part in my monthly Silverlight giveaway. The details are listed below:

    -----------------------------------------------------------------------------------------------------------------------------------------------------------
    Win a FREE developer's license of ComponentOne Silverlight Controls + XAP Optimizer! (The winner also gets a license to Silverlight Spy.) A random winner will be announced on March 1st, 2011! To be entered into the contest, do the following things:

      - Subscribe to my feed.
      - Leave a comment below with a valid email account (I WILL NOT share this info with anyone).
      - Retweet the following: I just entered to win free #Silverlight controls from @mbcrump and @ComponentOne http://mcrump.me/fTSmB8 ! Don't change the URL, because this allows me to track the users that tweet this page.
      - Don't forget to visit ComponentOne, because they made this possible.

    MichaelCrump.Net provides Silverlight giveaways every month. You can also see the latest giveaway by bookmarking http://giveaways.michaelcrump.net .
    -----------------------------------------------------------------------------------------------------------------------------------------------------------

    Before we get started with the Silverlight Controls, here are a couple of links to bookmark: the live demos of the Silverlight Controls are located here, and the XAP Optimizer page is here. One thing I liked about the help documentation is that you can grab a PDF that contains documentation for just one control. This lets you get the information you need without going through several hundred pages. You can also download the full documentation from their site.

    ComponentOne Silverlight Controls

    I recently built a hobby project and decided to use ComponentOne Silverlight Controls. The main reason is that the controls are heavily documented, they look great, and getting help was just a tweet or forum click away. So, the first question you may ask is, "What is included?" Here is the official list below. I wanted to show several of the controls that I think developers will use the most.

      1) ComponentOne's Image Control - Display animated GIF images on your Silverlight pages as you would in traditional web apps. Add attractive visuals with minimal effort.
      2) HTML Host - Render HTML and arbitrary URI content from within Silverlight.
      3) Chart3D - Create 3D surface charts with options for contour levels, zones, a chart legend and more.
      4) PDFViewer - View PDF files in Silverlight!

    That is just a fraction of the controls available. If you want to check out several of them in a "real" application, then check out my Silverlight page at http://michaelcrump.info. This brings me to the second part of the giveaway.

    XAP Optimizer - designed to reduce the size of your XAP file. It also includes built-in obfuscation and signing. For my personal project, I decided to use the XAP Optimizer by ComponentOne. It was very easy to use: you basically give it your .XAP file and it produces an output file. If you prefer to prune unused references yourself, you can prune your XAP file manually by selecting the option below. I went ahead and added obfuscation just to try it out, and it worked great.

    You may notice from the screenshot below that I only obfuscated the assemblies that I built. The other DLLs anyone can grab off the net, so we have no reason to obfuscate them. You also have the option to automatically sign your .xap with the SN.exe tool. So how did it turn out? Well, I reduced my XAP size from 2.4 to 1.8 with simply a click of a button, and I added obfuscation with another click:

        Screenshot of no obfuscation on my XAP file
        Screenshot of obfuscation on my XAP file with XAP Optimizer

    So, with two button clicks, I reduced my XAP file and obfuscated my assembly. What else could you want? Well, they also provide a nice HTML report that gives you an optimization summary. And what if you don't want to launch this tool every time you deploy a Silverlight application? The official documentation provides a way to do it in your build event in Visual Studio. Click the Build Events tab on the left side of the Properties window and enter the following command in the Post-build event command line:

        $Program Files\ComponentOne\XapOptimizer\XapOptimizer.exe /cmd /p:$(ProjectDir)$(ProjectName).xoproj

    In the end, this is a great product. I love code that I don't have to write and utilities that just work. ComponentOne delivers with both the Silverlight Controls and the XAP Optimizer. Don't forget to leave a comment below in order to win a set of the controls! Subscribe to my feed.

    Read the article

  • Is Master Data Management CRM's Secret Sauce?

    - by divya.malik
    This was the title of a recent blog entry by our colleagues in EMEA. Having a good master data management system enables organizations to get a unified, accurate and complete understanding of their customers. Gartner Group's John Radcliffe explains why MDM is destined to be at the heart of future CRM and social CRM projects. Experts are predicting big things for master data management (MDM) in the immediate future. While far from being a new kid on the block, its potential benefits at a time when organisations are drowning in data mean that it is in the right place at the right time. "MDM is not 'nice to have'," explains John Radcliffe, research vice president at Gartner. "If tackled in the right way it can provide near term business value that plays into an organisation's new focus on cost efficiencies, risk management and regulatory compliance, while supporting growth and future transformative strategies." The complete article can be found here.

    Read the article

  • Webcenter book review

    - by angelo.santagata
    Hi all, I just had the opportunity to read Peter Moskovits' WebCenter Handbook, and I must say that even as someone who has been involved with WebCenter for a couple of years now, I was pleasantly surprised by this book and still came away with some nuggets. Check out my review on amazon.com.

    Read the article

  • Class Loading Deadlocks

    - by tomas.nilsson
    Mattis follows up on his previous post with one more exposé on class loading deadlocks.

    As I wrote in a previous post, the class loading mechanism in Java is very powerful. There are many advanced techniques you can use, and when used wrongly you can get into all sorts of trouble. But one of the sneakiest deadlocks you can run into when it comes to class loading doesn't require any home made class loaders or anything. All you need is classes depending on each other, and some bad luck. First of all, here are some basic facts about class loading:

      1) If a thread needs to use a class that is not yet loaded, it will try to load that class
      2) If another thread is already loading the class, the first thread will wait for the other thread to finish the loading
      3) During the loading of a class, one thing that happens is that the <clinit> method of the class is run
      4) The <clinit> method initializes all static fields and runs any static blocks in the class

    Take the following class for example:

        class Foo {
            static Bar bar = new Bar();
            static {
                System.out.println("Loading Foo");
            }
        }

    The first time a thread needs to use the Foo class, the class will be initialized. The <clinit> method will run, creating a new Bar object and printing "Loading Foo". But what happens if the Bar object has never been used before either? Well, then we will need to load that class as well, calling the Bar <clinit> method as we go. Can you start to see the potential problem here? A hint is in fact #2 above. What if another thread is currently loading class Bar? The thread loading class Foo will have to wait for that thread to finish the loading. But what happens if the <clinit> method of class Bar tries to initialize a Foo object? That thread will have to wait for the first thread, and there we have the deadlock. Thread one is waiting for thread two to initialize class Bar, thread two is waiting for thread one to initialize class Foo. All that is needed for a class loading deadlock is static cross dependencies between two classes (and a multi-threaded environment):

        class Foo {
            static Bar b = new Bar();
        }

        class Bar {
            static Foo f = new Foo();
        }

    If two threads cause these classes to be loaded at exactly the same time, we will have a deadlock. So, how do you avoid this? Well, one way is of course to not have these circular (static) dependencies. On the other hand, it can be very hard to detect these, and sometimes your design may depend on it. What you can do in that case is to make sure that the classes are first loaded single-threadedly, for example during an initialization phase of your application (a short sketch of such an initialization phase follows after the example program below). The following program shows this kind of deadlock. To help bad luck on the way, I added a one second sleep in the static blocks of the classes to trigger the unlucky timing. Notice that if you uncomment the "// Foo f = new Foo();" line in the main method, the classes will be loaded single-threadedly, and the program will terminate as it should.

        public class ClassLoadingDeadlock {
            // Start two threads. The first will instantiate a Foo object,
            // the second one will instantiate a Bar object.
            public static void main(String[] arg) {
                // Uncomment next line to stop the deadlock
                // Foo f = new Foo();
                new Thread(new FooUser()).start();
                new Thread(new BarUser()).start();
            }
        }

        class FooUser implements Runnable {
            public void run() {
                System.out.println("FooUser causing class Foo to be loaded");
                Foo f = new Foo();
                System.out.println("FooUser done");
            }
        }

        class BarUser implements Runnable {
            public void run() {
                System.out.println("BarUser causing class Bar to be loaded");
                Bar b = new Bar();
                System.out.println("BarUser done");
            }
        }

        class Foo {
            static {
                // We are deadlock prone even without this sleep...
                // The sleep just makes us more deterministic
                try { Thread.sleep(1000); } catch (InterruptedException e) {}
            }
            static Bar b = new Bar();
        }

        class Bar {
            static {
                try { Thread.sleep(1000); } catch (InterruptedException e) {}
            }
            static Foo f = new Foo();
        }
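    The "initialization phase" suggestion above can be as simple as the following sketch, which forces both classes to load on a single thread before any worker threads are started (the class names match the example program; the initializer class itself is hypothetical, not from the original post):

        public class AppInitializer {
            // Force Foo and Bar to load (and run their <clinit>) on the main thread,
            // so no two worker threads can later race to load them.
            public static void preloadClasses() {
                try {
                    Class.forName("Foo");
                    Class.forName("Bar");
                } catch (ClassNotFoundException e) {
                    throw new IllegalStateException("Class preloading failed", e);
                }
            }

            public static void main(String[] args) {
                preloadClasses();
                // ...now it is safe to start the FooUser/BarUser threads.
            }
        }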

    Read the article

  • SQL SERVER – An Efficiency Tool to Compare and Synchronize SQL Server Databases

    - by Pinal Dave
    There is no need to reinvent the wheel if it is already invented, and if the wheel is readily available there is no need to wait to grab it. Here is a similar situation. I came across a very interesting scenario and had to look for an efficient tool that could make my life easier and solve my business problem. Here is the scenario. One of the developers had deleted a few rows from a very important mapping table on our development server (thankfully, it was not the production server). Though it was a development server, the entire development team had to stop working as the application started to crash on every page. Think about the loss of manpower and efficiency we started to incur. Pretty much every department had to stop working as our internal development application stopped working. Thankfully, we do take backups of our development server, and we had access to a full backup of the entire database from 6 AM that morning. We do not back up the development server as frequently as the production server (naturally!). Even though we had a full backup, the solution was not to restore the database. Think about it: there had been plenty of other operations since the last good full backup, and if we restored it we would pretty much overwrite the work done by developers since the morning. As restoring the full backup was not an option, we decided to restore the same database on another server. Once we had restored our database to another server, the challenge was to compare the table from which the data had been deleted. That mapping table contained over 5000 rows, and it was humanly impossible to compare both tables manually. Finally we decided to use the efficiency tool dbForge Data Compare for SQL Server from DevArt. dbForge Data Compare for SQL Server is a powerful, fast and easy to use SQL compare tool, capable of using native SQL Server backups as a metadata source. (FYI, we downloaded dbForge Data Compare.) Once we discovered the product, we immediately downloaded and installed it on our development server. After we installed the product, we were greeted with the following screen. We clicked on New Data Comparison to start our new comparison project, which brought up the following screen. Here is the best part of the product: we just had to enter our database connection username and password along with the source and destination details, and we were done. The entire process is very simple and intuitive. The best part was that for the source we could select either a database or a backup. This was indeed a fantastic feature. Think about it: if you have a very big database, it will take a long time to restore on a server, and only once it is restored can you work with it. However, dbForge Data Compare will accept a database backup as your source or destination. Once I clicked Execute, it brought up the following screen displaying an excellent summary of the data comparison. It has dedicated tabs for what is changing in which table, as well as details of the changed data. The best part is that once we had reviewed the changes, we clicked the Synchronize button in the menu bar, which brought up the following screen. You can see that the screen has very simple, straightforward, but very powerful features. You can generate a script to synchronize from target to source, or even from source to target.

    Additionally, the database world is very complicated, and there are extensive options to configure various database settings on the next screen. We also have the option to either generate the script or execute it directly against the target server. I like to play on the safe side, so I generated the script for my synchronization and, after review, deployed it on the server. Well, my team and I were able to recover from our disaster in less than 10 minutes. A few people on our team were indeed disappointed, as they had been thinking of going home early that day, but in less than 10 minutes they had to get back to work. There are so many other features in dbForge Data Compare for SQL Server that I am already planning to make this the company-wide recommended data compare tool. Hats off to the team who built this product. Here are a few of the salient features of dbForge Data Compare for SQL Server:

      - Perform SQL Server database comparison to detect changes
      - Compare SQL Server backups with live databases
      - Analyze data differences between two databases
      - Synchronize two databases that went out of sync
      - Restore data of a particular table from the backup
      - Generate data comparison reports in Excel and HTML formats
      - Copy look-up data from development database to production
      - Automate routine data synchronization tasks with the command-line interface

    Go ahead and download dbForge Data Compare for SQL Server right away. It is always a good idea to get familiar with important tools beforehand instead of learning them under the pressure of a disaster. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
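    As a purely conceptual illustration of what such a comparison boils down to (not how dbForge works internally), here is a hedged JDBC sketch that lists rows present in a restored copy of a table but missing from the damaged one; the server name, credentials, database names, table name and key column are all hypothetical:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class MissingRowFinder {
            public static void main(String[] args) throws Exception {
                // Hypothetical connection: the restored backup and the damaged
                // development database live on the same SQL Server instance here.
                String url = "jdbc:sqlserver://devserver;user=sa;password=secret";
                try (Connection con = DriverManager.getConnection(url);
                     Statement stmt = con.createStatement()) {
                    // EXCEPT returns keys that exist in the restored copy
                    // but no longer exist in the damaged mapping table.
                    String sql =
                        "SELECT MappingId FROM RestoredDb.dbo.MappingTable " +
                        "EXCEPT " +
                        "SELECT MappingId FROM DevDb.dbo.MappingTable";
                    try (ResultSet rs = stmt.executeQuery(sql)) {
                        while (rs.next()) {
                            System.out.println("Deleted row key: " + rs.getLong("MappingId"));
                        }
                    }
                }
            }
        }

    A dedicated compare tool does this kind of work across every column and also generates the synchronization script for you, which is what made it so much faster than a manual comparison.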

    Read the article

  • What Error Messages Reveal

    - by ultan o'broin
    I love this blog entry, Usability Doesn't Mean UI - especially the part: "Ask for a list of all error messages when you do your next vendor evaluation. You will learn more about the vendor's commitment to usability and product quality than you will fathom from a slick demo." Not so sure about the part about error messages not being "hip" or "glamorous", though. I know... I should get out more... :)

    Read the article

  • BIP Debugging to a file

    - by Tim Dexter
    If you use the standalone server, or BIP with OBIEE, and use OC4J as the web server, have you ever taken a look at the console window (DOS box or xterm) that you use to start it? Ever turned on debugging, watched masses of info flow by that window, and wanted to capture it all? I have been debugging today and watched all that info fly by; on Windoze it gets lost before you can see it! The BIP developers use the System.out.println() and System.err.println() methods in the BIP applications to generate debugging information. Normally the output from these method calls goes to the console where the OC4J process is started. However, you can specify command line options when starting OC4J to direct the stdout and stderr output directly to files. The -out and -err parameters tell OC4J which file to direct the output to. All you need do is modify the oc4j.cmd file used to start BIP. I didn't get fancy, and just plugged the following into the file under the start section. I modified the line:

        set CMDARGS=-config "%SERVER_XML%" -userThreads

    to

        set CMDARGS=-config "%SERVER_XML%" -out D:\BI\OracleBI\oc4j_bi\j2ee\home\log\oc4j.out -err D:\BI\OracleBI\oc4j_bi\j2ee\home\log\oc4j.err -userThreads

    Bounced the server, and I now have a ballooning pair of debug files that I can pore over to my heart's content. The .out file appears to contain BIP-only log info and the .err file, OBIEE messages. If you are using another web server to host BIP, just check the user docs to find out how to get the log files written. Note to self: remember to turn off the debug when I'm done!

    Read the article

  • Iterative Conversion

    - by stuart ramage
    Question received: I am toying with the idea of migrating the current information first and the remainder of the history at a later date. I have heard that the conversion tool copes with this, but I haven't found any information on how it does.

    Answer: The Toolkit will support iterative conversions as long as the original master data key tables (the CK_* tables) are not cleared down from Staging (the already converted transactional data would need to be cleared down) and the Production instance being migrated into is actually Production (we have migrated into a pre-prod instance in the past and then unloaded this and loaded it into the real PROD instance, but this will not work for your situation - you need to be migrating directly into your intended environment). In this case the migration tool will still know all about the original keys and the generated keys for the primary objects (Account, SA, etc.), and as such it will be able to link the data converted as part of a second pass onto these entities. It should be noted that this may result in the original opening balances potentially being displayed with an incorrect value (if we are talking about financial transactions), and also that care will have to be taken to ensure that all related objects are aligned (e.g. a Bill must have a set of bill segments, meter reads and financial transactions, and these entities cannot exist independently). It should also be noted that subsequent runs of the conversion tool would need to be 'trimmed' to ensure that they only do work on the objects affected. You would not want to revalidate and migrate all Person, Account, SA, SA/SP, SP and Premise details, since this information has already been processed, but you would definitely want to run the affected transactional record validation and keygen processes. There is no real "hard-and-fast" rule around this processing since it is specific to each implementation's needs, but the majority of the effort required should be detailed in the Conversion Tool section of the online help (under Administration / The Conversion Tool). The major rule is to ensure that you only run the steps and validation/keygen steps that you need and do not do a complete rerun for your subsequent conversion.

    Read the article

  • Download SQL Server 2008 R2 Express (Database Size Limit Increased to 10GB! )

    - by Aamir Hasan
    Yesterday I was researching SQL Server 2008 and found the new release of MS SQL Server 2008 R2, which has many new BI features and enhancements. There is a tiny, cute feature that I am sure all of us will appreciate a lot: the product team has increased the database size limit for SQL Server 2008 R2 Express from 4 GB to 10 GB. So if you have a growing SQL Server Express database that is close to the 4 GB limit, hurry and upgrade to R2 Express. See the announcement from the product team. SQL Server 2008 R2 Express download.

    Read the article

  • ADF Bounded Taskflow Activation

    - by Vijay Mohan
    Hey guys, it's really been a while since I last blogged. I just came across a hard-to-debug scenario, so I thought of sharing it for the benefit of ADF developers. I had a page fragment (jsff) wrapped inside a bounded task flow, for which the activation was conditional and based on a requestScope property (be it a requestScope variable or a property coming from a requestScope bean). As soon as the task flow activates and the page renders, the life span of the requestScope parameters ends. After that, when you raise an event inside the page (a commandLink click, mouseHover, valueChange event, etc.), the event gets fired the first time but fails to affect the change in the page; moreover, for subsequent times the event itself doesn't get fired. Any guesses as to what could be the culprit..? I guess I already gave the reason in the initial paragraph: the first time the event gets fired, the framework sees that the page is already lying in an inactive state, so it fails to affect the change, and for subsequent times it doesn't even fire the event because it already knows that the page/region is inactive. So, in such a scenario we must use either a pageFlowScope property or a transient VO property that can live as long as the page does (a minimal sketch follows below).
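    A minimal, purely illustrative sketch of the pageFlowScope approach (the bean name, property and the EL shown in the comment are hypothetical, not taken from the original post):

        import java.io.Serializable;

        // Registered as a pageFlowScope managed bean (e.g. "regionStateBean") so the
        // flag outlives a single request, unlike a requestScope property.
        public class RegionStateBean implements Serializable {

            private boolean regionActive = false;

            public boolean isRegionActive() {
                return regionActive;
            }

            public void setRegionActive(boolean regionActive) {
                this.regionActive = regionActive;
            }

            // The task flow binding's activation condition could then reference
            // #{pageFlowScope.regionStateBean.regionActive}, so the activation state
            // survives for the page's whole life span instead of ending with the request.
        }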

    Read the article

  • Does Scrum turn active developers into passive developers?

    - by Saeed Neamati
    I'm a web developer working in a team of three developers and one designer. It's now been about five months since we implemented the agile Scrum software development methodology, but I have a weird feeling I just wanted to share on this site. One important factor in human life is the decision-making process; however, there is a big difference in the decisions you make. Some decisions are just the outcome of an internal or external force, other decisions are completely based on your free will, and some decisions are simply something in between. The more freedom you have in making decisions, the more self-driven your work becomes. This seems to be a rule, because we tend to shape our lives ourselves. There is a big difference between deciding what to do and being told what to do. Before Scrum, I felt I had more freedom in making the decisions related to development, analysis, prioritizing implementation, etc. I felt more like I was deciding what to do. However, due to the Scrum methodology, many decisions now simply come from the product owner. He prioritizes PBIs, he analyzes how the software should work, and sometimes even how the UI and functionality should be implemented. I know this is part of the Scrum methodology, and I also know it may result in better sales of the product in the future. However, I now feel like I'm always being told what to do, instead of deciding to do something. This syndrome has made me more passive towards the work:

      - I tend to search less for a better solution, approach, or technique
      - I don't wake up in the morning expecting to get to enjoyable work; rather, I feel like I'm being forced to work in order to live
      - I have more hunger to work on my own hobby projects after work
      - I won't push the team anymore to reach higher technological levels
      - I spend more time now on dinner or tea breaks and have less enthusiasm to get back to work
      - I'm now more eager for the work to finish sooner, so that I can get home

    The big problem is that I see and diagnose this behavior in my colleagues too. Is it the outcome of Scrum? Does Scrum really make the development team feel like they have no part in shaping the overall software, thus making them passive toward the project? How can I overcome this feeling?

    Read the article

  • Gamify your Web

    - by Isabel F. Peñuelas
    Yesterday Valencia welcomed the Gamification World Congress, which I followed virtually through #GWC2012. BBVA, Iberia, Ligeresa, Axe, Wayra, ESADE, GlaxoSmithKline, Macmillan, Gamisfaction, Nomaders and Blaffin were among the companies presenting success stories on gaming. It has been proved that people remember things more easily when an emotion is created, and the marketing expectations around gamification techniques have a lot to do with neuromarketing theories. There are a lot of expectations around internal enterprise gamification, and on the public Web some sectors are taking the lead in following the trend. The Gartner analyst Brian Burke opened another recent gamification event in Madrid by reminding us that "Gamification is mostly about Engagement", and this can be applied both to customers and to employees.

    Gamification and Banking

    The experience of the Spanish financial group BBVA, which just launched BBVA Game, was also presented a week ago at the BBVA Innovation Centre during the event "Gamification & Banking: a fad or a serious business?". One of the objectives of BBVA Game was to double the number of registered users. "People like the efficiency of the online channel want to keep a one-to-one contact with the brand," explained Bernardo Crespo. Another interesting data point coming out of the BBVA presentation was that "only 20% of Spanish users - out of the total holders of bank accounts in the country - are familiar with the use of a web site to consult their bank accounts"; the project also aims to reverse this situation by helping people learn, making heavy use of video in the gaming context. In general, banking presenters seem to agree that gamification techniques are helping to increase the time spent on the Web.

    Gamification and Health

    Using gamification techniques for chronic illness rehabilitation was another topic of the World Congress. Here you can find some ideas and experiences: What can games do for health? (in Spanish). I have personally started my own mental-health gaming project at http://www.lumosity.com/

    Gamification in the Enterprise

    I really recommend reading this excellent post by Ultan ÓBroin: Introduction to Gamification and Applications. Employee motivation and learning are experiencing a 360º turn, and it looks like some of us will soon become the Dragon of the Year instead of the Employee of the Year.

    Using Web 2.0 Tools for Gamification Projects

    What type of tools do we need for a quick-win gamification project? To a certain extent, gamification can be considered an evolution of the participative Web. Badging, avatars, points and awards, leader boards, progress charts, virtual currencies, gifting, and giving challenges and quests are common components and elements. The Web is offering new development frameworks for that purpose, such as this Avatar Framework from PayPal or Badgeville, to include in web applications. Besides that, tools to create communities around a game are required for commenting, sharing and voting on players, as well as for efficient multimedia management. Due to its entirely open architecture, its community features, and its multimedia and imaging solutions, I see WebCenter as a tool helping brands succeed.

    Links to Sources & Recommended Readings

      - YouTube video of the BBVA Game presentation
      - Where To Apply Gamification In Your Incentive
      - Jim Calhoun Cancer Challenge Ride and Walk
      - For my Spanish readers: El aburrimiento es el enemigo número uno del éxito (boredom is the number one enemy of success)

    Read the article

  • Java Developer Days India Trip Report

    - by reza_rahman
    October 21st through October 25th I spoke at Java Developer Days India. This was three separate but identical one-day events in the cities of Pune (October 21st), Chennai (October 24th) and Bangalore (October 25th). For those with some familiarity with India: other than Hyderabad, these cities are India's IT powerhouses. The events were focused on Java EE. I delivered five sessions, on Java EE 7, WebSocket, JAX-RS 2, JMS 2 and EclipseLink/NoSQL. The events went extremely well and were packed in all three cities. More details on the sessions and Java Developer Days India, including the slide decks, are posted on my personal blog.

    Read the article

  • FREE eBook: .NET Performance Testing and Optimization (Part 1)

    In this first part of a complete guide to performance profiling, Paul Glavich and Chris Farrell explain why performance testing is a good idea and walk you through everything you need to know to set up a test environment. This comprehensive guide to getting started is an essential handbook for any programmer looking to set up a .NET testing environment and get the best results out of it. Download your free copy now.

    Read the article
