Search Results

Search found 41035 results on 1642 pages for 'object oriented design'.


  • Ubuntu 12.04 // Likewise Open // Unable to ever authenticate AD users

    - by Rob
    So: Ubuntu 12.04, with the latest Likewise build from the BeyondTrust website. It joins the domain fine, gets proper information from lw-get-status, can use lw-find-user-by-name to retrieve/locate users, and can use lw-enum-users to get all users. Attempting to log in with an AD user via SSH generates the following errors in the auth.log file:
        Nov 28 19:15:45 hostname sshd[2745]: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:15:45 hostname sshd[2745]: PAM adding faulty module: pam_winbind.so
        Nov 28 19:15:51 hostname sshd[2745]: error: PAM: Authentication service cannot retrieve authentication info for DOMAIN\\user.name from remote.hostname
        Nov 28 19:16:06 hostname sshd[2745]: Connection closed by 10.1.1.84 [preauth]
    Attempting to log in via LightDM itself generates similar errors in the auth.log file:
        Nov 28 19:19:29 hostname lightdm: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:19:29 hostname lightdm: PAM adding faulty module: pam_winbind.so
        Nov 28 19:19:47 hostname lightdm: pam_succeed_if(lightdm:auth): requirement "user ingroup nopasswdlogin" not met by user "DOMAIN\user.name"
        Nov 28 19:19:52 hostname lightdm: [lsass-pam] [module:pam_lsass]pam_sm_authenticate error [login:DOMAIN\user.name][error code:40022]
        Nov 28 19:19:54 hostname lightdm: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:19:54 hostname lightdm: PAM adding faulty module: pam_winbind.so
    Attempting to log in via a console on the system itself generates slightly different errors:
        Nov 28 19:31:09 hostname login[997]: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:31:09 hostname login[997]: PAM adding faulty module: pam_winbind.so
        Nov 28 19:31:11 hostname login[997]: [lsass-pam] [module:pam_lsass]pam_sm_authenticate error [login:DOMAIN\user.name][error code:40022]
        Nov 28 19:31:14 hostname login[997]: FAILED LOGIN (1) on '/dev/tty2' FOR 'DOMAIN\user.name', Authentication service cannot retrieve authentication info
        Nov 28 19:31:31 hostname login[997]: FAILED LOGIN (2) on '/dev/tty2' FOR 'DOMAIN\user.name', Authentication service cannot retrieve authentication info
    I am baffled. The errors are obviously correct: the file /lib/security/pam_winbind.so does not exist. If it's a dependency/requirement, surely it should be part of the package? I've installed/reinstalled, I've used the downloaded package from the BeyondTrust website, I've used the repository, and nothing seems to work; every method of installing this application generates the same errors for me.
    UPDATE: Hrmm, I thought Likewise didn't use native winbind but its own modules. Installing winbind from apt-get uninstalls pbis-open (Likewise) and generates failures when installing if pbis-open is installed first. Uninstalled winbind, reinstalled pbis-open, same issue as above. The file pam_winbind.so does not exist in that location.
        Setting up pbis-open-legacy (7.0.1.918) ...
        Installing Packages was successful
        This computer is joined to DOMAIN.LOCAL
        New libraries and configurations have been installed for PAM and NSS.
    Clearly it thinks it has installed it, but it hasn't. It may be a legacy issue with the previous attempt to configure domain integration manually with winbind.
    Does anyone have a working likewise-open installation, and does the /etc/nsswitch.conf include references to winbind? Or do the /etc/pam.d/common-account or /etc/pam.d/common-password files reference pam_winbind.so? I'm unsure whether those entries are just legacy or are set up by Likewise.
    UPDATE 2: A complete reinstall of the OS fixed it and it worked seamlessly, like it was meant to, and those two PAM files did NOT include entries for pam_winbind.so, so that was the underlying problem. Thanks for the assist.

    Read the article

  • Podcast Show Notes: SOA Made Simple

    - by Bob Rhubart
    My guests for the latest OTN ArchBeat Podcast are Lonneke Dikmans and Ronald van Luttikhuizen, managing partners at Vennster (http://www.vennster.nl/) an  IT consultancy based in the Netherlands. Lonneke and Ronald are Oracle ACE Directors, very active members of the OTN architect community, and they have participated as panelists in previous ArchBeat podcasts. But given their collaboration on an upcoming book on service oriented architecture, I thought it was time to let them have the program to themselves. Listen to Part 1 Listen to Part 2 (Nov 30) Listen to Part 3 (Dec 7) Get Connected Lonneke and Ronald are very active in social media. Strike up your own conversation with them via the following links: Lonneke Dikmans Ronald van Luttikhuizen Coming Soon  A panel discussion with three members of the product team behind the upcoming release of WebLogic Server 12c. Stay tuned: RSS

    Read the article

  • Simple task framework - building software from reusable pieces

    - by RuslanD
    I'm writing a web service with several APIs, and they will be sharing some of the implementation code. In order not to copy-paste, I would like to ideally implement each API call as a series of tasks, which are executed in a sequence determined by the business logic. One obvious question is whether that's the best strategy for code reuse, or whether I can look at it in a different way. But assuming I want to go with tasks, several issues arise: What's a good task interface to use? How do I pass data computed in one task to another task in the sequence that might need it? In the past, I've worked with task interfaces like: interface Task<T, U> { U execute(T input); } Then I also had sort of a "task context" object which had getters and setters for any kind of data my tasks needed to produce or consume, and it gets passed to all tasks. I'm aware that this suffers from a host of problems. So I wanted to figure out a better way to implement it this time around. My current idea is to have a TaskContext object which is a type-safe heterogeneous container (as described in Effective Java). Each task can ask for an item from this container (task input), or add an item to the container (task output). That way, tasks don't need to know about each other directly, and I don't have to write a class with dozens of methods for each data item. There are, however, several drawbacks: Each item in this TaskContext container should be a complex type that wraps around the actual item data. If task A uses a String for some purpose, and task B uses a String for something entirely different, then just storing a mapping between String.class and some object doesn't work for both tasks. The other reason is that I can't use that kind of container for generic collections directly, so they need to be wrapped in another object. This means that, based on how many tasks I define, I would need to also define a number of classes for the task items that may be consumed or produced, which may lead to code bloat and duplication. For instance, if a task takes some Long value as input and produces another Long value as output, I would have to have two classes that simply wrap around a Long, which IMO can spiral out of control pretty quickly as the codebase evolves. I briefly looked at workflow engine libraries, but they kind of seem like a heavy hammer for this particular nail. How would you go about writing a simple task framework with the following requirements: Tasks should be as self-contained as possible, so they can be composed in different ways to create different workflows. That being said, some tasks may perform expensive computations that are prerequisites for other tasks. We want to have a way of storing the results of intermediate computations done by tasks so that other tasks can use those results for free. The task framework should be light, i.e. growing the code doesn't involve introducing many new types just to plug into the framework.
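
    For illustration, here is a minimal Java sketch of the type-safe heterogeneous container idea with qualified keys, which addresses the "two tasks both use a String" problem without defining one wrapper class per data item. The TaskKey, TaskContext and Task names are hypothetical, not from any particular framework; this is a sketch under those assumptions, not a definitive design.

        import java.util.HashMap;
        import java.util.Map;
        import java.util.Objects;

        // A key is a (name, type) pair, so two tasks can both store Strings under different names.
        final class TaskKey<T> {
            final String name;
            final Class<T> type;
            TaskKey(String name, Class<T> type) { this.name = name; this.type = type; }
            @Override public boolean equals(Object o) {
                return o instanceof TaskKey
                        && name.equals(((TaskKey<?>) o).name)
                        && type.equals(((TaskKey<?>) o).type);
            }
            @Override public int hashCode() { return Objects.hash(name, type); }
        }

        // Type-safe heterogeneous container shared by all tasks in a workflow.
        final class TaskContext {
            private final Map<TaskKey<?>, Object> items = new HashMap<>();
            <T> void put(TaskKey<T> key, T value) { items.put(key, key.type.cast(value)); }
            <T> T get(TaskKey<T> key) { return key.type.cast(items.get(key)); }
        }

        // Tasks read their inputs from and write their outputs to the shared context.
        interface Task {
            void execute(TaskContext context);
        }

        class ComputeTotalTask implements Task {
            static final TaskKey<Long> ORDER_ID = new TaskKey<>("orderId", Long.class);    // input
            static final TaskKey<Long> TOTAL    = new TaskKey<>("orderTotal", Long.class); // output
            @Override public void execute(TaskContext context) {
                long orderId = context.get(ORDER_ID);
                context.put(TOTAL, orderId * 10); // placeholder computation
            }
        }

    A workflow is then just an ordered list of Task instances executed against one TaskContext, and keeping the key constants next to the tasks that produce them documents each task's contract without introducing a new type per value.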

    Read the article

  • FMW Workshop Program, May and June 2010

    - by [email protected]
    FMW WORKSHOP PROGRAM, May and June 2010
    Enterprise 2.0
        WORKSHOP | DATE | LOCATION
        Enterprise 2.0 and Enterprise Social Networks (Webcenter Spaces) | 03/05/10 | Madrid
        Digitization (IP/M) | 10/05/10 | Madrid
        Document Management and Records Management (UCM/URM) | 11/05/10 | Barcelona
        Web Content and Portal Management (UCM + WC Suite) | 25/05/10 | Barcelona
        Document Management and Records Management (UCM/URM) | 19/05/10 | Madrid
        Web Content and Portal Management (UCM + WC Suite) | 31/05/10 | Madrid
    Service Oriented Architecture (SOA)
        WORKSHOP | DATE | LOCATION
        Building Business Models with BPEL 11g | 13/05/10 | Madrid
        Business Process Automation with Oracle BPM | 20/05/10 | Madrid
        Oracle WebLogic | 27/05/10 | Madrid
        SOA Lifecycle Management on an Enterprise Repository | 11/05/10 | Madrid
        Developing High-Performance Applications with Oracle Coherence | 18/05/10 | Madrid
        Data Integration Platform (ODI) | 25/05/10 | Madrid
        Business Activity Monitoring 11g (BAM) | 13/05/10 | Barcelona
    To register:

    Read the article

  • SharePoint 2010 Hosting :: Setting Default Column Values on a Folder Programmatically

    - by mbridge
    The reason I write this post today is because my initial searches on the Internet provided me with nothing on the topic. I was hoping to find a reference to the SDK, but I didn't have any luck. What I want to do is set a default column value on an existing folder so that new items in that folder automatically inherit that value. It's actually pretty easy to do once you know what the class is called in the API. I did some digging and discovered that the class is MetadataDefaults. It can be found in Microsoft.Office.DocumentManagement.dll. Note: if you can't find it in the GAC, this DLL is in the 14/CONFIG/BIN folder and not the 14/ISAPI folder. Add a reference to this DLL in your project. In my case, I am building a console application, but you might put this in an event receiver or workflow.
    In my example today, I have simple custom folder and document content types. I have one shared site column called DocumentType. I have a document library with each of these content types registered. In my document library, I have a folder named Test and I want to set its default column values using code. Here is what it looks like. Start by getting a reference to the list in question. This assumes you already have an SPWeb object; in my case I have created it and it is called site.
        SPList customDocumentLibrary = site.Lists["CustomDocuments"];
    You then pass the SPList object to the MetadataDefaults constructor.
        MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);
    Now I just need to get my SPFolder object in question and pass it to the method SetFieldDefault. This takes an SPFolder object, a string with the name of the SPField to set the default on, and finally the value of the default (in my case "Memo").
        SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];
        columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");
    You can set multiple defaults here. When you're done, you will need to call .Update().
        columnDefaults.Update();
    Here is what it all looks like together.
        using (SPSite siteCollection = new SPSite("http://sp2010/sites/ECMSource"))
        {
            using (SPWeb site = siteCollection.OpenWeb())
            {
                SPList customDocumentLibrary = site.Lists["CustomDocuments"];
                MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);
                SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];
                columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");
                columnDefaults.Update();
            }
        }
    You can verify that your property was set correctly on the Change Default Column Values page in your list. This is something that I could see used a lot in an ItemEventReceiver attached to a folder to do metadata inheritance. Whenever the user changes the value of the folder's property, you could have it update the default. Your code might look something like:
        columnDefaults.SetFieldDefault(properties.ListItem.Folder, "MyField", properties.ListItem["MyField"]);
    This is a great way to keep the child items updated any time the value of a folder's property changes. I'm also wondering if this can be done via CAML. I tried saving a site template, but after importing I got an error on the default values page. I'll keep looking and let you know what I find out.

    Read the article

  • How to mount a disk that supports Samba sharing (Using Disk Utility)

    - by Luis Alvarado - The Wolverine
    This might be a tricky question, but here is the objective: manage to mount a disk/partition automatically while avoiding (or at least trying to avoid):
    - Editing any Samba configuration file
    - Editing the fstab file
    And, to make it a little bit harder, this needs to be done with the "Mount Options" settings in the Disk Utility. Note that if left as is, every time a user mounts a partition/disk and then tries to share a folder on it, Windows users can see the share but cannot access it; a permission warning appears. The point of all of this is to find the most user-friendly (GUI-oriented) way of enabling a partition to be mounted, accessed by the local user (read, write, execute), and also, when needed, to share a folder and have no problems reading/writing it from another Ubuntu/Windows/Mac remote computer (assuming both are on the same LAN).

    Read the article

  • Two Free Training Webcasts Open for Registration

    - by KKline
    We've got two sessions that you need to sign up for right away. The upcoming webcast for Oracle-oriented folks has huge registration numbers. So get in while you still can before we hit the limit of what LiveMeeting can handle.
    Pain of the Week: SQL Server for the Oracle DBA
        Webcast: SQL Server for the Oracle DBA
        Date: Thursday, May 27, 2010 (just a couple days hence!)
        Time: 8 a.m. Pacific / 11 a.m. Eastern / 4 p.m. United Kingdom / 5 p.m. Central Europe
        Duration: 45-60 minutes
        Cost: FREE
    In enterprise...(read more)

    Read the article

  • [EF + Oracle] Entities

    - by JTorrecilla
    Prologue
    Following the series I started yesterday about Entity Framework with Oracle, today I am going to start talking about Entities.
    What is an Entity?
    An Entity is an object of the EF model corresponding to a record in a DB table. For example, in Image 1 we can see one Entity from our model, and in Image 2 we can see the mapping done with the DB. (Image 1) (Image 2)
    More in depth, an Entity is a class inherited from the abstract class "EntityObject", contained in the "System.Data.Objects.DataClasses" namespace. At the same time, this class inherits from the following class and interfaces:
    StructuralObject: an abstract class that inherits from the INotifyPropertyChanging and INotifyPropertyChanged interfaces; it exposes the events that manage the changes of the class and the functions related to checking the data types of the properties of our Entity.
    IEntityWithKey: interface which exposes the key of the entity.
    IEntityWithChangeTracker: interface which lets us indicate the state of the entity (Detached, Modified, Added…).
    IEntityWithRelationships: interface which indicates the relationships of the entity.
    What is the content of an Entity?
    An Entity is composed of properties, navigation properties and methods.
    What is a Property?
    An Entity property is an object that represents a column of the mapped DB table. It has a data type equivalent in the .NET Framework to the DB type. When we create the EF model, VS internally creates the code for each Entity selected in the Tables step, including all the methods that we will see in the next steps. For each property, VS creates a structure similar to:
    · A private variable with the mapped data type.
    · A function named On{Property_Name}Changing({dataType} value): it manages the event raised when we try to change the value.
    · A function named On{Property_Name}Changed: it manages the event raised when the property has changed successfully.
    · A property with Get and Set methods. The Set method manages the private variable and does the following steps: raise the Changing event; report that the Entity is changing; set the private variable (for this it uses the SetValidValue function of StructuralObject; there is a function for each data type, and each function takes two params: the value, and whether the property allows nulls); report that the entity has been successfully changed; raise the Changed event of the property.
    The ReportPropertyChanging and ReportPropertyChanged events let us indicate, respectively, that there are pending changes in the Entity and that the changes have completed correctly. When ReportPropertyChanged is raised, the tracking state of the Entity is changed.
    What is a Navigation Property?
    Navigation properties are properties of the type EntityCollection<TEntity>, where TEntity is an Entity type from the model related to the current one; that is, a set of records from a related table in the DB. The EntityCollection class inherits from:
    · RelatedEnd: an abstract class that provides the functions needed to obtain the related objects.
    · ICollection<TEntity>
    · IEnumerable<TEntity>
    · IEnumerable
    · IListSource
    For the previous interfaces, I would recommend the following post from Jose Miguel Torres. Navigation properties allow us to easily get and query objects related to the Entity.
    Methods?
    There is only one method in the Entity object: "Create{Entity}", which allows us to create an object of the Entity by sending the parameters needed to create it.
    Finally
    After this chapter, we know what an Entity is, how it is related to the DB, and its relation to other Entities. In the following chapters, we will see CRUD operations (Create, Read, Update, Delete).

    Read the article

  • Java2Days 2012 Trip Report

    - by reza_rahman
    Java2Days 2012 was held in beautiful Sofia, Bulgaria on October 25-26. For those of you not familiar with it, this is the third installment of the premier Java conference for the Balkan region. It is an excellent effort by admirable husband and wife team Emo Abadjiev and Iva Abadjieva as well as the rest of the Java2Days team including Yoana Ivanova and Nadia Kostova. Thanks to their hard work, the conference continues to grow vigorously with almost a thousand enthusiastic, bright young people attending this year and no less than three tracks on Java, the Cloud and Mobile. The conference is a true gem in this region of the world and I am very proud to have been a part of it again, along with the other world class speakers the event rightfully attracts. It was my honor to present the first talk of the conference. It was a full-house session on Java EE 7 and 8 titled "JavaEE.Next(): Java EE 7, 8, and Beyond". The talk was primarily along the same lines as Arun Gupta's JavaOne 2012 technical keynote. I covered the changes in JMS 2, the Java API for WebSocket (JSR 356), the Java API for JSON Processing (JSON-P), JAX-RS 2, JCache, JPA 2.1, JTA 1.2, JSF 2.2, Java Batch, Bean Validation 1.1 and the rest of the APIs in Java EE 7. I also briefly talked about the possible contents of Java EE 8. My stretch goal was to gather some feedback on some open issues in the Java EE EG (more on that soon) but I ran out of time in the short format forty-five minute session. The talk was received well and I had some pretty good discussions afterwards. The slides for the talk are here: JavaEE.Next(): Java EE 7, 8, and Beyond from reza_rahman To my delight, the Java2Days folks were very interested in my domain-driven design/Java EE 6 talk (titled "Domain Driven Design with Java EE 6"). I've had this talk in my inventory for a long time now but it always gets overridden by less theoretical talks on APIs, tools, etc. The talk has three parts -- a brief overview of DDD theory, mapping DDD to Java EE and actual running DDD code in Java EE 6/GlassFish. For the demo, I converted the well-known DDD sample application (http://dddsample.sourceforge.net/) written mostly in Spring 2 and Hibernate 2 to Java EE 6. My eventual plan is to make the code available via a top level java.net project. Even despite the broad topic and time constraints, the talk went very well. It was a full house, the Q & A was excellent and one of the other speakers even told me they thought this was the best talk of the conference! The slides for the talk are here: Domain Driven Design with Java EE 6 from Reza Rahman The code examples are available here: https://blogs.oracle.com/reza/resource/dddsample.zip for now, as a simple zip file. Give me a shout if you would like to get it up and running. It was also a great honor to present the last session of the conference. It was a talk on the Java API for WebSocket/JSR 356 titled "Building HTML5/WebSocket Applications with JSR 356 and GlassFish". The talk is based on Danny Coward's JavaOne 2012 talk. The talk covers the basic of WebSocket, the JSR 356 API and a simple demo using Tyrus/GlassFish. The talk went very well and there were some very good questions afterwards. The slides for the talk are here: Building HTML5/WebSocket Applications with GlassFish and JSR 356 from Reza Rahman The code samples are available here: https://blogs.oracle.com/arungupta/resource/totd183-HelloWebSocket.zip. You'll need the latest promoted GlassFish 4 build to run the code. Give me a shout if you need help. 
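
    As a point of reference for readers new to JSR 356 (this is not taken from the talk's slide deck or the linked sample), a server endpoint can be as small as the following sketch; the /echo path and class name are illustrative choices only.

        import javax.websocket.OnMessage;
        import javax.websocket.server.ServerEndpoint;

        // Minimal JSR 356 (Java API for WebSocket) server endpoint.
        // Deployed to a container such as GlassFish 4 with Tyrus, it is discovered automatically.
        @ServerEndpoint("/echo")
        public class EchoEndpoint {

            @OnMessage
            public String onMessage(String message) {
                // The return value is sent back to the client over the same WebSocket connection.
                return "echo: " + message;
            }
        }
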
Besides presenting my talks, I got to attend some great sessions on OSGi, HTML5, cloud, agile and Java 8. I got an invite to speak at the Macedonia JUG when possible. Victor Grazi of InfoQ wrote about my sessions and Java2Days here: http://www.infoq.com/news/2012/11/Java2DaysConference. Stoyan Rachev was very kind to blog about my sessions here: http://www.stoyanr.com/2012/11/java2days-2012-java-ee.html. I definitely enjoyed Java2Days 2012 and hope to be part of the conference next year!

    Read the article

  • MDM for Tax Authorities

    - by david.butler(at)oracle.com
    In last week’s MDM blog, we discussed MDM in the Public Sector. I want to continue that thread. After all, no industry faces tougher data quality problems than governmental organizations, and few industries suffer more significant down side consequences to poor operations than local, state and federal governments. One key challenge area is taxation. Tax Authorities face a multitude of IT challenges. Firstly, the data used in tax calculations is increasing in volume and complexity. They must improve service by introducing multi-channel contact centers and self-service capabilities. Security concerns necessitate increasingly sophisticated data protection procedures. And cost constraints are driving Tax Authorities to rely on off-the-shelf software for many of their functional areas. Compounding these issues is the fact that the IT architectures in operation at most revenue and collections agencies are very complex. They typically include multiple, disparate operational and analytical systems across which the sum total of data about individual constituents is fragmented. To make matters more complicated, taxation is not carried out by a single jurisdiction, and often sources of income including employers, investments and other sources of taxable income and deductions must also be tracked and shared among tax authorities. Collectively, these systems are involved in tax assessment and collections, risk analysis, scoring, tracking, auditing and investigation case management. The Problem of Constituent Data Management The infrastructure described above makes it very difficult to create a consolidated representation of a given party. Differing formats and data models mean that a constituent may be represented in one way in one system and in a different way in another. Individual records are frequently inaccurate, incomplete, out of date and/or inconsistent with other records relating to the same constituent. When constituent data must be aggregated and scored, information within each system must be rationalized and normalized so the agency can produce a constituent information file (CIF) that provides a single source of truth about that party. If information about that constituent changes, each system in turn must be updated. There have been many attempts to solve this problem with technology: from consolidating transactional systems to conducting manual systems integration projects and superimposing layers of business intelligence and analytics. All these approaches can be successful in solving a portion of the problem at a specific point in time, but without an enterprise perspective, anything gained is quickly lost again. Oracle Constituent Data Mastering for Tax Authorities: A Single View of the Constituent Oracle has a flexible and long-term solution to the problem of securely integrating and managing constituent data. The Oracle Solution for mastering Constituent Data for Tax Authorities is based on two core product offerings: Oracle Customer Hub and – optionally – Oracle Application Integration Architecture (AIA). Customer Hub is a master data management (MDM) product that centralizes, de-duplicates, and enriches constituent data. It unifies fragmented information without disrupting existing business processes or IT investments. Role based data access and privacy rules guarantee maximum security and privacy. Data is continuously and automatically synchronized with all source systems. 
With the Oracle Customer Hub managing the master constituent identity, every department can capture transaction activity against the same record, improving reporting accuracy, employee productivity, reliability of constituent analytics, and day-to-day constituent relationships. Oracle Application Integration Architecture provides a collection of core pre-built processes to support out of the box Master Data Governance across Oracle Customer Hub, Siebel CRM, and Oracle E-Business Suite. It also provides a framework to enable MDM integrations with other Oracle and non-Oracle applications. Oracle AIA removes some of the key inhibitors to implementing a service-oriented architecture (SOA) by providing a pre-built SOA-based middleware foundation as well as industry-optimized service oriented applications, all built around a SOA governance model that encourages effective design and reuse. I encourage you to read Oracle Solution for Mastering Constituents Data for Public Sector – Tax Authorities by Roberto Negro. It is an outstanding whitepaper that describes how the Oracle MDM solution allows you to create a unified, reconciled source of high-quality constituent data and gain an accurate single view of each constituent. This foundation enables you to lower the costs associated with data quality and integration and create a tax organization that is efficient, secure and constituent-centric. Also, don’t forget the upcoming webcast on Thursday, February 10th: Deliver Improved Services to Citizens at Lower Cost to your Organization Our Guest Speaker is Ruben Spekle, from Capgemini. He will also provide insight into Public Sector Master Data Management and Case Management implementations including one that was executed for a Dutch Government Agency. If you are interested in how governmental organizations from around the world are using MDM to advance their cause, click here to register for the webcast.

    Read the article

  • Threading Overview

    - by ACShorten
    One of the major features of the batch framework is the ability to support multi-threading. The multi-threading support allows a site to increase throughput on an individual batch job by splitting the total workload across multiple individual threads. This means each thread has fine-level control over a segment of the total data volume at any time. The idea behind the threading is based upon the notion that "many hands make light work". Each thread takes a segment of data in parallel and operates on that smaller set. The object identifier allocation algorithm built into the product randomly assigns keys to help ensure an even distribution of the numbers of records across the threads and to minimize resource and lock contention.
    The best way to visualize the concept of threading is to use a "pie" analogy. Imagine the total workset for a batch job is a "pie". If you split that pie into equal sized segments, each segment would represent an individual thread.
    The concept of threading has advantages and disadvantages:
    Smaller elapsed runtimes – Jobs that are multi-threaded finish earlier than jobs that are single threaded. With smaller amounts of work to do, jobs with threading will finish earlier. Note: The elapsed runtime of the threads is rarely proportional to the number of threads executed. Even though contention is minimized, some contention does exist for resources, which can adversely affect runtime.
    Threads can be managed individually – Each thread can be started individually and can also be restarted individually in case of failure. If you need to rerun thread X then that is the only thread that needs to be resubmitted.
    Threading can be somewhat dynamic – The number of threads that are run on any instance can be varied, as the thread number and thread limit are parameters passed to the job at runtime. They can also be configured using the configuration files outlined in this document and the relevant manuals. Note: Threading is not dynamic after the job has been submitted.
    Failure risk due to data issues with threading is reduced – As mentioned earlier, individual threads can be restarted in case of failure. This limits the risk to the total job if there is a data issue with a particular thread or a group of threads.
    Number of threads is not infinite – As with any resource there is a theoretical limit. While the thread limit can be up to 1000 threads, the number of threads you can physically execute will be limited by the CPU and IO resources available to the job at execution time.
    Theoretically, with the object identifiers evenly spread across the threads, the elapsed runtime for the threads should all be the same. In other words, when executing in multiple threads, theoretically all the threads should finish at the same time. Whilst this is possible, it is also possible that individual threads may take longer than other threads for the following reasons:
    Workloads within the threads are not always the same – Whilst each thread is operating on roughly the same number of objects, the amount of processing for each object is not always the same. For example, an account may have a more complex rate which requires more processing, or a meter may have a complex amount of configuration to process. If a thread has a higher proportion of objects with complex processing it will take longer than a thread with simple processing. The amount of processing is dependent on the configuration of the individual data for the job.
    Data may be skewed – Even though the object identifier generation algorithm attempts to spread the object identifiers across threads, there are some jobs that use additional factors to select records for processing. If any of those factors exhibit any data skew, then certain threads may finish later. For example, if more accounts are allocated to a particular part of a schedule, then threads in that schedule may finish later than other threads executed.
    Threading is important to the success of individual jobs. For more guidelines and techniques for optimizing threading, refer to the Multi-Threading Guidelines section in the Batch Best Practices for Oracle Utilities Application Framework based products whitepaper (Doc Id: 836362.1), available from My Oracle Support.
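
    To make the "pie" analogy concrete, here is a minimal, generic Java sketch of key-based work partitioning of the kind described above. It is not the Oracle Utilities batch framework API; the class and method names are hypothetical and only illustrate how evenly distributed object identifiers map onto a fixed number of threads.

        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.TimeUnit;

        public class ThreadedBatchSketch {

            // Hypothetical unit of work: process one object identified by its key.
            static void processObject(long objectId) {
                // ... business processing for this object ...
            }

            public static void main(String[] args) throws InterruptedException {
                List<Long> objectIds = List.of(101L, 102L, 103L, 104L, 105L, 106L); // stand-in for the job's workset
                int threadCount = 3; // "thread limit" passed to the job at runtime

                ExecutorService pool = Executors.newFixedThreadPool(threadCount);
                for (int thread = 0; thread < threadCount; thread++) {
                    final int threadNumber = thread;
                    pool.submit(() -> {
                        // Each thread takes its own slice of the pie: the keys that hash to its thread number.
                        for (long id : objectIds) {
                            if (Math.floorMod(Long.hashCode(id), threadCount) == threadNumber) {
                                processObject(id);
                            }
                        }
                    });
                }
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.HOURS); // a real framework would monitor and restart threads individually
            }
        }

    Because the keys are roughly evenly distributed, each slice contains a similar number of objects, although, as noted above, similar counts do not guarantee similar runtimes.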

    Read the article

  • Versioning Strategy for Service Interfaces JAR

    - by Colin Morelli
    I'm building a service oriented architecture composed (mostly) of Java-based services, each of which is a Maven project (in an individual repository) with two submodules: common, and server. The common module contains the service's interfaces that clients can include in their project to make service calls. The server submodule contains the code that actually powers the service. I'm now trying to figure out an appropriate versioning strategy for the interfaces, such that each interface change results in a new common jar, but changes to the server (so long as they don't impact the contract of the interfaces) receive the same common jar. I know this is pretty simple to do manually (simply increment the server version and don't touch the common one), but this project will be built and deployed by a CI server, and I'd like to come up with a strategy for automatically versioning these. The only thing I have been able to come up with so far is to have the CI server md5 the service interfaces.
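
    One hedged way to automate the "md5 the service interfaces" idea mentioned above: have the CI build fingerprint the compiled classes of the common module and bump the common version only when the fingerprint changes. The sketch below (Java 17+) shows only the fingerprinting step; the directory path is a hypothetical Maven layout, and how the previous hash is stored and compared is left to your CI setup.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;
        import java.util.HexFormat;
        import java.util.List;
        import java.util.stream.Stream;

        public class InterfaceFingerprint {

            // Compute a single MD5 over all .class files of the common module, in a stable order.
            static String fingerprint(Path classesDir) throws IOException, NoSuchAlgorithmException {
                MessageDigest md5 = MessageDigest.getInstance("MD5");
                try (Stream<Path> files = Files.walk(classesDir)) {
                    List<Path> classFiles = files
                            .filter(p -> p.toString().endsWith(".class"))
                            .sorted() // stable ordering so the hash is reproducible
                            .toList();
                    for (Path classFile : classFiles) {
                        md5.update(classFile.toString().getBytes()); // include the name
                        md5.update(Files.readAllBytes(classFile));   // and the bytecode
                    }
                }
                return HexFormat.of().formatHex(md5.digest());
            }

            public static void main(String[] args) throws Exception {
                // Hypothetical location of the common module's compiled interfaces.
                String hash = fingerprint(Path.of("common/target/classes"));
                System.out.println("common interface fingerprint: " + hash);
                // CI would compare this against the fingerprint recorded for the last release
                // and increment the common artifact version only when it differs.
            }
        }

    Note that hashing bytecode is a coarse proxy: a recompile can change the bytes without changing the contract, so a stricter variant would hash only the declared method signatures (for example via reflection or a bytecode library).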

    Read the article

  • Free Universal Construction Kit Links Different Construction Toy Systems

    - by Jason Fitzpatrick
    If you or a young tinkerer in your household is disappointed that there is no way to link LEGO bricks and Lincoln Logs (or other construction toys on the market), this project is for you. The Free Universal Construction Kit is a project oriented around creating interoperable linking bricks that allow the user to link previously incompatible building systems. Using the bricks you can, for example, attach LEGO bricks to a K’Nex construction. The adapter bricks are all available as free 3D printer models: download them, fire up your 3D printer, and start mish-mashing your construction sets. Free Universal Construction Kits [via Make]

    Read the article

  • Open Source Bulletin Board with Facebook Group Integration

    - by Brian
    I'm working on an open-source, community-oriented project which needs a highly social component where users can post discussion topics and questions and interact with each other. It would be ideal to facilitate discussion seamlessly between a bulletin board and Facebook. Has anyone seen such an integration? I'm talking about something that goes beyond a simple FB OAuth and actually synchronizes forum posts / topics / OAuth / comments. Pretty please, if a moderator is going to delete this, tell me which StackExchange forum is the appropriate place for posting such an inquiry. :)

    Read the article

  • Application Does Not Start in Windows 7

    - by Jim Fell
    I recently installed a new 60GB SSD as my primary hard drive and re-installed Windows 7 Professional 64-bit. I then installed SSD Fresh from Abelssoft to optimize Windows to run on the SSD. It seemed to install okay, but when I try to run the utility, its splash screen appears briefly before it quietly closes. No errors are displayed; the utility just fails to launch. I have run SSD Fresh on another SSD-equipped Windows 7 Pro x64 computer in the past without any problems. Does anyone know what might be preventing the program from running? I tried running sfc /scannow from the command line (with administrator privileges), shutting down the Spybot Resident, and disabling the firewall and virus scanner. I also tried running the tool as administrator; I even tried reinstalling it, running the installer as administrator. No luck. Every time I try to launch the program the Event Viewer logs this same set of errors: Error 4/2/2012 11:35:44 PM Application Error 1000 (100) Faulting application name: SSDFresh.exe, version: 1.0.0.0, time stamp: 0x4f2a45d8 Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000 Exception code: 0xc0000005 Fault offset: 0x000007ff0016dbba Faulting process id: 0x994 Faulting application start time: 0x01cd11fd9fe978df Faulting application path: C:\Program Files (x86)\SSD Fresh\SSDFresh.exe Faulting module path: unknown Report Id: dfeed551-7df0-11e1-a2c7-002522c47ec0 Error 4/2/2012 11:35:43 PM .NET Runtime 1026 None Application: SSDFresh.exe Framework Version: v4.0.30319 Description: The process was terminated due to an unhandled exception. Exception Info: System.NullReferenceException Stack: at AbBugReporter.BugForm.InitLanguage() at AbBugReporter.BugForm..ctor(AbFlexTrans.LanguageInfo, AbBugReporter.BugReportManager, Boolean) at AbBugReporter.BugReportManager.Show(System.Exception) at SSDFresh.App.App_DispatcherUnhandledException(System.Object, System.Windows.Threading.DispatcherUnhandledExceptionEventArgs) at System.Windows.Threading.Dispatcher.CatchException(System.Exception) at MS.Internal.Threading.ExceptionFilterHelper.TryCatchWhen(System.Object, System.Delegate, System.Object, Int32, System.Delegate) at System.Windows.Threading.Dispatcher.WrappedInvoke(System.Delegate, System.Object, Int32, System.Delegate) at System.Windows.Threading.Dispatcher.InvokeImpl(System.Windows.Threading.DispatcherPriority, System.TimeSpan, System.Delegate, System.Object, Int32) at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr, Int32, IntPtr, IntPtr) at MS.Win32.UnsafeNativeMethods.DispatchMessage(System.Windows.Interop.MSG ByRef) at System.Windows.Threading.Dispatcher.PushFrameImpl(System.Windows.Threading.DispatcherFrame) at System.Windows.Application.RunInternal(System.Windows.Window) at System.Windows.Application.Run() at SSDFresh.App.Main() Error 4/2/2012 11:35:39 PM SideBySide 59 None Activation context generation failed for "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe".Error in manifest or policy file "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe.Config" on line 0. Invalid Xml syntax. 
    The same SideBySide 59 error is logged many more times at the same timestamp. For those who are interested, here is my system configuration:
        ASRock M3A770DE AM3 AMD 770 ATX AMD Motherboard
        AMD Athlon II X3 455 Rana 3.3GHz Socket AM3 95W Triple-Core Desktop Processor ADX455WFGMBOX
        G.SKILL Value Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 1333 (PC3 10600) Desktop Memory Model F3-10600CL9D-8GBNT
        Mushkin Enhanced Chronos Deluxe MKNSSDCR60GB-DX 2.5" 60GB SATA III Synchronous MLC Internal Solid State Drive (SSD) (Primary/Boot HD)
        Western Digital Caviar Blue RFHWD1600AAJS 160GB 7200 RPM SATA 3.0Gb/s 3.5" Internal Hard Drive - Bare Drive (Secondary HD)
        Sony Optiarc CD/DVD Burner Black SATA Model AD-7261S-0B LightScribe Support
        RAIDMAX RX-850AE 850W ATX12V v2.3 / EPS12V SLI Certified CrossFire Ready 80 PLUS GOLD Certified Modular Active PFC Power Supply
        ASUS HD7850-DC2-2GD5 Radeon HD 7850 2GB 256-bit GDDR5 PCI Express 3.0 x16 HDCP Ready CrossFireX Support Video Card
        Asus ML228H 21.5" Full HD LED BackLight LED Monitor Slim Design (x3)

    Read the article

  • Do employers prefer software engineering over CS majors?

    - by Joey Green
    I'm in grad school at a university that was one of the first to have an accredited software engineering program. My undergrad is in CS. An employer recently recruited at our university and hired 5 SE majors; none of them were CS. Do employers prefer software engineering majors? The reason I ask is because I can focus on many different areas during my graduate studies and really want to take the classes that will help me land a great job. Right now I'm either going to use CUDA and parallelize an advanced ray-tracer for a graduate project or do research on non-photo-realistic rendering in augmented reality. Pursuing these would leave very few SE classes in my schedule. If I went the software engineering route, I would probably either do research into data-oriented programming or software design complexity. Sometimes I think: when I'm 40 and look back, will it matter at all? For some reason I'm thinking not.

    Read the article

  • Build-time dependency resolving coming to Entity Framework. Now, how about those BI tools too?

    - by jamiet
    Three months ago I wrote a blog post entitled Some thoughts on Visual Studio database references and how they should be used for SQL Server BI where I shared some thoughts on a feature available to database developers in Visual Studio 2010 that I would love to see added to SQL Server Integration Services (SSIS), Analysis Services (SSAS) and Reporting Services (SSRS). In there I said:
    "Over the past few weeks I have been making heavy use of the Database tools in Visual Studio 2010 and one of the features that has most impressed me has been database references. Database references allow you to have stored procedures in your database project that refer to objects (tables, views, stored procedures etc…) that exist in other database projects and hence when you build your database project it is able to resolve those references. It occurred to me that similar functionality would be incredibly useful for SQL Server Integration Services (SSIS), Analysis Services (SSAS) & Reporting Services (SSRS) projects. After all, reports, packages and data source views are rife with references to database objects – why shouldn’t we be able to have design-time dependency checking in our BI projects the same way that database and .Net developers do?"
    In that blog post I shared links to three Connect submissions where I requested this feature be added to SSIS, SSAS & SSRS. In addition I also submitted a request that the feature be extended to .Net projects so that any reference to a database object in a .Net assembly can be resolved at build time. That Connect submission is at [Entity FX] Use database references to constrain the EDM, and overnight it received this comment from Microsoft: "We have been working on this feature for a while and and will be available soon". This is really good news - it improves the Microsoft developer ecosystem by ensuring invalid references to database objects get caught at build time (ideally as part of a Continuous Integration build) rather than run time. [Hopefully it might nip this code-first nonsense in the bud too (Ooo...way to incite flame comments :) ) ]. If you want to see this feature in action then check out a video from TechEd Europe last month entitled SQL Server Developer Tools Code-named "Juneau" where it is demo'd by Lance Delano and Tim Laverty.
    The point of this blog post though is not just to draw attention to this forthcoming feature for .Net developers; it is to ask you to petition Microsoft to get this feature added to SSIS/SSAS/SSRS too. After all, we already know (from the video above) that the feature is coming to this new code-named Juneau development environment, plus we also know that Juneau will be the development environment for SSIS/SSAS/SSRS as well - is it really much of a stretch to expect the BI tools to have access to this great feature too? I don't think so, and if you agree with me then I urge you to vote and add a comment to the Connect submissions that are requesting this feature. They are at:
    [SSAS] Declare Object Dependancies
    [SSRS] Declare Object Dependancies
    [SSIS] Declare Object Dependancies (Update: Apparently someone at Microsoft has deemed it necessary to set this to private and I am not able to change it back even though I submitted it. You can still vote on the other two though.)
    Let's close that SQL Developer Gap!   @Jamiet

    Read the article

  • The Best Application Launchers and Docks for Organizing Your Desktop

    - by Lori Kaufman
    Is your desktop so cluttered you can’t find anything? Is your Start menu so long you have to scroll to see what programs are there? If so, you probably need an application launcher to organize your desktop and make your life easier. We’ve created a list of many useful application launchers in different forms. You can choose from dock programs, portable application launchers, Start menu and Taskbar replacements, and keyboard-oriented launchers.

    Read the article

  • Azure Blob storage defrag

    - by kaleidoscope
    Blob Storage is really handy for storing temporary data structures during scaled-out distributed processing. Yet the lifespan of those data structures should not exceed that of the underlying operation; otherwise clutter and dead data could start filling up your Blob Storage. Temporary data in cloud computing is very similar to memory collection in object-oriented languages: when it's not done automatically by the framework, temp data tends to leak. In particular, in cloud computing it's pretty easy to end up with storage leaks due to:
    - Collection omission
    - App crash
    - Service interruption
    All of those events cause garbage to accumulate in your Blob Storage. It must also be noted that for most cloud apps, I/O costs are usually predominant compared to pure storage costs, so enumerating through your whole Blob Storage to clean up the garbage is likely to be an expensive solution.
    Lokesh, M
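
    As a rough illustration of the kind of cleanup pass the post alludes to, here is a sketch using the classic Azure Storage SDK for Java (the com.microsoft.azure.storage packages; the newer v12 SDK uses different types). It assumes temporary blobs are isolated in their own container, named "temp-data" here as a hypothetical choice, which keeps the enumeration cheap compared to scanning the whole account; the one-hour age threshold is also an assumption.

        import com.microsoft.azure.storage.CloudStorageAccount;
        import com.microsoft.azure.storage.blob.CloudBlob;
        import com.microsoft.azure.storage.blob.CloudBlobClient;
        import com.microsoft.azure.storage.blob.CloudBlobContainer;
        import com.microsoft.azure.storage.blob.ListBlobItem;
        import java.util.Date;

        public class TempBlobCleanup {

            public static void main(String[] args) throws Exception {
                // Connection string and container name are placeholders.
                CloudStorageAccount account =
                        CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNECTION_STRING"));
                CloudBlobClient client = account.createCloudBlobClient();
                CloudBlobContainer tempContainer = client.getContainerReference("temp-data");

                // Anything older than one hour is assumed to have outlived its operation.
                Date cutoff = new Date(System.currentTimeMillis() - 60L * 60L * 1000L);

                for (ListBlobItem item : tempContainer.listBlobs()) {
                    if (item instanceof CloudBlob) {
                        CloudBlob blob = (CloudBlob) item;
                        Date lastModified = blob.getProperties().getLastModified();
                        if (lastModified != null && lastModified.before(cutoff)) {
                            blob.deleteIfExists(); // reclaim the leaked temporary blob
                        }
                    }
                }
            }
        }
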

    Read the article

  • Gauging Maturity of your BPM Strategy - part 2 / 2

    - by Sanjeev Sharma
    In my earlier post I discussed the essence of maturity assessment and the business imperative for doing the same in the context of BPM. In this post I will discuss Oracle’s BPM Maturity assessment methodology. Oracle’s BPM Maturity model comprises the following components:
    Maturity – represents the stages of evolution of your BPM capability, with 0 being the lowest level and 5 being the highest level
    Domain – represents multiple perspectives, both technical and business oriented, against which your BPM capability can be assessed
    Adoption – represents the scale of BPM rollout, from the project level to the enterprise level
    Note: Your BPM capability can be at different levels of maturity for the different domains. Oracle’s BPM assessment methodology measures the maturity of your BPM capability at the individual domain level as well as at the aggregate level. The output of Oracle’s BPM assessment benefits you in two ways:
    Gap Analysis – comparing the “As-Is” BPM capability with the desired “To-Be” BPM capability along the various domains (see Figure 1)
    Systematic Adoption – aligning the evolution of BPM capability with its rollout in multiple phases (see Figure 2)

    Read the article

  • Looking for a Python UI library comparable with Windows Forms [on hold]

    - by Mitten
    I am looking for a Python UI library which I could use to develop a desktop GUI comparable to what can be done with .NET Windows Forms. I have no previous experience programming UIs in Python, so I would rather choose (if there is a choice) something simple. The application I am building would be document oriented: rich text, lists and grids; I don't expect to use much graphics, mostly formatted text. Any pointers? And if there is more than one major GUI library available for Python, how could I quickly test them to see which one is a better fit for my needs?

    Read the article

  • How to Apply a Business Card Template to a Contact and Customize it in Outlook 2013

    - by Lori Kaufman
    If you want to add a business card template to an existing contact in Outlook, you can do so without having to enter all of the information again. We will also show you how to customize the layout and format of the text on the card. Microsoft provides a couple of business card templates you can use. We will use their Blue Sky template as an example. To open the archive file for the template you downloaded, double-click on the .cab file. NOTE: You can also use a tool like 7-Zip to open the archive. A new Extract tab becomes available under Compressed Folder Tools and the files in the archive are listed. Select the .vcf file in the list of files. This automatically activates the Extract tab. Click Extract To and select a location or select Choose location if the desired location is not on the drop-down menu. Select a folder in which you want to save the .vcf file on the Copy Items dialog box and click Copy. NOTE: Use the Make New Folder button to create a new folder for the location, if desired. Double-click on the .vcf file that you copied out of the .cab archive file. By default, .vcf files are associated with Outlook so, when you double-click on a .vcf file, it automatically opens in a Contact window in Outlook. Change the Full Name to match the existing contact to which you want to apply this template. Delete the other contact info from the template. If you want to add any additional information not in the existing contact, enter it. Click Save & Close to save the contact with the new template. The Duplicate Contact Detected dialog box displays. To update the existing contact, select the Update information of selected Contact option. Click Update. NOTE: If you want to create a new contact from this template, select the Add new contact option. With the Contacts folder open (the People link on the Navigation Bar), click Business Card in the Current View section of the Home tab. You may notice that not all the fields from your contact display on the business card you just updated. Double-click on the contact to update the contact and the business card. On the Contact window, right-click on the image of the business card and select Edit Business Card from the popup menu. The Edit Business Card dialog box displays. You can change the design of the card, including changing he background color or image. The Fields box allows you to specify which fields display on the business card and in what order. Notice, in our example, that Company is listed below the Full Name, but no text displays on the business card below the name. That’s because we did not enter any information for Company in the Contact. We have information in Job Title. So, we select Company and click Remove to remove that field. Now, we want to add Job Title. First, select the field below which you want to add the new field. We select Full Name to add the Job Title below that. Then, we click Add and select Organization | Job Title from the popup menu to insert the Job Title. To make the Job Title white like the name, we select Job Title in the list of Fields and click the Font Color button in the Edit section. On the Color dialog box, select the color you want to use for the text in the selected field. Click OK. You can also make text bold, italic, or underlined. We chose to make the Job Title bold and the Full Name bold and italic. We also need to remove the Business Phone because this contact only has a mobile phone number. So, we add a Mobile Phone from the Phone submenu. 
Then, we need to remove enough blank lines so the Mobile Phone is visible on the card. We also added a website and email address and removed more blank lines so they are visible. You can also move text to the right side of the card or make it centered on the card. We also changed the color of the bottom three lines to blue. Click OK to accept your changes and close the dialog box. Your new business card design displays on the Contact window. Click Save & Close to save the changes you made to the business card for this contact and close the Contact window. The final design of the business card displays in the Business Card view on the People screen. If you have a signature that contains the business card for the contact you just updated, you will also need to update the signature by removing the business card and adding it again using the Business Card button in the Signature editor. You can also add the updated Business Card to a signature without the image or without the vCard (.vcf) file.     

    Read the article

  • UDDI vs SO-Aware: Why SO-Aware is the More Efficient and Interoperable Alternative

    - by Vishal
    Hello folks,   If you are implementing a service oriented architecture, and are unsure of the best governance approach to follow, then this webinar is a must-attend event for you.  We will discuss why SO-Aware is the more efficient and interoperable alternative to traditional UDDI-based SOA-governance.   Specifically, we will address the differences between UDDI and SO-Aware in terms of service discovery, configuration, and policy resolution.  Finally, we will address why the REST/Odata based model implemented by SO-Aware enables the most efficient governance not only for WCF but for BizTalk, the Windows Server AppFabric and the Windows Azure AppFabric as well.   Join us on January 26th at 2:00 ET - to register, click here    Thanks,   Vishal

    Read the article

  • Forbes Article on Big Data and Java Embedded Technology

    - by hinkmond
    Whoa, cool! Forbes magazine has an online article about what I've been blogging about all this time: Big Data and Java Embedded Technology, tying it all together with a big bow, connecting small devices to the data center. See: Billions of Java Embedded Devices. Here's a quote: "By the end of the decade we could see tens of billions of new Internet-connected devices... with billions of Internet-connected devices generating Big Data, are the next big thing. ... That’s why Oracle has put together an ecosystem of solutions for this new, Big Data-oriented device-to-data center world: secure, powerful, and adaptable embedded Java for intelligent devices, integrated middleware..." This is the next big thing. Java SE Embedded Technology is something to watch for in the new year. Start developing for it now to get a head-start... Hinkmond

    Read the article

  • Have you downloaded the Oracle SOA Governance Resource Kit yet? By Yogesh Sontakke

    - by JuergenKress
    Effective Service-Oriented Architecture (SOA) Governance is an essential element in any enterprise transformation strategy. Oracle's SOA Governance solution eases the transition of an organization to SOA by providing a means to reduce risk, maintain business alignment, and show the business value of SOA investments. Whether you are just embarking on an SOA initiative or expanding a project or pilot for a broader deployment, this SOA Governance resource kit will guide you along the path to measurable SOA success. The SOA Governance resource kit includes:
    - White papers, data sheets, analyst reports and customer success stories
    - Webcasts, podcasts and other interactive resources
    - Software downloads from Oracle Technology Network (OTN)
    - Additional information from Oracle.com and OTN
    Get it here now!
    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Read the article
