Search Results

Search found 71449 results on 2858 pages for 'oracle data integration'.


  • ORACLE MASTER Bronze practice question, vol. 4 (with answer)

    - by M.Morozumi
    A practice question for the ORACLE MASTER Bronze Oracle Database 11g exam, asking which operation remains possible against a table that has been set to READ ONLY in Oracle Database 11g. The four answer choices cover DML statements, SELECT ... FOR UPDATE, DROP, and ALTER TABLE ... READ WRITE. The given answer is c: the table can still be dropped, because DROP is not blocked by the table's READ ONLY setting.

    Read the article

  • ORACLE MASTER Bronze practice question, vol. 4

    - by M.Morozumi
    A practice question for the ORACLE MASTER Bronze Oracle Database 11g exam, asking which operation remains possible against a table that has been set to READ ONLY in Oracle Database 11g. The four answer choices cover DML statements, SELECT ... FOR UPDATE, DROP, and ALTER TABLE ... READ WRITE. The answer is published in a separate post.

    Read the article

  • Oracle Advanced Security

    - by ???02
    An overview of Oracle Advanced Security, the Oracle Database option for encrypting data in transit and at rest without application changes. It covers three capabilities. 1. Network Encryption: encrypts traffic between Oracle Database and its clients, using either Oracle's native network encryption or SSL, and is enabled through settings in the sqlnet.ora configuration file rather than code changes. 2. Transparent Data Encryption: unlike encryption hand-built with the DBMS_CRYPTO package, TDE encrypts stored data transparently inside Oracle Database, so existing SQL and applications continue to work unchanged. 3. Backup Encryption: encrypts RMAN backups and Data Pump exports, so the data remains protected even after it leaves the database. (Oracle Direct)

    Read the article

  • Force Oracle error on fetch

    - by Dan
    I am trying to debug a strange behavior in my application. In order to do so, I need to reproduce a scenario where a SQL SELECT query will throw an error, but only while actually fetching from the cursor, not while executing the query itself. Can this be done? Any error will do, but ORA-01722: invalid number seems like the obvious one to try. I created a table with the following: KEYCOL INTEGER PRIMARY KEY, OTHERCOL VARCHAR2(100). I then created a few hundred rows with unique values for the primary key and the value 1 for OTHERCOL. I then ran a SELECT * query, picked a row somewhere in the middle, and updated it to the string abcd. I ran the query SELECT KEYCOL, TO_NUMBER(OTHERCOL) FROM SOMETABLE hoping to get some rows of good data and then an error later. But I keep getting ORA-01722: invalid number on the execute step itself. I have gotten this behavior programmatically using ADO (with a server-side cursor) and JDBC, as well as from PL/SQL Developer. How can I get the result I'm looking for? Thanks. Edit: I meant to add that when using ADO, I am only calling Command.Execute; I am not creating or opening a Recordset.
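    Whether the conversion error surfaces on execute or on fetch usually depends on how many rows the client pulls in the first round trip and on the execution plan (a blocking step such as a sort can force every row to be evaluated up front). Below is a minimal JDBC sketch along those lines - the SOMETABLE/OTHERCOL names come from the question, while the connection details are placeholders - which shrinks the fetch size so the bad row should only be converted on a later fetch call:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class FetchErrorDemo {
            public static void main(String[] args) throws SQLException {
                // Placeholder connection details.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                     Statement stmt = conn.createStatement()) {
                    // One row per round trip, so the batch fetched as part of the
                    // execute call should not already contain the bad row.
                    stmt.setFetchSize(1);
                    try (ResultSet rs = stmt.executeQuery(
                            "SELECT keycol, TO_NUMBER(othercol) FROM sometable")) {
                        int rows = 0;
                        try {
                            while (rs.next()) {   // ORA-01722 should surface here,
                                rows++;           // on the fetch of the bad row
                            }
                        } catch (SQLException e) {
                            System.out.println("Good rows fetched before the error: " + rows);
                            System.out.println("Fetch failed with: " + e.getMessage());
                        }
                    }
                }
            }
        }

    If the error still appears on the execute call, checking the plan for a sort or another blocking operation is the next thing to look at.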

    Read the article

  • Odd 'UNION' behavior in an Oracle SQL query

    - by RenderIn
    Here's my query: SELECT my_view.* FROM my_view WHERE my_view.trial in (select 2 as trial_id from dual union select 3 from dual union select 4 from dual) and my_view.location like ('123-%') When I execute this query it returns results which do not conform to the my_view.location like ('123-%') condition. It's as if that condition is being ignored completely. I can even change it to my_view.location IS NULL and it returns the same results, despite that field being not-nullable. I know this query seems ridiculous with the selects from dual, but I've structured it this way to replicate a problem I have when I use a 'WITH' clause (the results of that query are where the selects from dual inline view are). I can modify the query like so and it returns the expected results: SELECT my_view.* FROM my_view WHERE my_view.trial in (2, 3, 4) and my_view.location like ('123-%') Unfortunately I do not know the trial values up front (they are queried for in a 'WITH' clause) so I cannot structure my query this way. What am I doing wrong? I will say that the my_view view is composed of 3 other views whose results are UNION ALL and each of which retrieve some data over a DB Link. Not that I believe that should matter, but in case it does.

    Read the article

  • Error: "Suite Integration Toolkit Executable has stopped working" during VS2010 installation

    - by Daniel
    After uninstalling VS2012 and installing Visual Studio 2010 C# Express again, I was getting a strange warning: "2008 is not a valid number", so I decided to reinstall VS2010, but I couldn't uninstall it. I was getting the error: "Suite Integration Toolkit Executable has stopped working". I managed to uninstall VS2010 with the VS2010 Uninstall Tool, but now I want to install Visual Studio 2010 C# Express back. I get the "Suite Integration Toolkit [...]" error again when I try to install it using the web installer. Could you help me resolve my problem? I also have the Tablet PC Components feature disabled in Control Panel / Programs and Utilities.

    Read the article

  • Integration of SharePoint 2010 with TFS2010

    - by Kabir Rao
    We have performed the following steps so far: installed TFS 2010 10.0.30319.1 (RTM) on Windows Server 2008 R2 Enterprise (app tier); installed SQL Server 2008 SP1 with Cumulative Update 2 on Windows Server 2008 R2 Enterprise (data tier); Reporting Services is installed on the app tier. After this installation worked fine, we installed SharePoint 2010 on the app tier and then followed http://blogs.msdn.com/b/team_foundation/archive/2010/03/06/configuring-sharepoint-server-2010-beta-for-dashboard-compatibility-with-tfs-2010-beta2-rc.aspx for configuration. We are not able to perform the last step described in the link, as the following error occurred: TF249063: The following Web service is not available: http://apptier:31254/_vti_bin/TeamFoundationIntegrationService.asmx. This Web service is used for the Team Foundation Server Extensions for SharePoint Products. The underlying error is: The remote server returned an error: (404) Not Found. Verify that the following URL points to a valid SharePoint Web application and that the application is available: http://apptier:31254. If the URL is correct and the Web application is operating normally, verify that a firewall is not blocking access to the Web application. We have also noticed that the Documents folder in the team project shows a red X. Please help. Thanks upfront.

    Read the article

  • TFS Integration with Rational ClearQuest and Requirement Manager

    - by Kangkan
    I am working on an integration approach for integrating Rational (IBM Jazz) Requirement Manager (RM) and ClearQuest (CQ) with TFS. As the teams are moving from ClearCase to TFS, what we are looking at is still being able to manage requirements in RM and manage testing using CQ. The flow will be something like: requirements are planned and detailed in RM; work items are created in TFS, connected to the requirements in RM; design and code are created using VS2010, with version control managed in TFS; test plans and test cases are created in CQ (connected to requirements in RM); tests are run against builds in TFS; test results are published in CQ against builds in TFS; and reports are run in RM, CQ and TFS that link up the items across the platforms. I have started looking at the TFS Integration Platform, but I would like your guidance towards an early resolution and a better solution approach.

    Read the article

  • TeamCity for continuous integration with Visual Studio 2010 solutions/projects

    - by JeffryEngberg
    I am running TeamCity build 5.1.1 on a virtual machine that also hosts our SVN environment. A team I support has recently made the move from Visual Studio 2008/Silverlight 3.0 to Visual Studio 2010/Silverlight 4.0 and when investigating how to do continuous integration with Visual Studio 2010 solutions/projects, it is not as cut and dried as it appeared to be in Visual Studio 2008. Previously I was using Web Deployment Projects and targeting different Release Configurations in TeamCity, which would use the Web Deployment Project to package/deploy the code to our various environments. However when checking out the new Publish ability in Visual Studio 2010 I cannot find a way to specify which location to deploy to. Does everything need to be done in MSBuild now (in the solution file or maybe the Web project file?). If anyone has any examples of how they've done Continuous Integration using TeamCity and Visual Studio 2010, it would be greatly appreciated as I am coming up blank at the moment.

    Read the article

  • Can I make my drives visible and change their partition type without losing my data?

    - by user165408
    I have made a lot of mistakes, and now I cannot see my hard disk or start my operating system on my laptop. All my passwords and important files are on the HDD, without any backup. I followed this course of action:
    1. Changed my hard disk partitions to dynamic, just to get a 5th partition. (1st mistake)
    2. Decreased the partitions to 4 again.
    3. Backed up the operating system from the 4th to the 3rd partition with Norton Ghost.
    4. Booted from a live CD for Windows XP.
    5. Formatted the 4th partition and moved all my important data from the 1st and 2nd partitions to the 4th partition.
    6. Deleted the 1st and 2nd partitions and created 1 partition from half of the empty space, so I have just 3 partitions and empty space between the 1st and 2nd partitions.
    7. Tried to install Windows 8 to the first partition, but it did not allow this because the disk is dynamic. It also did not allow installing to the other partitions.
    8. Tried to install Windows XP to the 1st partition, but it said that if I continued I could not use the other drives, so I backed out of installing it.
    9. Booted from the Windows XP live CD, then increased the 1st partition using less than 400 MB of the empty space. I thought it would then be adjacent, but it was shown as 2 partitions. In My Computer I see just 3 drives.
    10. Using Norton Ghost, I recovered my OS to the 1st partition. (2nd mistake - it was on the 4th partition originally)
    11. Booted from a Windows XP live CD; I tried to install bcdedit to the live CD but it did not work, so I then installed EaseUS Partition Master Home Edition. It installed with errors, and when I started it, it showed an error as if there were no hard disk. I looked in my PC and my drives were not there.
    12. Booted from the Norton Ghost CD, and it did not show my drives either, although before I was able to see them. I checked the partition numbers shown by the Norton Ghost utility and they still have the same numbers, so I ought to be able to see my drives, but I cannot see them now.
    My hard disk is now shown as external dynamic, so I cannot see any of my PC's drives in the live Windows XP. There are two options: the first is to import the external disk and the second is to convert the disk to basic. Will they delete my data? I fear booting from CDs like the Windows XP live CD, the Norton Ghost CD and the operating system CD/DVD, because they may overwrite a few MB of their data onto my data. The following recovery tools already exist on the Windows XP live CD (The Ultimate Boot CD for Windows) - can any of them help me? CompuAppa SwissKnife V3, DBXtract, Disk Investigator, Fab's AutoBackup 2.0, FileRecovery, Floppy Repair, Free Undelete, Handy Recovery, Recovery Manager, Restoration, Restoration Help File by UBCD4Win, UnChk, Unstoppable Copier. Finally: how can I make my drives visible again without losing my data, and how can I convert my dynamic partitions to basic without losing my data?

    Read the article

  • Maven project is in a subfolder, can't get Eclipse integration to work

    - by tputkonen
    Inside the folder 'ProjectName' there are several subfolders (Specifications, JavaCode, Gfx, ...), one of which contains a Java program. The JavaCode folder contains pom.xml. I have installed m2eclipse (0.10.x) into Eclipse and imported the whole ProjectName folder into Eclipse. The subfolders are displayed correctly, but the Maven integration is not working correctly - for example, I don't see the src/main/java "shortcut" folder in Eclipse; I have to click through to open all the folders. If I create a new Maven project with Eclipse from scratch, the integration works well. What could be the issue?

    Read the article

  • Spring Hibernate Integration

    - by Aj
    I am new to Spring and Hibernate. I was trying the Spring Hibernate integration tutorial from http://www.vaannila.com/spring/spring-hibernate-integration-1.html and I was able to run the example. This example deals with one table. Now I am trying it with one more table, and I have a few questions. As I understand it, we need to add the following things: a DAO interface, a DAO implementation, and a POJO for the table. Is this the only way to add more tables? Do we need to add one more controller for the new table if it belongs to a new form? And how do we add the entries for this new table to dispatcher-servlet.xml? Thanks in advance.
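    For a second table the pattern from the tutorial simply repeats. Here is a minimal sketch of the three new pieces - the Product names are invented for illustration, and the HibernateDaoSupport style is an assumption that the new DAO follows the same template as the existing one:

        import java.util.List;
        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.Id;
        import javax.persistence.Table;
        import org.springframework.orm.hibernate3.support.HibernateDaoSupport;

        // 1. The POJO for the new table (or map it in a .hbm.xml file instead,
        //    whichever mapping style the first table already uses).
        @Entity
        @Table(name = "PRODUCT")
        public class Product {
            @Id
            @GeneratedValue
            private Long id;
            private String name;

            public Long getId() { return id; }
            public void setId(Long id) { this.id = id; }
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        // 2. The DAO interface for the new table.
        interface ProductDao {
            void saveProduct(Product product);
            List<Product> listProducts();
        }

        // 3. The DAO implementation, wired to the same sessionFactory bean
        //    that the existing DAO already uses.
        class ProductDaoImpl extends HibernateDaoSupport implements ProductDao {
            public void saveProduct(Product product) {
                getHibernateTemplate().save(product);
            }

            @SuppressWarnings("unchecked")
            public List<Product> listProducts() {
                return (List<Product>) getHibernateTemplate().find("from Product");
            }
        }

    In dispatcher-servlet.xml (or the imported application context) that means one extra bean definition for the new DAO pointing at the existing sessionFactory, plus adding Product to the session factory's mapped or annotated classes; a separate controller is only needed if the new table is backed by its own form or page.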

    Read the article

  • Are Oracle Database CPU license limits enforced by software, and how do I check them?

    - by DrStalker
    I've inherited a Windows VM running Oracle Database 10g. Currently the VM has only one CPU assigned to it, but I can boost this up to 4 with our VMware licenses. What I'm not yet certain about is whether the Oracle software will get upset. Are Oracle DB CPU limits enforced by software, and if so, how do I find out what they are? If it's just a legal enforcement, I'll hunt through the mass of unsorted paperwork left by previous managers to find what we're licensed for, but a quick software check would be nice.
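    Processor licensing is contractual rather than enforced by the database software itself, but the instance does record the CPU counts it detects, which is enough for a quick sanity check. A small JDBC sketch - connection details are placeholders, and the account needs access to the V$ views:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class CpuLicenseCheck {
            public static void main(String[] args) throws SQLException {
                // Placeholder connection details.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//dbhost:1521/ORCL", "system", "password");
                     Statement stmt = conn.createStatement()) {

                    // CPUs the instance currently sees, and the high-water mark
                    // since startup (informational, not a licensing enforcement).
                    try (ResultSet rs = stmt.executeQuery(
                            "SELECT cpu_count_current, cpu_count_highwater FROM v$license")) {
                        if (rs.next()) {
                            System.out.println("CPUs now: " + rs.getInt(1)
                                    + ", highest seen: " + rs.getInt(2));
                        }
                    }

                    // The cpu_count parameter the instance uses for its own sizing decisions.
                    try (ResultSet rs = stmt.executeQuery(
                            "SELECT value FROM v$parameter WHERE name = 'cpu_count'")) {
                        if (rs.next()) {
                            System.out.println("cpu_count parameter: " + rs.getString(1));
                        }
                    }
                }
            }
        }

    The high-water column is worth re-checking after changing the VM's vCPU allocation, since it shows the largest CPU count the instance has ever observed since startup.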

    Read the article

  • Is there a more easy way to create a WCF/OData Data Service Query Provider?

    - by routeNpingme
    I have a simple little data model resembling the following: InventoryContext { IEnumerable<Computer> GetComputers() IEnumerable<Printer> GetPrinters() } Computer { public string ComputerName { get; set; } public string Location { get; set; } } Printer { public string PrinterName { get; set; } public string Location { get; set; } } The results come from a non-SQL source, so this data does not come from Entity Framework connected up to a database. Now I want to expose the data through a WCF OData service. The only way I've found to do that thus far is creating my own Data Service Query Provider, per this blog tutorial: http://blogs.msdn.com/alexj/archive/2010/01/04/creating-a-data-service-provider-part-1-intro.aspx ... which is great, but seems like a pretty involved undertaking. The code for the provider would be 4 times longer than my whole data model to generate all of the resource sets and property definitions. Is there something like a generic provider in between Entity Framework and writing your own data source from zero? Maybe some way to build an object data source or something, so that the magical WCF unicorns can pick up my data and ride off into the sunset without having to explicitly code the provider?

    Read the article

  • Conceptual data modeling: Is RDF the right tool? Other solutions?

    - by paprika
    I'm planning a system that combines various data sources and lets users do simple queries on these. A part of the system needs to act as an abstraction layer that knows all connected data sources: the user shouldn't [need to] know about the underlying data "providers". A data provider could be anything: a relational DBMS, a bug tracking system, ..., a weather station. They are hooked up to the query system through a common API that defines how to "offer" data. The type of queries a certain data provider understands is given by its "offer" (e.g. I know these entities, I can give you aggregates of type X for relationship Y, ...). My concern right now is the unification of the data: the various data providers need to agree on a common vocabulary (e.g. the name of the entity "customer" could vary across different systems). Thus, defining a high level representation of the entities and their relationships is required. So far I have the following requirements: I need to be able to define objects and their properties/attributes. Further, arbitrary relations between these objects need to be represented: a verb that defines the nature of the relation (e.g. "knows"), the multiplicity (e.g. 1:n) and the direction/navigability of the relation. It occurs to me that RDF is a viable option, but is it "the right tool" for this job? What other solutions/frameworks do exist for semantic data modeling that have a machine readable representation and why are they better suited for this task? I'm grateful for every opinion and pointer to helpful resources.
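    To make the comparison concrete, here is a small sketch of how the requirements above (named entity types, attributes, verb-labelled directed relations) map onto RDF, using Apache Jena purely as an illustrative toolkit - the URIs and the Jena 3 package names are assumptions, and note that plain RDF supplies the shared vocabulary but not multiplicity constraints, which is where RDFS/OWL or SHACL come in:

        import org.apache.jena.rdf.model.Model;
        import org.apache.jena.rdf.model.ModelFactory;
        import org.apache.jena.rdf.model.Property;
        import org.apache.jena.rdf.model.Resource;
        import org.apache.jena.vocabulary.RDF;
        import org.apache.jena.vocabulary.RDFS;

        public class VocabularySketch {
            public static void main(String[] args) {
                Model model = ModelFactory.createDefaultModel();
                String ns = "http://example.org/vocab#";

                // A shared entity type that every data provider agrees to map to.
                Resource customer = model.createResource(ns + "Customer")
                        .addProperty(RDF.type, RDFS.Class);

                // A verb-labelled, directed relation between entities.
                Property knows = model.createProperty(ns, "knows");
                model.add(knows, RDF.type, RDF.Property);
                model.add(knows, RDFS.domain, customer);
                model.add(knows, RDFS.range, customer);

                // Two instances, as if contributed by different providers.
                Resource alice = model.createResource(ns + "alice").addProperty(RDF.type, customer);
                Resource bob = model.createResource(ns + "bob").addProperty(RDF.type, customer);
                alice.addProperty(knows, bob);   // direction: alice -> bob

                model.write(System.out, "TURTLE");
            }
        }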

    Read the article

  • Bogus InvalidOperationException (in a DataServiceRequestException)

    - by Andrei Rinea
    I am having a hard time with ADO.NET Data Services (formerly code-named Astoria): it gives me a bogus exception when I try to insert a new entity from the Silverlight client, while trying the same code in a clean project doesn't. In both cases, however, data is correctly inserted into the database. Using Fiddler (an HTTP debugger) I could see that there is no problem in the HTTP communication, as I will show later in this question. The code: var ctx = new MyProject123Entities(new Uri("http://andreiri/MyProject.Data/Data.svc")); var i = new Zone() { Data = DateTime.Now, IdElement = 1 }; ctx.AddToZone(i); i.StareZone = new StareZone() { IdStareZone = 1 }; ctx.AttachTo("StareZone", i.StareZone); ctx.SetLink(i, "StareZone", i.StareZone); i.TipZone = new TipZone() { IdTipZone = 1 }; ctx.AttachTo("TipZone", i.TipZone); ctx.SetLink(i, "TipZone", i.TipZone); i.User = new User() { IdUser = 2 }; ctx.AttachTo("User", i.User); ctx.SetLink(i, "User", i.User); ctx.BeginSaveChanges(r => ctx.EndSaveChanges(r), null); When run, the last line (ctx.EndSaveChanges(r)) throws the following exception: System.Data.Services.Client.DataServiceRequestException was unhandled by user code Message="An error occurred while processing this request." StackTrace: at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.HandleBatchResponse() at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.EndRequest() at System.Data.Services.Client.DataServiceContext.EndSaveChanges(IAsyncResult asyncResult) at MyProject.MainPage.<>c__DisplayClassd6.<>c__DisplayClassd8.<dashboard_PostZoneCurent>b__d5(IAsyncResult r) at System.Data.Services.Client.BaseAsyncResult.HandleCompleted() at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.HandleCompleted(PerRequest pereq) at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.AsyncEndRead(IAsyncResult asyncResult) at System.IO.Stream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state) at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.AsyncEndGetResponse(IAsyncResult asyncResult) InnerException: System.InvalidOperationException Message="The context is already tracking a different entity with the same resource Uri." StackTrace: at System.Data.Services.Client.DataServiceContext.AttachTo(Uri identity, Uri editLink, String etag, Object entity, Boolean fail) at System.Data.Services.Client.MaterializeAtom.MoveNext() at System.Data.Services.Client.DataServiceContext.HandleResponsePost(ResourceBox entry, MaterializeAtom materializer, Uri editLink, String etag) at System.Data.Services.Client.DataServiceContext.SaveAsyncResult.<HandleBatchResponse>d__1d.MoveNext() InnerException: (there is no further information regarding the exception, although the ADO.NET Data Service is configured to return detailed information) However, the row is inserted correctly and completely in the database.
Using fiddler I can see that the request : <?xml version="1.0" encoding="utf-8" standalone="yes"?> <entry xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns="http://www.w3.org/2005/Atom"> <category scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" term="MyProject123Model.Zone" /> <title /> <updated>2009-09-11T13:36:46.917157Z</updated> <author> <name /> </author> <id /> <link href="http://andreiri/MyProject.Data/Data.svc/StareZone(1)" rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/StareZone" type="application/atom+xml;type=entry" /> <link href="http://andreiri/MyProject.Data/Data.svc/TipZone(4)" rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/TipZone" type="application/atom+xml;type=entry" /> <link href="http://andreiri/MyProject.Data/Data.svc/User(4)" rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/User" type="application/atom+xml;type=entry" /> <content type="application/xml"> <m:properties> <d:Data m:type="Edm.DateTime">2009-09-11T16:36:40.588951+03:00</d:Data> <d:Detalii>aslkdfjasldkfj</d:Detalii> <d:IdElement m:type="Edm.Int32">1</d:IdElement> <d:IdZone m:type="Edm.Int32">0</d:IdZone> <d:X_Post m:type="Edm.Decimal">587647.4705</d:X_Post> <d:X_Repost m:type="Edm.Decimal" m:null="true" /> <d:Y_Post m:type="Edm.Decimal">325783.077599999</d:Y_Post> <d:Y_Repost m:type="Edm.Decimal" m:null="true" /> </m:properties> </content> </entry> is well accepted and a successful response is returned : HTTP/1.1 201 Created Date: Fri, 11 Sep 2009 13:36:47 GMT Server: Microsoft-IIS/6.0 X-Powered-By: ASP.NET X-AspNet-Version: 2.0.50727 DataServiceVersion: 1.0; Location: http://andreiri/MyProject.Data/Data.svc/Zone(75) Cache-Control: no-cache Content-Type: application/atom+xml;charset=utf-8 Content-Length: 2213 <?xml version="1.0" encoding="utf-8" standalone="yes"?> <entry xml:base="http://andreiri/MyProject.Data/Data.svc/" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns="http://www.w3.org/2005/Atom"> <id>http://andreiri/MyProject.Data/Data.svc/Zone(75)</id> <title type="text"></title> <updated>2009-09-11T13:36:47Z</updated> <author> <name /> </author> <link rel="edit" title="Zone" href="Zone(75)" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/CenterZone" type="application/atom+xml;type=feed" title="CenterZone" href="Zone(75)/CenterZone" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/ZoneMobil" type="application/atom+xml;type=feed" title="ZoneMobil" href="Zone(75)/ZoneMobil" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/StareZone" type="application/atom+xml;type=entry" title="StareZone" href="Zone(75)/StareZone" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/TipZone" type="application/atom+xml;type=entry" title="TipZone" href="Zone(75)/TipZone" /> <link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/User" type="application/atom+xml;type=entry" title="User" href="Zone(75)/User" /> <category term="MyProject123Model.Zone" scheme="http://schemas.microsoft.com ado/2007/08/dataservices/scheme" /> <content type="application/xml"> <m:properties> <d:IdZone m:type="Edm.Int32">75</d:IdZone> <d:X_Post m:type="Edm.Decimal">587647.4705</d:X_Post> <d:Y_Post m:type="Edm.Decimal">325783.077599999</d:Y_Post> <d:X_Repost 
m:type="Edm.Decimal" m:null="true" /> <d:Y_Repost m:type="Edm.Decimal" m:null="true" /> <d:Data m:type="Edm.DateTime">2009-09-11T16:36:40.588951+03:00</d:Data> <d:Detalii>aslkdfjasldkfj</d:Detalii> <d:IdElement m:type="Edm.Int32">1</d:IdElement> </m:properties> </content> </entry> Why do I get an exception? And, using this in a clean project does not throw the exception..

    Read the article

  • javafx tableview get selected data from ObservableList

    - by user3717821
    i am working on a javafx project and i need your help . while i am trying to get selected data from table i can get selected data from normal cell but can't get data from ObservableList inside tableview. code for my database: -- phpMyAdmin SQL Dump -- version 4.0.4 -- http://www.phpmyadmin.net -- -- Host: localhost -- Generation Time: Jun 10, 2014 at 06:20 AM -- Server version: 5.1.33-community -- PHP Version: 5.4.12 SET SQL_MODE = "NO_AUTO_VALUE_ON_ZERO"; SET time_zone = "+00:00"; /*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */; /*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */; /*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */; /*!40101 SET NAMES utf8 */; -- -- Database: `test` -- -- -------------------------------------------------------- -- -- Table structure for table `customer` -- CREATE TABLE IF NOT EXISTS `customer` ( `col0` int(11) NOT NULL, `col1` varchar(255) DEFAULT NULL, `col2` int(11) DEFAULT NULL, PRIMARY KEY (`col0`) ) ENGINE=InnoDB DEFAULT CHARSET=latin1; -- -- Dumping data for table `customer` -- INSERT INTO `customer` (`col0`, `col1`, `col2`) VALUES (12, 'adasdasd', 231), (22, 'adasdasd', 231), (212, 'adasdasd', 231); /*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */; /*!40101 SET CHARACTER_SET_RESULTS=@OLD_CHARACTER_SET_RESULTS */; /*!40101 SET COLLATION_CONNECTION=@OLD_COLLATION_CONNECTION */; my javafx codes: import java.sql.Connection; import java.sql.DriverManager; import java.sql.ResultSet; import java.sql.SQLException; import java.util.Map; import javafx.application.Application; import javafx.beans.property.SimpleStringProperty; import javafx.beans.value.ChangeListener; import javafx.beans.value.ObservableValue; import javafx.collections.FXCollections; import javafx.collections.ObservableList; import javafx.event.ActionEvent; import javafx.event.EventHandler; import javafx.scene.Scene; import javafx.scene.control.Button; import javafx.scene.control.TableCell; import javafx.scene.control.TableColumn; import javafx.scene.control.TableColumn.CellDataFeatures; import javafx.scene.control.TablePosition; import javafx.scene.control.TableView; import javafx.scene.control.TableView.TableViewSelectionModel; import javafx.scene.control.cell.ChoiceBoxTableCell; import javafx.scene.control.cell.TextFieldTableCell; import javafx.scene.layout.BorderPane; import javafx.stage.Stage; import javafx.util.Callback; import javafx.util.StringConverter; class DBConnector { private static Connection conn; private static String url = "jdbc:mysql://localhost/test"; private static String user = "root"; private static String pass = "root"; public static Connection connect() throws SQLException{ try{ Class.forName("com.mysql.jdbc.Driver").newInstance(); }catch(ClassNotFoundException cnfe){ System.err.println("Error: "+cnfe.getMessage()); }catch(InstantiationException ie){ System.err.println("Error: "+ie.getMessage()); }catch(IllegalAccessException iae){ System.err.println("Error: "+iae.getMessage()); } conn = DriverManager.getConnection(url,user,pass); return conn; } public static Connection getConnection() throws SQLException, ClassNotFoundException{ if(conn !=null && !conn.isClosed()) return conn; connect(); return conn; } } public class DynamicTable extends Application{ Object newValue; //TABLE VIEW AND DATA private ObservableList<ObservableList> data; private TableView<ObservableList> tableview; //MAIN EXECUTOR public static void main(String[] args) { launch(args); } //CONNECTION DATABASE public void buildData(){ 
tableview.setEditable(true); Callback<TableColumn<Map, String>, TableCell<Map, String>> cellFactoryForMap = new Callback<TableColumn<Map, String>, TableCell<Map, String>>() { @Override public TableCell call(TableColumn p) { return new TextFieldTableCell(new StringConverter() { @Override public String toString(Object t) { return t.toString(); } @Override public Object fromString(String string) { return string; } }); } }; Connection c ; data = FXCollections.observableArrayList(); try{ c = DBConnector.connect(); //SQL FOR SELECTING ALL OF CUSTOMER String SQL = "SELECT * from CUSTOMer"; //ResultSet ResultSet rs = c.createStatement().executeQuery(SQL); /********************************** * TABLE COLUMN ADDED DYNAMICALLY * **********************************/ for(int i=0 ; i<rs.getMetaData().getColumnCount(); i++){ //We are using non property style for making dynamic table final int j = i; TableColumn col = new TableColumn(rs.getMetaData().getColumnName(i+1)); if(j==1){ final ObservableList<String> logLevelList = FXCollections.observableArrayList("FATAL", "ERROR", "WARN", "INFO", "INOUT", "DEBUG"); col.setCellFactory(ChoiceBoxTableCell.forTableColumn(logLevelList)); tableview.getColumns().addAll(col); } else{ col.setCellValueFactory(new Callback<CellDataFeatures<ObservableList,String>,ObservableValue<String>>(){ public ObservableValue<String> call(CellDataFeatures<ObservableList, String> param) { return new SimpleStringProperty(param.getValue().get(j).toString()); } }); tableview.getColumns().addAll(col); } if(j!=1) col.setCellFactory(cellFactoryForMap); System.out.println("Column ["+i+"] "); } /******************************** * Data added to ObservableList * ********************************/ while(rs.next()){ //Iterate Row ObservableList<String> row = FXCollections.observableArrayList(); for(int i=1 ; i<=rs.getMetaData().getColumnCount(); i++){ //Iterate Column row.add(rs.getString(i)); } System.out.println("Row [1] added "+row ); data.add(row); } //FINALLY ADDED TO TableView tableview.setItems(data); }catch(Exception e){ e.printStackTrace(); System.out.println("Error on Building Data"); } } @Override public void start(Stage stage) throws Exception { //TableView Button showDataButton = new Button("Add"); showDataButton.setOnAction(new EventHandler<ActionEvent>() { public void handle(ActionEvent event) { ObservableList<String> row = FXCollections.observableArrayList(); for(int i=1 ; i<=3; i++){ //Iterate Column row.add("asdasd"); } data.add(row); //FINALLY ADDED TO TableView tableview.setItems(data); } }); tableview = new TableView(); buildData(); //Main Scene BorderPane root = new BorderPane(); root.setCenter(tableview); root.setBottom(showDataButton); Scene scene = new Scene(root,500,500); stage.setScene(scene); stage.show(); tableview.getSelectionModel().selectedItemProperty().addListener(new ChangeListener() { @Override public void changed(ObservableValue observableValue, Object oldValue, Object newValue) { //Check whether item is selected and set value of selected item to Label if (tableview.getSelectionModel().getSelectedItem() != null) { TableViewSelectionModel selectionModel = tableview.getSelectionModel(); ObservableList selectedCells = selectionModel.getSelectedCells(); TablePosition tablePosition = (TablePosition) selectedCells.get(0); Object val = tablePosition.getTableColumn().getCellData(newValue); System.out.println("Selected Value " + val); System.out.println("Selected row " + newValue); } } }); } } please help me..
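    Because each row in this design is itself an ObservableList of the column values, the item returned by the selection model already is the whole selected row; going through TablePosition is only needed for a single cell. A small helper along those lines, assuming the same TableView<ObservableList> and column order as the code above:

        import javafx.collections.ObservableList;
        import javafx.scene.control.TableView;

        public class SelectionHelper {

            // Prints every cell of the currently selected row. Each row in the
            // posted code is an ObservableList of Strings holding the column
            // values in ResultSet order (col0, col1, col2).
            public static void printSelectedRow(TableView<ObservableList> tableview) {
                ObservableList row = tableview.getSelectionModel().getSelectedItem();
                if (row == null) {
                    return; // nothing selected
                }
                for (int i = 0; i < row.size(); i++) {
                    System.out.println(tableview.getColumns().get(i).getText()
                            + " = " + row.get(i));
                }
            }
        }

    Calling printSelectedRow(tableview) from the existing changed(...) listener, or simply row.get(1) for the second column, should give the selected data directly.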

    Read the article

  • SQL Monitor’s data repository: Alerts

    - by Chris Lambrou
    In my previous post, I introduced the SQL Monitor data repository, and described how the monitored objects are stored in a hierarchy in the data schema, in a series of tables with a _Keys suffix. In this post I had planned to describe how the actual data for the monitored objects is stored in corresponding tables with _StableSamples and _UnstableSamples suffixes. However, I’m going to postpone that until my next post, as I’ve had a request from a SQL Monitor user to explain how alerts are stored. In the SQL Monitor data repository, alerts are stored in tables belonging to the alert schema, which contains the following five tables: alert.Alert alert.Alert_Cleared alert.Alert_Comment alert.Alert_Severity alert.Alert_Type In this post, I’m only going to cover the alert.Alert and alert.Alert_Type tables. I may cover the other three tables in a later post. The most important table in this schema is alert.Alert, as each row in this table corresponds to a single alert. So let’s have a look at it. SELECT TOP 100 AlertId, AlertType, TargetObject, [Read], SubType FROM alert.Alert ORDER BY AlertId DESC;  AlertIdAlertTypeTargetObjectReadSubType 165550397:Cluster,1,4:Name,s29:srp-mr03.testnet.red-gate.com,9:SqlServer,1,4:Name,s0:,10 265549387:Cluster,1,4:Name,s29:srp-mr03.testnet.red-gate.com,7:Machine,1,4:Name,s0:,10 365548187:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 465547157:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 565546147:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 665545187:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 765544157:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 865543147:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 965542187:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s4:msdb,00 1065541147:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s4:msdb,00 11…     So what are we seeing here, then? Well, AlertId is an auto-incrementing identity column, so ORDER BY AlertId DESC ensures that we see the most recent alerts first. AlertType indicates the type of each alert, such as Job failed (6), Backup overdue (14) or Long-running query (12). The TargetObject column indicates which monitored object the alert is associated with. The Read column acts as a flag to indicate whether or not the alert has been read. And finally the SubType column is used in the case of a Custom metric (40) alert, to indicate which custom metric the alert pertains to. Okay, now lets look at some of those columns in more detail. The AlertType column is an easy one to start with, and it brings use nicely to the next table, data.Alert_Type. Let’s have a look at what’s in this table: SELECT AlertType, Event, Monitoring, Name, Description FROM alert.Alert_Type ORDER BY AlertType;  AlertTypeEventMonitoringNameDescription 1100Processor utilizationProcessor utilization (CPU) on a host machine stays above a threshold percentage for longer than a specified duration 2210SQL Server error log entryAn error is written to the SQL Server error log with a severity level above a specified value. 3310Cluster failoverThe active cluster node fails, causing the SQL Server instance to switch nodes. 4410DeadlockSQL deadlock occurs. 
5500Processor under-utilizationProcessor utilization (CPU) on a host machine remains below a threshold percentage for longer than a specified duration 6610Job failedA job does not complete successfully (the job returns an error code). 7700Machine unreachableHost machine (Windows server) cannot be contacted on the network. 8800SQL Server instance unreachableThe SQL Server instance is not running or cannot be contacted on the network. 9900Disk spaceDisk space used on a logical disk drive is above a defined threshold for longer than a specified duration. 101000Physical memoryPhysical memory (RAM) used on the host machine stays above a threshold percentage for longer than a specified duration. 111100Blocked processSQL process is blocked for longer than a specified duration. 121200Long-running queryA SQL query runs for longer than a specified duration. 131400Backup overdueNo full backup exists, or the last full backup is older than a specified time. 141500Log backup overdueNo log backup exists, or the last log backup is older than a specified time. 151600Database unavailableDatabase changes from Online to any other state. 161700Page verificationTorn Page Detection or Page Checksum is not enabled for a database. 171800Integrity check overdueNo entry for an integrity check (DBCC DBINFO returns no date for dbi_dbccLastKnownGood field), or the last check is older than a specified time. 181900Fragmented indexesFragmentation level of one or more indexes is above a threshold percentage. 192400Job duration unusualThe duration of a SQL job duration deviates from its baseline duration by more than a threshold percentage. 202501Clock skewSystem clock time on the Base Monitor computer differs from the system clock time on a monitored SQL Server host machine by a specified number of seconds. 212700SQL Server Agent Service statusThe SQL Server Agent Service status matches the status specified. 222800SQL Server Reporting Service statusThe SQL Server Reporting Service status matches the status specified. 232900SQL Server Full Text Search Service statusThe SQL Server Full Text Search Service status matches the status specified. 243000SQL Server Analysis Service statusThe SQL Server Analysis Service status matches the status specified. 253100SQL Server Integration Service statusThe SQL Server Integration Service status matches the status specified. 263300SQL Server Browser Service statusThe SQL Server Browser Service status matches the status specified. 273400SQL Server VSS Writer Service statusThe SQL Server VSS Writer status matches the status specified. 283501Deadlock trace flag disabledThe monitored SQL Server’s trace flag cannot be enabled. 293600Monitoring stopped (host machine credentials)SQL Monitor cannot contact the host machine because authentication failed. 303700Monitoring stopped (SQL Server credentials)SQL Monitor cannot contact the SQL Server instance because authentication failed. 313800Monitoring error (host machine data collection)SQL Monitor cannot collect data from the host machine. 323900Monitoring error (SQL Server data collection)SQL Monitor cannot collect data from the SQL Server instance. 334000Custom metricThe custom metric value has passed an alert threshold. 344100Custom metric collection errorSQL Monitor cannot collect custom metric data from the target object. 
Basically, alert.Alert_Type is just a big reference table containing information about the 34 different alert types supported by SQL Monitor (note that the largest id is 41, not 34 – some alert types have been retired since SQL Monitor was first developed). The Name and Description columns are self evident, and I’m going to skip over the Event and Monitoring columns as they’re not very interesting. The AlertId column is the primary key, and is referenced by AlertId in the alert.Alert table. As such, we can rewrite our earlier query to join these two tables, in order to provide a more readable view of the alerts: SELECT TOP 100 AlertId, Name, TargetObject, [Read], SubType FROM alert.Alert a JOIN alert.Alert_Type at ON a.AlertType = at.AlertType ORDER BY AlertId DESC;  AlertIdNameTargetObjectReadSubType 165550Monitoring error (SQL Server data collection)7:Cluster,1,4:Name,s29:srp-mr03.testnet.red-gate.com,9:SqlServer,1,4:Name,s0:,00 265549Monitoring error (host machine data collection)7:Cluster,1,4:Name,s29:srp-mr03.testnet.red-gate.com,7:Machine,1,4:Name,s0:,00 365548Integrity check overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 465547Log backup overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 565546Backup overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s15:FavouriteThings,00 665545Integrity check overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 765544Log backup overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 865543Backup overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,00 965542Integrity check overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s4:msdb,00 1065541Backup overdue7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s4:msdb,00 Okay, the next column to discuss in the alert.Alert table is TargetObject. Oh boy, this one’s a bit tricky! The TargetObject of an alert is a serialized string representation of the position in the monitored object hierarchy of the object to which the alert pertains. The serialization format is somewhat convenient for parsing in the C# source code of SQL Monitor, and has some helpful characteristics, but it’s probably very awkward to manipulate in T-SQL. I could document the serialization format here, but it would be very dry reading, so perhaps it’s best to consider an example from the table above. Have a look at the alert with an AlertID of 65543. It’s a Backup overdue alert for the SqlMonitorData database running on the default instance of granger, my laptop. Each different alert type is associated with a specific type of monitored object in the object hierarchy (I described the hierarchy in my previous post). The Backup overdue alert is associated with databases, whose position in the object hierarchy is root → Cluster → SqlServer → Database. The TargetObject value identifies the target object by specifying the key properties at each level in the hierarchy, thus: Cluster: Name = "granger" SqlServer: Name = "" (an empty string, denoting the default instance) Database: Name = "SqlMonitorData" Well, look at the actual TargetObject value for this alert: "7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s14:SqlMonitorData,". 
It is indeed composed of three parts, one for each level in the hierarchy: Cluster: "7:Cluster,1,4:Name,s7:granger," SqlServer: "9:SqlServer,1,4:Name,s0:," Database: "8:Database,1,4:Name,s14:SqlMonitorData," Each part is handled in exactly the same way, so let’s concentrate on the first part, "7:Cluster,1,4:Name,s7:granger,". It comprises the following: "7:Cluster," – This identifies the level in the hierarchy. "1," – This indicates how many different key properties there are to uniquely identify a cluster (we saw in my last post that each cluster is identified by a single property, its Name). "4:Name,s14:SqlMonitorData," – This represents the Name property, and its corresponding value, SqlMonitorData. It’s split up like this: "4:Name," – Indicates the name of the key property. "s" – Indicates the type of the key property, in this case, it’s a string. "14:SqlMonitorData," – Indicates the value of the property. At this point, you might be wondering about the format of some of these strings. Why is the string "Cluster" stored as "7:Cluster,"? Well an encoding scheme is used, which consists of the following: "7" – This is the length of the string "Cluster" ":" – This is a delimiter between the length of the string and the actual string’s contents. "Cluster" – This is the string itself. 7 characters. "," – This is a final terminating character that indicates the end of the encoded string. You can see that "4:Name,", "8:Database," and "14:SqlMonitorData," also conform to the same encoding scheme. In the example above, the "s" character is used to indicate that the value of the Name property is a string. If you explore the TargetObject property of alerts in your own SQL Monitor data repository, you might find other characters used for other non-string key property values. The different value types you might possibly encounter are as follows: "I" – Denotes a bigint value. For example, "I65432,". "g" – Denotes a GUID value. For example, "g32116732-63ae-4ab5-bd34-7dfdfb084c18,". "d" – Denotes a datetime value. For example, "d634815384796832438,". The value is stored as a bigint, rather than a native SQL datetime value. I’ll describe how datetime values are handled in the SQL Monitor data repostory in a future post. I suggest you have a look at the alerts in your own SQL Monitor data repository for further examples, so you can see how the TargetObject values are composed for each of the different types of alert. Let me give one further example, though, that represents a Custom metric alert, as this will help in describing the final column of interest in the alert.Alert table, SubType. Let me show you the alert I’m interested in: SELECT AlertId, a.AlertType, Name, TargetObject, [Read], SubType FROM alert.Alert a JOIN alert.Alert_Type at ON a.AlertType = at.AlertType WHERE AlertId = 65769;  AlertIdAlertTypeNameTargetObjectReadSubType 16576940Custom metric7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s6:master,12:CustomMetric,1,8:MetricId,I2,02 An AlertType value of 40 corresponds to the Custom metric alert type. The Name taken from the alert.Alert_Type table is simply Custom metric, but this doesn’t tell us anything about the specific custom metric that this alert pertains to. That’s where the SubType value comes in. For custom metric alerts, this provides us with the Id of the specific custom alert definition that can be found in the settings.CustomAlertDefinitions table. 
I don’t really want to delve into custom alert definitions yet (maybe in a later post), but an extra join in the previous query shows us that this alert pertains to the CPU pressure (avg runnable task count) custom metric alert. SELECT AlertId, a.AlertType, at.Name, cad.Name AS CustomAlertName, TargetObject, [Read], SubType FROM alert.Alert a JOIN alert.Alert_Type at ON a.AlertType = at.AlertType JOIN settings.CustomAlertDefinitions cad ON a.SubType = cad.Id WHERE AlertId = 65769;  AlertIdAlertTypeNameCustomAlertNameTargetObjectReadSubType 16576940Custom metricCPU pressure (avg runnable task count)7:Cluster,1,4:Name,s7:granger,9:SqlServer,1,4:Name,s0:,8:Database,1,4:Name,s6:master,12:CustomMetric,1,8:MetricId,I2,02 The TargetObject value in this case breaks down like this: "7:Cluster,1,4:Name,s7:granger," – Cluster named "granger". "9:SqlServer,1,4:Name,s0:," – SqlServer named "" (the default instance). "8:Database,1,4:Name,s6:master," – Database named "master". "12:CustomMetric,1,8:MetricId,I2," – Custom metric with an Id of 2. Note that the hierarchy for a custom metric is slightly different compared to the earlier Backup overdue alert. It’s root → Cluster → SqlServer → Database → CustomMetric. Also notice that, unlike Cluster, SqlServer and Database, the key property for CustomMetric is called MetricId (not Name), and the value is a bigint (not a string). Finally, delving into the custom metric tables is beyond the scope of this post, but for the sake of avoiding any future confusion, I’d like to point out that whilst the SubType references a custom alert definition, the MetricID value embedded in the TargetObject value references a custom metric definition. Although in this case both the custom metric definition and custom alert definition share the same Id value of 2, this is not generally the case. Okay, that’s enough for now, not least because as I’m typing this, it’s almost 2am, I have to go to work tomorrow, and my alarm is set for 6am – eek! In my next post, I’ll either cover the remaining three tables in the alert schema, or I’ll delve into the way SQL Monitor stores its monitoring data, as I’d originally planned to cover in this post.

    Read the article

  • It's not just “Single Sign-on” by Steve Knott (aurionPro SENA)

    - by Greg Jensen
    It is true that Oracle Enterprise Single Sign-on (Oracle ESSO) started out as purely an application single sign-on tool but as we have seen in the previous articles in this series the product has matured into a suite of tools that can do more than just automated single sign-on and can also provide rapidly deployed, cost effective solution to many demanding password management problems. In the last article of this series I would like to discuss three cases where customers faced password scenarios that required more than just single sign-on and how some of the less well known tools in the Oracle ESSO suite “kitbag” helped solve these challenges. Case #1 One of the issues often faced by our customers is how to keep their applications compliant. I had a client who liked the idea of automated single sign-on for most of his applications but had a key requirement to actually increase the security for one specific SOX application. For the SOX application he wanted to secure access by using two-factor authentication with a smartcard. The problem was that the application did not support two-factor authentication. The solution was to use a feature from the Oracle ESSO suite called authentication manager. This feature enables you to have multiple authentication methods for the same user which in this case was a smartcard and the Windows password.  Within authentication manager each authenticator can be configured with a security grade so we gave the smartcard a high grade and the Windows password a normal grade. Security grading in Oracle ESSO can be configured on a per application basis so we set the SOX application to require the higher grade smartcard authenticator. The end result for the user was that they enjoyed automated single sign-on for most of the applications apart from the SOX application. When the SOX application was launched, the user was required by ESSO to present their smartcard before being given access to the application. Case #2 Another example solving compliance issues was in the case of a large energy company who had a number of core billing applications. New regulations required that users change their password regularly and use a complex password. The problem facing the customer was that the core billing applications did not have any native user password change functionality. The customer could not replace the core applications because of the cost and time required to re-develop them. With a reputation for innovation aurionPro SENA were approached to provide a solution to this problem using Oracle ESSO. Oracle ESSO has a password expiry feature that can be triggered periodically based on the timestamp of the users’ last password creation therefore our strategy here was to leverage this feature to provide the password change experience. The trigger can launch an application change password event however in this scenario there was no native change password feature that could be launched therefore a “dummy” change password screen was created that could imitate the missing change password function and connect to the application database on behalf of the user. Oracle ESSO was configured to trigger a change password event every 60 days. After this period if the user launched the application Oracle ESSO would detect the logon screen and invoke the password expiry feature. Oracle ESSO would trigger the “dummy screen,” detect it automatically as the application change password screen and insert a complex password on behalf of the user. 
After the password event had completed the user was logged on to the application with their new password. All this was provided at a fraction of the cost of re-developing the core applications. Case #3 Recent popular initiatives such as the BYOD and working from home schemes bring with them many challenges in administering “unmanaged machines” and sometimes “unmanageable users.” In a recent case, a client had a dispersed community of casual contractors who worked for the business using their own laptops to access applications. To improve security the around password management the security goal was to provision the passwords directly to these contractors. In a previous article we saw how Oracle ESSO has the capability to provision passwords through Provisioning Gateway but the challenge in this scenario was how to get the Oracle ESSO agent to the casual contractor on an unmanaged machine. The answer was to use another tool in the suite, Oracle ESSO Anywhere. This component can compile the normal Oracle ESSO functionality into a deployment package that can be made available from a website in a similar way to a streamed application. The ESSO Anywhere agent does not actually install into the registry or program files but runs in a folder within the user’s profile therefore no local administrator rights are required for installation. The ESSO Anywhere package can also be configured to stay persistent or disable itself at the end of the user’s session. In this case the user just needed to be told where the website package was located and download the package. Once the download was complete the agent started automatically and the user was provided with single sign-on to their applications without ever knowing the application passwords. Finally, as we have seen in these series Oracle ESSO not only has great utilities in its own tool box but also has direct integration with Oracle Privileged Account Manager, Oracle Identity Manager and Oracle Access Manager. Integrated together with these tools provides a complete and complementary platform to address even the most complex identity and access management requirements. So what next for Oracle ESSO? “Agentless ESSO available in the cloud” – but that will be a subject for a future Oracle ESSO series!                                                                                                                               

    Read the article

  • Three ways to make your RAC cluster more stable

    - by Allen Gao
    Normal 0 7.8 ? 0 2 false false false EN-US ZH-CN X-NONE DefSemiHidden="true" DefQFormat="false" DefPriority="99" LatentStyleCount="267" UnhideWhenUsed="false" QFormat="true" Name="Normal"/ UnhideWhenUsed="false" QFormat="true" Name="heading 1"/ UnhideWhenUsed="false" QFormat="true" Name="Title"/ UnhideWhenUsed="false" QFormat="true" Name="Subtitle"/ UnhideWhenUsed="false" QFormat="true" Name="Strong"/ UnhideWhenUsed="false" QFormat="true" Name="Emphasis"/ UnhideWhenUsed="false" Name="Table Grid"/ UnhideWhenUsed="false" QFormat="true" Name="No Spacing"/ UnhideWhenUsed="false" Name="Light Shading"/ UnhideWhenUsed="false" Name="Light List"/ UnhideWhenUsed="false" Name="Light Grid"/ UnhideWhenUsed="false" Name="Medium Shading 1"/ UnhideWhenUsed="false" Name="Medium Shading 2"/ UnhideWhenUsed="false" Name="Medium List 1"/ UnhideWhenUsed="false" Name="Medium List 2"/ UnhideWhenUsed="false" Name="Medium Grid 1"/ UnhideWhenUsed="false" Name="Medium Grid 2"/ UnhideWhenUsed="false" Name="Medium Grid 3"/ UnhideWhenUsed="false" Name="Dark List"/ UnhideWhenUsed="false" Name="Colorful Shading"/ UnhideWhenUsed="false" Name="Colorful List"/ UnhideWhenUsed="false" Name="Colorful Grid"/ UnhideWhenUsed="false" Name="Light Shading Accent 1"/ UnhideWhenUsed="false" Name="Light List Accent 1"/ UnhideWhenUsed="false" Name="Light Grid Accent 1"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 1"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 1"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 1"/ UnhideWhenUsed="false" QFormat="true" Name="List Paragraph"/ UnhideWhenUsed="false" QFormat="true" Name="Quote"/ UnhideWhenUsed="false" QFormat="true" Name="Intense Quote"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 1"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 1"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 1"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 1"/ UnhideWhenUsed="false" Name="Dark List Accent 1"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 1"/ UnhideWhenUsed="false" Name="Colorful List Accent 1"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 1"/ UnhideWhenUsed="false" Name="Light Shading Accent 2"/ UnhideWhenUsed="false" Name="Light List Accent 2"/ UnhideWhenUsed="false" Name="Light Grid Accent 2"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 2"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 2"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 2"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 2"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 2"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 2"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 2"/ UnhideWhenUsed="false" Name="Dark List Accent 2"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 2"/ UnhideWhenUsed="false" Name="Colorful List Accent 2"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 2"/ UnhideWhenUsed="false" Name="Light Shading Accent 3"/ UnhideWhenUsed="false" Name="Light List Accent 3"/ UnhideWhenUsed="false" Name="Light Grid Accent 3"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 3"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 3"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 3"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 3"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 3"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 3"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 3"/ UnhideWhenUsed="false" Name="Dark List Accent 3"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 3"/ UnhideWhenUsed="false" 
Name="Colorful List Accent 3"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 3"/ UnhideWhenUsed="false" Name="Light Shading Accent 4"/ UnhideWhenUsed="false" Name="Light List Accent 4"/ UnhideWhenUsed="false" Name="Light Grid Accent 4"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 4"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 4"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 4"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 4"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 4"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 4"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 4"/ UnhideWhenUsed="false" Name="Dark List Accent 4"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 4"/ UnhideWhenUsed="false" Name="Colorful List Accent 4"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 4"/ UnhideWhenUsed="false" Name="Light Shading Accent 5"/ UnhideWhenUsed="false" Name="Light List Accent 5"/ UnhideWhenUsed="false" Name="Light Grid Accent 5"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 5"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 5"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 5"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 5"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 5"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 5"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 5"/ UnhideWhenUsed="false" Name="Dark List Accent 5"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 5"/ UnhideWhenUsed="false" Name="Colorful List Accent 5"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 5"/ UnhideWhenUsed="false" Name="Light Shading Accent 6"/ UnhideWhenUsed="false" Name="Light List Accent 6"/ UnhideWhenUsed="false" Name="Light Grid Accent 6"/ UnhideWhenUsed="false" Name="Medium Shading 1 Accent 6"/ UnhideWhenUsed="false" Name="Medium Shading 2 Accent 6"/ UnhideWhenUsed="false" Name="Medium List 1 Accent 6"/ UnhideWhenUsed="false" Name="Medium List 2 Accent 6"/ UnhideWhenUsed="false" Name="Medium Grid 1 Accent 6"/ UnhideWhenUsed="false" Name="Medium Grid 2 Accent 6"/ UnhideWhenUsed="false" Name="Medium Grid 3 Accent 6"/ UnhideWhenUsed="false" Name="Dark List Accent 6"/ UnhideWhenUsed="false" Name="Colorful Shading Accent 6"/ UnhideWhenUsed="false" Name="Colorful List Accent 6"/ UnhideWhenUsed="false" Name="Colorful Grid Accent 6"/ UnhideWhenUsed="false" QFormat="true" Name="Subtle Emphasis"/ UnhideWhenUsed="false" QFormat="true" Name="Intense Emphasis"/ UnhideWhenUsed="false" QFormat="true" Name="Subtle Reference"/ UnhideWhenUsed="false" QFormat="true" Name="Intense Reference"/ UnhideWhenUsed="false" QFormat="true" Name="Book Title"/ /* Style Definitions */ table.MsoNormalTable {mso-style-name:????; mso-tstyle-rowband-size:0; mso-tstyle-colband-size:0; mso-style-noshow:yes; mso-style-priority:99; mso-style-qformat:yes; mso-style-parent:""; mso-padding-alt:0cm 5.4pt 0cm 5.4pt; mso-para-margin:0cm; mso-para-margin-bottom:.0001pt; mso-pagination:widow-orphan; font-size:10.5pt; mso-bidi-font-size:11.0pt; font-family:"Calibri","sans-serif"; mso-ascii-font-family:Calibri; mso-ascii-theme-font:minor-latin; mso-hansi-font-family:Calibri; mso-hansi-theme-font:minor-latin; mso-bidi-font-family:"Times New Roman"; mso-bidi-theme-font:minor-bidi; mso-font-kerning:1.0pt;} ????????????????????????????????????????,??????????????Oracle RAC?????????????????????????????,???????????????????,??????RAC???????????,????????????????????????????????????,????3???RAC????????????? 
These recommendations come from the MOS note "Top 11 Things to do NOW to Stabilize your RAC Cluster Environment" (DOC ID 1344678.1). A large share of the cluster problems reported to Oracle Support could have been avoided if just these three points had been followed. They do not replace the full note, but they are the items with the highest payoff and deserve the attention of every RAC DBA.

1. Keep up with patch set updates (PSU) and apply the latest PSU proactively

Oracle releases a Patch Set Update (PSU) for the database and the clusterware every quarter. Each PSU bundles the most important fixes accumulated for that release, and a significant share of those fixes address RAC and clusterware (Cluster) bugs. We therefore recommend applying the latest PSU proactively, at least every six months, rather than waiting until a known bug is hit in production. When applying a PSU in a RAC environment, keep the following in mind:

- The PSU needs to be applied to the Grid Infrastructure (GI) home as well as to the RDBMS home. The GI PSU bundle already contains the corresponding RDBMS PSU, so there is no need to download the RDBMS PSU separately when patching both homes.
- In a RAC environment the PSU can normally be installed in a rolling fashion, node by node, so the cluster as a whole stays available; check the PSU readme to confirm that rolling installation is supported for the patch you are applying.
- Apply the PSU in a test environment and verify your applications against it before rolling it out to production.

For more information about PSUs, refer to the following MOS notes:
NOTE 854428.1   Intro to Patch Set Updates (PSU)
NOTE 1082394.1 11.2.0.X Grid Infrastructure PSU Known Issues
NOTE 756671.1   Oracle Recommended Patches -- Oracle Database
NOTE 161549.1   Oracle Database, Networking and Grid Agent Patches for Microsoft Platforms
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit

2. For releases prior to 11gR2, set diagwait to 13

As of 2012, roughly 45% of customers were still running clusterware releases older than 11gR2, and on those releases setting diagwait to 13 is one of the simplest ways to make the cluster more stable. The reason lies in how the OPROCD daemon behaves:

By default, OPROCD runs with a 1-second timeout and a 0.5-second margin, so if OPROCD is kept off the CPU for more than about 1.5 seconds, it reboots the node. On a heavily loaded system, a brief scheduling delay is enough to trigger such a false reboot. With diagwait set to 13, the OPROCD margin grows to 10 seconds (diagwait minus the CSS reboottime, which defaults to 3 seconds), so the node is only rebooted after roughly 11 seconds (the 1-second timeout plus the 10-second margin). In addition, when a node is evicted or rebooted, the extra time allows the clusterware to flush more diagnostic information to its logs, which makes it much easier to find the root cause afterwards.

Starting with 11g Release 2 (11.2.0.1 and later), the clusterware monitors the node differently, so setting diagwait is no longer necessary. Note that changing diagwait requires stopping the clusterware on all nodes, because the setting is stored in the Oracle Cluster Registry (OCR); the typical change procedure is sketched after the monitoring section below. You can check the current value with:

# $CLUSTERWARE_HOME/bin/crsctl get css diagwait

For more information about DIAGWAIT, refer to the following MOS notes:
NOTE 567730.1  Changes in Oracle Clusterware on Linux with the 10.2.0.4 Patchset
NOTE 559365.1  Using Diagwait as a diagnostic to get more information for diagnosing Oracle Clusterware Node evictions
NOTE 810394.1  RAC and Oracle Clusterware Best Practices and Starter Kit

3. Install OS Watcher Black Box (OSWbb) and Cluster Health Monitor (CHM)

Many cluster problems, node evictions in particular, come down to OS resource starvation: CPU, memory, I/O or network. We therefore recommend installing OS Watcher Black Box (OSWbb, formerly known as OS Watcher) and Cluster Health Monitor (CHM) so that OS statistics are collected continuously on every node. Without this data it is often impossible to determine the root cause of a reboot or hang after the fact; with it, the DBA and Oracle Support can usually see which resource was exhausted at the time of the problem.

OSWbb is a lightweight set of shell scripts that periodically runs standard OS utilities such as vmstat, iostat, top and netstat and archives their output. By default it takes a snapshot every 30 seconds and keeps the archives for a limited number of hours; for RAC environments Oracle Support recommends a 20-second collection interval. The overhead of OSWbb is very small, so it can and should be left running permanently on all nodes.

Starting with 11.2.0.3, on most platforms (HP-UX being the exception), Cluster Health Monitor (CHM) is installed as part of the Oracle GI stack. The data CHM collects is similar to what OSWbb collects, but it does not completely replace OSWbb, so running both is still recommended. Oracle Support strongly recommends deploying OSWbb and/or CHM on all cluster nodes, keeping them running at all times, and configuring OSWbb to start automatically at boot (see NOTE 580513.1 "How To Start OS Watcher Black Box Every System Boot" for how to do that).
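Returning to the diagwait recommendation above: since the article only shows how to read the current value, here is a minimal sketch of how the setting is typically changed on a pre-11gR2 cluster. It assumes a maintenance window in which the clusterware can be stopped on every node; treat it as an outline and follow NOTE 559365.1 for the exact, version-specific steps.

    # Sketch only; follow MOS NOTE 559365.1 for the authoritative procedure.
    # 1. As root, stop the clusterware on ALL nodes:
    $CLUSTERWARE_HOME/bin/crsctl stop crs
    # 2. On one node only, while the stack is down, set diagwait to 13:
    $CLUSTERWARE_HOME/bin/crsctl set css diagwait 13 -force
    # 3. Verify the new value:
    $CLUSTERWARE_HOME/bin/crsctl get css diagwait
    # 4. Restart the clusterware on every node:
    $CLUSTERWARE_HOME/bin/crsctl start crs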
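To make the monitoring recommendation concrete, the sketch below shows how OSWbb is typically started and how CHM data can be pulled for a given time window. The 20-second interval, the 48-hour retention and the oswbb directory name are example values only (they assume the OSWbb archive has been extracted into the current directory); check NOTE 301137.1 for the options supported by your OSWbb version.

    # Start OSWbb with a 20-second snapshot interval, keeping 48 hours of archives:
    cd oswbb
    nohup ./startOSWbb.sh 20 48 &
    # Stop collection when it is no longer needed:
    ./stopOSWbb.sh
    # On 11.2.0.3+ installations where CHM ships with GI, dump the last 15 minutes
    # of CHM data for all nodes into a file for analysis:
    oclumon dumpnodeview -allnodes -last "00:15:00" > chm_snapshot.txt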
For more information about OSWbb and CHM, refer to the following MOS notes:
NOTE 301137.1   OS Watcher Black Box User Guide
NOTE 1328466.1 Cluster Health Monitor (CHM) FAQ
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit

Summary

This article introduced three recommendations for keeping a RAC / Oracle Clusterware environment stable. Following these three points will prevent a large share of the cluster problems we see in support. For the full list of recommendations, refer to the MOS note:
NOTE 1344678.1 Top 11 Things to do NOW to Stabilize your RAC Cluster Environment
In addition, you are welcome to visit the RAC/Scalability community on MOS to discuss RAC and Oracle Clusterware questions directly with Oracle engineers.

    Read the article

  • Why "Algorithms" and "Data Structures" are treated as separate disciplines?

    - by Pavel Shved
    - by Pavel Shved
    This question was the last straw; I've been wondering about it for a long time: why do people think of "Algorithms" and "Data Structures" as things that can be separated from each other? I see a lot of evidence that they're separated in programmers' minds: they request "Data Structures & Algorithms" books, they refer to "Data Structures" and "Algorithms" as separate university courses, they "know Algorithms" but are "weak in Data Structures" (can't find the link, sorry), etc. In my opinion, "Data Structures" are algorithms, since the concept of a "Data Structure" is about the algorithms that operate on the data going in and out of the structure. But that opinion doesn't seem to be mainstream. What am I missing?

    Read the article

  • What are the lesser known but cool data structures ?

    - by f3lix
    There are some data structures around that are really cool but are unknown to most programmers. Which are they? Everybody knows linked lists, binary trees, and hashes, but what about skip lists and Bloom filters, for example? I would like to know more data structures that are not so common, but are worth knowing because they rely on great ideas and enrich a programmer's tool box. PS: I am also interested in techniques like Dancing Links, which make interesting use of the properties of a common data structure. EDIT: Please try to include links to pages describing the data structures in more detail. Also, try to add a couple of words on why a data structure is cool (as Jonas Kölker already pointed out). Also, try to provide one data structure per answer. This will allow the better data structures to float to the top based on their votes alone.

    Read the article

  • Is there a standard practice for storing default application data?

    - by Rox Wen
    Our application includes a default set of data. The default data includes coefficients and other factors that are unlikely to ever change but still need to be updatable by the user. Currently, the original default data is stored as a populated class within the application, and data updates are stored to an external XML file. This design allows us to include a "reset" feature to restore the original default data. Our rationale for not storing the defaults externally [e.g. in an XML file] was to minimize the risk of their being altered. The overall volume of data doesn't warrant a database. Is there a standard practice for storing "default" application data?

    Read the article

< Previous Page | 309 310 311 312 313 314 315 316 317 318 319 320  | Next Page >