
Creating a huge number of pages at one time

Hello, I need to import data from an external source and create corresponding pages. It's about 3000 records, so there will be over 3000 pages, and I would like to do it in the most efficient way. Right now the operation takes a little too long. Here is the code (I am using the Microsoft Data Access Application Block):

    string sqlCommandString = "SELECT Title, SubTitle, Text FROM tblText";
    Database database = DatabaseFactory.CreateDatabase("DefaultConnection");
    DbCommand command = lazuriteDatabase.GetSqlStringCommand(sqlCommandString);

    // Here is the crucial part
    using (IDataReader reader = lazuriteDatabase.ExecuteReader(command))
    {
        PageType pageType = PageType.Load("Pigment Term");
        PageReference parent = new PageReference(5);
        Global.EPDataFactory.DeleteChildren(parent, true);

        while (reader.Read())
        {
            PageData newDataPage = Global.EPDataFactory.GetDefaultPageData(parent, pageType.ID);
            newDataPage.PageName = Convert.ToString(reader["Title"]);
            newDataPage.Property["Title"].Value = Convert.ToString(reader["Title"]);
            newDataPage.Property["Subtitle"].Value = Convert.ToString(reader["SubTitle"]);
            newDataPage.Property["BodyText"].Value = Convert.ToString(reader["Text"]);
            Global.EPDataFactory.Save(newDataPage, EPiServer.DataAccess.SaveAction.Save);
        }
    }

Is there a better way? Maybe if I first create all the pages in a PageDataCollection and only then call the Save method for each of them, it will be faster? Any ideas? Thanks a lot.
#13172
Aug 23, 2007 11:10
(Corrected code: the database variable was mistakenly referenced as lazuriteDatabase above.)

    string sqlCommandString = "SELECT Title, SubTitle, Text FROM tblText";
    Database database = DatabaseFactory.CreateDatabase("DefaultConnection");
    DbCommand command = database.GetSqlStringCommand(sqlCommandString);

    // Here is the crucial part
    using (IDataReader reader = database.ExecuteReader(command))
    {
        PageType pageType = PageType.Load("Pigment Term");
        PageReference parent = new PageReference(5);
        Global.EPDataFactory.DeleteChildren(parent, true);

        while (reader.Read())
        {
            PageData newDataPage = Global.EPDataFactory.GetDefaultPageData(parent, pageType.ID);
            newDataPage.PageName = Convert.ToString(reader["Title"]);
            newDataPage.Property["Title"].Value = Convert.ToString(reader["Title"]);
            newDataPage.Property["Subtitle"].Value = Convert.ToString(reader["SubTitle"]);
            newDataPage.Property["BodyText"].Value = Convert.ToString(reader["Text"]);
            Global.EPDataFactory.Save(newDataPage, EPiServer.DataAccess.SaveAction.Save);
        }
    }
#15492
Aug 23, 2007 11:11
Creating pages is never fast. Is this something you do regularly in a job, or are you importing the 3000 pages as a one-time deal? If it's one time, does it matter whether it takes a minute or two? I wouldn't expect that creating all the pages first and batch-saving them will improve performance, but please try it and let us know what you find out.
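For reference, the "build first, save after" idea from the question could be sketched roughly like this. This is only a sketch, assuming the same EPiServer API surface used in the original post (Global.EPDataFactory, PageDataCollection, and the reader, parent, and pageType variables from the question's code); it has not been run against a live site:

    // Sketch: collect all pages first, then save in a separate pass.
    // Note that EPiServer still saves one page per Save() call, so there
    // is no true bulk insert here; the gain over saving inside the
    // reader loop is likely small.
    PageDataCollection pages = new PageDataCollection();

    while (reader.Read())
    {
        PageData page = Global.EPDataFactory.GetDefaultPageData(parent, pageType.ID);
        page.PageName = Convert.ToString(reader["Title"]);
        page.Property["Title"].Value = Convert.ToString(reader["Title"]);
        page.Property["Subtitle"].Value = Convert.ToString(reader["SubTitle"]);
        page.Property["BodyText"].Value = Convert.ToString(reader["Text"]);
        pages.Add(page);
    }

    foreach (PageData page in pages)
    {
        Global.EPDataFactory.Save(page, EPiServer.DataAccess.SaveAction.Save);
    }

One upside of this shape even without a speed gain: the database reader is released before any saving starts, which keeps the connection open for less time.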
#15493
Aug 27, 2007 12:10