Success!!!
The final code is below; the line that made the difference was "string[] webs = web.Webs.Names;".
I changed it from "string[] webs = site.Webs.Names;", which was diving into the child sites to start validating instead of staying at the current site and validating against its own children. By switching from "site.Webs.Names" to "web.Webs.Names", the code correctly checks the requested name against the current web's own sub-sites. I also added a redirect once the new site is created, because otherwise the loop iterated back through and tried to create the site again after it already existed.
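To show just that change in isolation (the loop and the childName variable are unchanged from the full listing below):

// Before: validated against each sub-site's own children
// string[] webs = site.Webs.Names;

// After: validate against the current web's immediate children
string[] webs = web.Webs.Names;
if (webs != null && Array.IndexOf(webs, childName) >= 0)
{
    // the requested name already exists as a direct child of this web
}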
protected void btnCreateSite_Click(object sender, EventArgs e)
{
    // Grab the current user's token so the site collection can be re-opened
    // under an impersonated SPSite rather than the current request's context.
    SPWeb web = SPContext.Current.Site.OpenWeb();
    SPUser user = web.CurrentUser;
    SPUserToken token = user.UserToken;
    web.Close();
    web.Dispose();

    Guid guid = SPContext.Current.Site.ID;
    SPContext.Current.Web.AllowUnsafeUpdates = true;

    string siteUrl = "http://" + "localdev/" + txtSite.Text;
    bool bSiteExists = false;
    bool bSiteCreated = false;
    SPWeb childWeb = null;
    string siteAlreadyExists = "This site name already exists";
    string childName = txtSite.Text;

    // Re-open the site collection with the user's token.
    using (SPSite impersonatedSiteCollection = new SPSite(guid, token))
    {
        try
        {
            foreach (SPSite mysite in SPContext.Current.Site.WebApplication.Sites)
            {
                try
                {
                    SPWebCollection sites = impersonatedSiteCollection.AllWebs;

                    using (web = SPContext.Current.Web)
                    {
                        try
                        {
                            foreach (SPWeb site in web.Webs)
                            {
                                try
                                {
                                    // The fix: validate against the current web's immediate
                                    // children (web.Webs.Names), not against the children of
                                    // each sub-site (site.Webs.Names).
                                    string[] webs = web.Webs.Names;
                                    if (webs != null && Array.IndexOf(webs, childName) >= 0)
                                    {
                                        // The requested name already exists; go to it.
                                        childWeb = site.Webs[childName];
                                        Response.Redirect(siteUrl);
                                    }

                                    if (childWeb == null)
                                    {
                                        // Create the new sub-site under the impersonated context.
                                        using (SPWeb impersonatedSite = impersonatedSiteCollection.OpenWeb())
                                        {
                                            impersonatedSite.AllowUnsafeUpdates = true;
                                            SPWebCollection subSites = impersonatedSite.Webs;
                                            subSites.Add(txtSite.Text, txtSite.Text, txtDescriptipon.Text,
                                                1033, "STS#0", true, false);
                                            impersonatedSite.AllowUnsafeUpdates = false;
                                        }

                                        SPContext.Current.Web.AllowUnsafeUpdates = false;
                                        siteLink.Text = txtSite.Text;
                                        siteLink.NavigateUrl = "http://" + "localdev/" + txtSite.Text;

                                        // Redirect so the loops don't run again and try to
                                        // re-create the site that was just created.
                                        Response.Redirect(siteLink.NavigateUrl);
                                    }
                                    else
                                    {
                                        Label7.Text = "What you talkin' bout Willis?";
                                    }
                                }
                                finally
                                {
                                    site.Dispose();
                                }
                            }
                        }
                        finally
                        {
                        }
                    }
                }
                finally
                {
                    impersonatedSiteCollection.Dispose();
                }
            }
        }
        finally
        {
        }
    }
}
Now I need to consider:
A.) Why did the code wrap back around after the site was originally created? Was it because of the "foreach" loop? Is there a better way?
B.) What is the performance hit? Right now, during testing, there is minimal traffic and only 4 sub-sites to check against. In production there could be 10, 20, 100, or 1,000 sub-sites. How would this code hold up under that kind of load? One idea I'm considering is sketched below.
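A rough, untested sketch of a loop-free existence check. It reuses the guid, token, and childName variables from the code above and relies on SPSite.OpenWeb taking a server-relative URL plus the SPWeb.Exists property, so the cost no longer grows with the number of existing sub-sites:

// Open the site collection with the impersonated token, as in the handler above.
using (SPSite impersonatedSiteCollection = new SPSite(guid, token))
using (SPWeb parent = impersonatedSiteCollection.OpenWeb())
{
    // Ask for the candidate web directly instead of looping over web.Webs.
    string childUrl = parent.ServerRelativeUrl.TrimEnd('/') + "/" + childName;

    bool alreadyExists;
    using (SPWeb candidate = impersonatedSiteCollection.OpenWeb(childUrl))
    {
        // OpenWeb does not throw for a missing web; Exists reports whether it is real.
        alreadyExists = candidate.Exists;
    }

    if (!alreadyExists)
    {
        parent.AllowUnsafeUpdates = true;
        parent.Webs.Add(childName, childName, txtDescriptipon.Text, 1033, "STS#0", true, false);
        parent.AllowUnsafeUpdates = false;
    }

    // Redirect either way, which also ends the request so nothing runs twice.
    Response.Redirect("http://" + "localdev/" + childName);
}

If that holds up, nothing in the check scales with the number of existing sub-sites, so 4 or 1,000 children should cost about the same.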
Thanks for the pointers and assistance.