Thursday, April 13, 2017

Load the resource files based on the culture name



Add a folder to the project named App_GlobalResources.
Create resource files in the above folder, for example:
enUS.resx
esEs.resx
etc.
Write the below code in test.aspx.cs:
-----------------------------------------
protected void Page_PreInit(object sender, EventArgs e)
{
    try
    {
        // SPLanguage arrives in the form "en-US", "es-ES", etc.
        string SPLanguage = Request.QueryString["SPLanguage"];
        string[] arrSPLanguageSplit = SPLanguage.Split('-');
        if (arrSPLanguageSplit.Length > 1)
        {
            // "en-US" becomes "enUS", which matches the resource file name.
            string languageNameAfterSplit = arrSPLanguageSplit[0] + arrSPLanguageSplit[1];
            if (File.Exists(MapPath(@"~\App_GlobalResources\" + languageNameAfterSplit + ".resx")))
            {
                HttpContext.Current.Session["resourceFileName"] = languageNameAfterSplit;
            }
            else
            {
                // Fall back to English when no matching resource file exists.
                HttpContext.Current.Session["resourceFileName"] = "enUS";
            }
        }
        else
        {
            HttpContext.Current.Session["resourceFileName"] = "enUS";
        }
    }
    catch (Exception ex)
    {
        HttpContext.Current.Session["resourceFileName"] = "enUS";
        EventLog.WriteEntry("Page_PreInit:resourceFileName", ex.Message, EventLogEntryType.Information);
    }
}

Write the below code in test.aspx:
--------------------------------------
<script type="text/javascript">
    $(document).ready(function () {
        // Pull the localized strings from the resource file whose name was stored in session.
        var TargetSite = '<%= HttpContext.GetGlobalResourceObject(HttpContext.Current.Session["resourceFileName"].ToString(), "TargetSite") %>';
        var TargetLibrary = '<%= HttpContext.GetGlobalResourceObject(HttpContext.Current.Session["resourceFileName"].ToString(), "TargetLibrary") %>';

        $('#lblTargetSite').text(TargetSite);
        $('#lblTargetLibrary').text(TargetLibrary);
    });
</script>

UI:
----
    <div>
        <label id="lblTargetSite" runat="server"></label>
    </div>
    <br />
    <div>
        <label id="lblTargetLibrary" runat="server"></label>
    </div>
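
Alternatively, since both labels are marked runat="server", the same values can be assigned server-side without jQuery. A minimal sketch in the code-behind, assuming the session key set in Page_PreInit above (this Page_Load handler is an assumption, not part of the original post):

protected void Page_Load(object sender, EventArgs e)
{
    // The resource "class" name is the file name stored in session by Page_PreInit (e.g. "enUS").
    string resourceFile = HttpContext.Current.Session["resourceFileName"].ToString();

    // GetGlobalResourceObject reads the key from the matching file under App_GlobalResources.
    lblTargetSite.InnerText = (string)HttpContext.GetGlobalResourceObject(resourceFile, "TargetSite");
    lblTargetLibrary.InnerText = (string)HttpContext.GetGlobalResourceObject(resourceFile, "TargetLibrary");
}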

Crawling vs Indexing: What’s The Difference?

In search engine optimization lingo, there are different terms we need to understand. If we know what these terms mean, we can easily tell whether someone trying to preach the gospel of search knows what they are talking about, at least on a basic level.
Two of the often used terms in the business are crawling and indexing. These two words are sometimes, if not often, interchanged with one used to represent the other. With their seemingly “interchangeable” nature, many of us may dismiss the difference and declare that both crawling and indexing share the same definition.
So in unison, we ask what is the meaning of crawling, and what is the meaning of indexing, and what’s the difference between the two?
CRAWLING
Crawling takes place when there is a successful fetching of unique URIs which can be traced from valid links from other web pages. It’s like Pacman following all those dots and eating them, only that in the case of crawling, it’s the search engine robots that follow the links.
I say crawling takes place on a successful fetching because not all links we see on the Web are crawlable. The following cases may be the reasons why links can’t be crawled:
1. The link is coded in JavaScript, which is otherwise known as a spider trap
2. The link was marked for exclusion via robots.txt's disallow directive
3. Orphaned link (nothing links to it and no sitemap.xml file includes it)
4. Link is found within a page that contains the nofollow directive
5. Server was down when link was supposed to be crawled.
INDEXING
Indexing takes place after crawled URIs are processed. Note that many URIs may be crawled, but fewer of them will have their content processed through indexing. The following reasons could be the causes of non-indexing of a previously crawled page:
1. A noindex directive in the page (<meta name="robots" content="noindex" />)
2. Duplicate content: a page that has the same content as an already indexed page may not be indexed.
Other reasons such as link age and link popularity may also play a role but I am less inclined to lean on them.
To check if a page has been indexed, we can use the “site:” operator. For example, if I would like to find out whether seo-hongkong.com is indexed, I can use site:seo-hongkong.com and Google will show which pages currently sit in its index.

Validate the page URL against a reflected XSS attack


Url:
https://test.com/Support/Employee/AllItems.aspx/?--%3E%3C/script%3E%3Cscript%3Ealert(235213)%3C/script%3E

In the above URL, the injected alert fires by default.
Solution for this:
<script type="text/javascript">
    // If the current URL contains any of the blocked tag names,
    // strip the injected payload by redirecting to the URL without its query string.
    var pageUrl = window.location.href.toLowerCase();
    var htmlTags = ["script", "style", "img", "font"];
    for (var i = 0; i < htmlTags.length; i++) {
        if (pageUrl.indexOf(htmlTags[i]) > -1) {
            window.location.href = window.location.href.split("?")[0];
            break;
        }
    }
</script>

The above code removes the alert by redirecting to the page URL without the query string whenever a blocked tag name appears in the address.
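
Client-side filtering alone can be bypassed (the script may not run, or the payload may be obfuscated), so anything echoed from the request should also be encoded on the server before it is written into the page. A minimal server-side sketch; the "msg" query-string key and the litMessage control are only assumptions for illustration:

protected void Page_Load(object sender, EventArgs e)
{
    // Hypothetical query-string value that gets echoed back into the page.
    string echoed = Request.QueryString["msg"];

    if (!string.IsNullOrEmpty(echoed))
    {
        // HtmlEncode turns <script> into &lt;script&gt; so the browser renders it as plain text.
        litMessage.Text = HttpUtility.HtmlEncode(echoed);
    }
}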

Read an XML file using a LINQ query


Xml file: DomainList.xml

<?xml version="1.0" encoding="utf-8"?>
<DomainList>
  <DomainName name="https://google.com"/>
  <DomainName name="https://gmail.com"/>
  <DomainName name="fonts.googleapis.com"/>
  <DomainName name="fonts.gstatic.com"/>
</DomainList>

C# code: the code below checks whether the redirect URL's host is present in the XML file or not.

// Requires: using System.Xml.Linq;
Uri sReturnUrl = new Uri(ReturnUrl);
XElement main = XElement.Load(HttpContext.Current.Server.MapPath("~/ExcelTemplates/DomainList.xml"));

// Find the first DomainName entry whose name attribute contains the host of the return URL.
var query = (from param in main.Descendants("DomainName")
             where ((string)param.Attribute("name")).Contains(sReturnUrl.Host)
             select new
             {
                 code = (string)param.Attribute("name")
             }).FirstOrDefault();
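
One hedged way to use the result: only redirect when a matching entry was found, otherwise send the user to a safe page (the fallback URL below is just an illustration):

if (query != null)
{
    // The host is listed in DomainList.xml, so the redirect is allowed.
    HttpContext.Current.Response.Redirect(ReturnUrl);
}
else
{
    // Hypothetical fallback page for hosts that are not in the list.
    HttpContext.Current.Response.Redirect("~/Default.aspx");
}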

Recursively walking through a directory tree and listing file names

 

public partial class MainWindow : Window
{
    private void Button_Click_1(object sender, RoutedEventArgs e)
    {
        string sourcePath = @"C:\MasterPageArtifacts\";
        DirSearch(sourcePath);
    }

    // Recursively list every file under sourcePath in the list box.
    private void DirSearch(string sourcePath)
    {
        try
        {
            // Files directly in the current directory.
            foreach (string f in Directory.GetFiles(sourcePath))
            {
                listBox1.Items.Add(f);
            }
            // Recurse into each subdirectory.
            foreach (string d in Directory.GetDirectories(sourcePath))
            {
                DirSearch(d);
            }
        }
        catch (Exception ex)
        {
            listBox1.Items.Add(ex.Message);
        }
    }
}
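
If per-directory error handling is not required, the framework can walk the tree in a single call. A minimal sketch; the Button_Click_2 handler is only an assumption for illustration:

private void Button_Click_2(object sender, RoutedEventArgs e)
{
    string sourcePath = @"C:\MasterPageArtifacts\";

    // SearchOption.AllDirectories makes EnumerateFiles recurse into every subdirectory.
    foreach (string f in Directory.EnumerateFiles(sourcePath, "*", SearchOption.AllDirectories))
    {
        listBox1.Items.Add(f);
    }
}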

Content Type Hub in SharePoint 2013


Procedure to work with a content type hub:

1. Create a web application.

2. Create a root site collection for the new web application. I chose the Developer Site template, but it works with any template.

3. Create a consumer site collection for checking whether the content type is received or not.

4. Activate the "Content Type Syndication Hub" feature on the hub site collection: go to Site Settings -> Site Collection Administration -> Site collection features.

5. Create the Managed Metadata Service (MMS).
a. Go to Central Administration, click Application Management, and under Service Applications click Manage service applications. Click New, select Managed Metadata Service, enter the required fields, and give the hub site collection URL as the Content Type Hub URL.
b. Click on the Managed Metadata Service. Make sure there is no error and verify that the properties of the Managed Metadata Service are entered properly, especially the Content Type Hub URL.

6. Now go to our content type hub site [created in the first step]:
a. Go to Site Actions and then Site Settings. Under Galleries click Site Content Types; you will find the default content types available.
b. Create a new content type, add a few columns, click Manage publishing for this content type, check the Publish radio button, and say OK.

7. Now go to the consumer site collection to check whether the content type is received or not (a code-based check is sketched after this list).
a. Go to Site Actions and then Site Settings. Under Galleries click Site Content Types.
Here you can observe the recently created content type. If the content type is not there, run the timer jobs from Central Administration, i.e. Content Type Hub and Content Type Subscriber.

8. Add one new column to the content type in the hub site and republish it. If you are not able to see the new column in the consumer site collection's content type, run the two timer jobs: go to Central Administration -> Monitoring -> Timer Jobs -> Review job definitions, and run Content Type Hub and Content Type Subscriber.
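
A minimal server-side sketch of the code-based check mentioned in step 7, using the SharePoint server object model; the site URL and content type name below are placeholders, not values from this post:

// Requires: using Microsoft.SharePoint;
using (SPSite site = new SPSite("http://consumersite"))
{
    // AvailableContentTypes includes content types pushed down from the hub.
    SPContentType ct = site.RootWeb.AvailableContentTypes["MyHubContentType"];

    Console.WriteLine(ct != null
        ? "Content type received from the hub."
        : "Content type not received yet - run the Content Type Hub and Content Type Subscriber timer jobs.");
}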

Provider Hosted Apps in SharePoint 2013


Create Certificate (.cer):
Go to IIS.
Click on Server Certificates.
In the right pane, click Create Self-Signed Certificate, give it the name TrustedCertificate, and click OK.
Export Certificate (.pfx):
In IIS, click on the certificate, then click Export, select a location, and give a password.
Copy Certificate (.cer):
In IIS, double-click the certificate, select the Details tab, and click Copy to File.
Click Next and select "No, do not export the private key".
Click Next and select "DER encoded binary".
Click Next and choose where you want to save the file.
Click Finish.
Run the below PowerShell commands:
$cert = Get-PfxCertificate -FilePath "C:\Certs\TrustedCertificate.cer"
$issuerid = [System.Guid]::NewGuid().ToString()   # e.g. 35f57afc-2895-4825-9c60-bc37a91e2a51
$realmid = Get-SPAuthenticationRealm -ServiceContext "http://SPSite:41127/sites/DeveloperSite/"   # e.g. 7d99f2ba-4f1e-4962-8a71-174f8df35404
$registeredName = $issuerid + "@" + $realmid
New-SPTrustedSecurityTokenIssuer -Name "DevelopmentApp" -RegisteredIssuerName $registeredName -IsTrustBroker -Certificate $cert
Note:
$issuerid gives the issuer ID.
$realmid gives the realm ID.
$registeredName gives the registered issuer name.
Go to Visual Studio:
Select App for SharePoint, give the app name PagesCreationUsingPageViewerWebpart, and click OK.
Next, give the developer site URL and select Provider-hosted.
Click Next and select ASP.NET Web Forms Application.
Click Next, select Use a certificate, and give the below details: the .pfx certificate location, its password, and the issuer ID.
Check whether Web.config has been filled in properly or not.
Step 9: Run the project in Visual Studio and check whether the client ID is generated or not in Web.config.
Step 10: Change the permissions in AppManifest.xml.
Step 11: Run the below script to register the app principal for the client ID.
$clientid = "bded8c2e-da2d-4c2a-a8c3-0dbcf3e42f8e"
$appId = $clientid + "@" + $realmid
Register-SPAppPrincipal -Site "http://SPSite:41127/sites/DeveloperSite/" -NameIdentifier $appId -DisplayName "PagesCreationUsingPageViewerWebpart"
 
Step 12: Run the below script to allow OAuth over HTTP (needed because the site runs over HTTP):
$config = (Get-SPSecurityTokenServiceConfig)
$config.AllowOAuthOverHttp = $true
$config.Update()
 
Step 13: Finally, deploy it.
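
Once the app is deployed, the remote web usually talks back to SharePoint through the SharePointContext/TokenHelper classes that the Visual Studio provider-hosted template generates. A minimal sketch of reading the host web title from a page in the remote web, assuming those template-generated classes are present (this sketch is not part of the original steps):

protected void Page_Load(object sender, EventArgs e)
{
    // SharePointContextProvider comes from the SharePointContext.cs file added by the template.
    var spContext = SharePointContextProvider.Current.GetSharePointContext(Context);

    using (var clientContext = spContext.CreateUserClientContextForSPHost())
    {
        // Loading the host web title proves the certificate/OAuth trust is working.
        clientContext.Load(clientContext.Web, w => w.Title);
        clientContext.ExecuteQuery();
        Response.Write(clientContext.Web.Title);
    }
}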

Wednesday, April 12, 2017

Execute a Parameterized PowerShell Script with C# (Reset Password)

  


param(
    [string]$a
)
# Log the incoming parameter for troubleshooting.
foreach ($i in $a)
{
    write "Error: $i" >> c:\logfile.txt
}
write "Error: $a" >> c:\logfile.txt
write "Error: first" >> c:\logfile.txt
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}
write "Error: pa" >> c:\logfile.txt
$username = 'NGO\ngoadmin'
$Password = 'NGO@dmin1' | ConvertTo-SecureString -Force -AsPlainText
$credential = New-Object System.Management.Automation.PsCredential($username, $Password)
try
{
    write "Error: $a" >> c:\logfile.txt
    # Reset the password of the account passed in as $a, running under the admin credential.
    $job = Start-Job -ScriptBlock {
        # Example of the distinguished name format expected in $args[0].
        $sk = 'CN=Exit User1,OU=ExitUsersOU,OU=NGO-TEST_OU,DC=ngo,DC=NGO,DC=com'
        Set-ADAccountPassword $args[0] -Server "NGO.com" -Reset -NewPassword (ConvertTo-SecureString -AsPlainText "Vinay@4435" -Force)
    } -ArgumentList $a -Credential $credential
    Wait-Job $job
    Receive-Job $job
    write "Error: $job" >> c:\logfile.txt
    write "Error: r" >> c:\logfile.txt
}
catch [Exception] {
    write "Error: $($_.Exception.Message)" >> c:\logfile.txt
}


C# Console code

// Requires: using System.Management.Automation;
public static string dynamicReplacementParameterInScript = "";

dynamicReplacementParameterInScript = GetCNUserWise(LoginName);
if (dynamicReplacementParameterInScript != null)
{
    // Start executing the PowerShell script, passing the value returned by GetCNUserWise as the script argument.
    string PowerShellScriptOutPutResult = "";
    try
    {
        PowerShell ps = PowerShell.Create();
        ps.AddScript(@"C:\temp\Ram.ps1 '" + dynamicReplacementParameterInScript + "'");
        ps.Invoke();
        PowerShellScriptOutPutResult = "Success";
    }
    catch (Exception ex)
    {
        PowerShellScriptOutPutResult = "Failed";
    }
}
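
String-concatenating the argument into AddScript breaks if the value contains a single quote. A hedged alternative is to bind the script's $a parameter by name; the script path is the same one used above:

// Requires: using System.Management.Automation;
using (PowerShell ps = PowerShell.Create())
{
    // The script's param block declares [string]$a, so bind the value by name instead of string concatenation.
    ps.AddCommand(@"C:\temp\Ram.ps1")
      .AddParameter("a", dynamicReplacementParameterInScript);

    foreach (PSObject result in ps.Invoke())
    {
        Console.WriteLine(result);
    }
}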



public static string GetCNUserWise(string loginName)
{
    string strCN = "";
    try
    {
        string sa = System.Security.Principal.WindowsIdentity.GetCurrent().Name;
        string txtEmp = sa.Split('\\')[1];
        string strDLdap = ConfigurationManager.AppSettings["myLDAPString"].ToString();
        DirectoryEntry connection = new DirectoryEntry(strDLdap);
        string strDLdAPUser = ConfigurationManager.AppSettings["myLDAPUser"].ToString();
        string strDLdAPPass = ConfigurationManager.AppSettings["myLDAPPassword"].ToString();
        connection.AuthenticationType = AuthenticationTypes.Secure;
        connection.Username = strDLdAPUser;
        connection.Password = strDLdAPPass;

        // Look the user up in Active Directory by account name.
        DirectorySearcher dssearch = new DirectorySearcher(connection);
        dssearch.Filter = "(sAMAccountName=" + txtEmp + ")";
        SearchResult sresult = dssearch.FindOne();
        DirectoryEntry dsresult = sresult.GetDirectoryEntry();

        string strfirstName = dsresult.Properties["givenName"][0].ToString();
        string strlastName = dsresult.Properties["sn"][0].ToString();
        string Name = strfirstName + " " + strlastName;
        string strRequestedBy = dsresult.Properties["mail"][0].ToString();
        string strCompany = dsresult.Properties["company"][0].ToString();
        string displayName = dsresult.Properties["displayName"][0].ToString();

        // The distinguished name (CN=...,OU=...) is what the PowerShell script above expects as its parameter.
        strCN = dsresult.Properties["distinguishedName"][0].ToString();

        InsertUserInfo(displayName, strCompany, DateTime.Now.ToString("MM-dd-yyyy"), strRequestedBy);
    }
    catch (Exception ex)
    {
        // SPUser user = web.CurrentUser;
        DBLayer_LogErrors objLogError = new DBLayer_LogErrors();
        Global_LogErrorsItem objLogErrorItem = new Global_LogErrorsItem();
        objLogErrorItem.Method = "GetCNUserWise()";
        objLogErrorItem.Class = "NGOTaskExecutionProcess - could not get the DistinguishedName from Active Directory";
        objLogErrorItem.ErrorDescription = ex.Message;
        objLogErrorItem.StackTrace = ex.StackTrace;
        objLogErrorItem.UserName = System.Security.Principal.WindowsIdentity.GetCurrent().Name;
        //objLogErrorItem.UserName = user.LoginName;
        objLogError.AddLogError(objLogErrorItem);
    }
    return strCN;
}