Mar 18 16:43:16.162308 ip-10-0-131-5 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:43:16.734442 ip-10-0-131-5 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:16.734442 ip-10-0-131-5 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:43:16.734442 ip-10-0-131-5 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:16.734442 ip-10-0-131-5 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:43:16.734442 ip-10-0-131-5 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
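The deprecation warnings above all point at the same remedy: move the flags into the file passed to --config. A minimal sketch of an equivalent KubeletConfiguration follows — the values are placeholders for illustration, not the settings this node actually uses:

```yaml
# Sketch of a KubeletConfiguration replacing the deprecated flags above.
# All values are illustrative; this node's real settings are not shown here.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"      # was --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # was --volume-plugin-dir
systemReserved:                                                 # was --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                                                   # replaces --minimum-container-ttl-duration
  memory.available: "100Mi"
# --pod-infra-container-image has no config-file equivalent; per the warning,
# the sandbox (pause) image is taken from the container runtime's own config
# (e.g. CRI-O's pause_image setting).
```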
Mar 18 16:43:16.738506 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.738430 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:43:16.740851 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740836 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740852 2562 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740856 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740859 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740862 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740865 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740868 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740871 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740874 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:16.740889 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740877 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740905 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740910 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740913 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740916 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740919 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740922 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740924 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740927 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740930 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740933 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740949 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740952 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740955 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740958 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740962 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740964 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740967 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740970 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740972 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:16.741113 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740975 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740978 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740980 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740983 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740986 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740988 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740991 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740994 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.740997 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741000 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741003 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741006 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741008 2562 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741011 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741016 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741020 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741022 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741025 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741027 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741030 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:16.741668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741032 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741035 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741038 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741041 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741043 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741046 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741048 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741051 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741054 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741057 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741061 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741064 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741067 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741070 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741073 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741076 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741079 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741082 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741084 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:16.742173 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741087 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741091 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741094 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741097 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741100 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741102 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741105 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741108 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741111 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741113 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741116 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741120 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741123 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741126 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741129 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741131 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741134 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741137 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741498 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741503 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:16.742668 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741505 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741508 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741511 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741514 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741516 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741519 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741521 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741524 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741526 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741529 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741531 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741534 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741536 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741540 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741542 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741545 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741547 2562 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741550 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741554 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741557 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:16.743167 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741559 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741562 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741564 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741567 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741570 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741572 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741575 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741579 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741583 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741586 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741589 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741592 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741594 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741597 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741600 2562 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741602 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741605 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741607 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741609 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:16.743635 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741612 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741615 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741617 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741621 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741623 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741626 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741628 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741631 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741634 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741636 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741639 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741641 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741644 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741647 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741649 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741652 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741655 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741657 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741660 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:16.744130 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741662 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741664 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741667 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741669 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741672 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741675 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741678 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741681 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741684 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741686 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741689 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741691 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741694 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741697 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741700 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741703 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741706 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741709 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741711 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741713 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:16.744595 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741716 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741719 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741721 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741724 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741726 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.741730 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743446 2562 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743458 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743465 2562 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743470 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743474 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743478 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743483 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743487 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743490 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743494 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743497 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743501 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743504 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743507 2562 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743510 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743513 2562 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743516 2562 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743518 2562 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:43:16.745095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743521 2562 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743526 2562 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743529 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743532 2562 flags.go:64] FLAG: --config-dir=""
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743535 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743538 2562 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743543 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743545 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743548 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743552 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743556 2562 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743559 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743562 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743565 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743568 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743572 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743575 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743578 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743581 2562 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743583 2562 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743587 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743592 2562 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743595 2562 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743598 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743601 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:43:16.745656 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743604 2562 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743607 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743610 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743613 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743616 2562 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743619 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743622 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743625 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743627 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743630 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743633 2562 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743636 2562 flags.go:64] FLAG: --feature-gates=""
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743641 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743644 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743647 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743650 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743653 2562 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743657 2562 flags.go:64] FLAG:
--help="false" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743661 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-131-5.ec2.internal" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743664 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743667 2562 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743670 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743673 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:43:16.746273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743677 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743679 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743682 2562 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743685 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743688 2562 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743691 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743694 2562 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743697 2562 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743700 
2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743702 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743706 2562 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743708 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743711 2562 flags.go:64] FLAG: --lock-file="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743716 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743719 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743722 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743728 2562 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743731 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743734 2562 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743737 2562 flags.go:64] FLAG: --logging-format="text" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743739 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743743 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743745 2562 flags.go:64] FLAG: --manifest-url="" Mar 18 16:43:16.746822 ip-10-0-131-5 
kubenswrapper[2562]: I0318 16:43:16.743748 2562 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743752 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:43:16.746822 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743755 2562 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743760 2562 flags.go:64] FLAG: --max-pods="110" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743763 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743769 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743772 2562 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743775 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743778 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743781 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743784 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743791 2562 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743794 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743797 2562 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743800 2562 flags.go:64] FLAG: 
--pod-cidr="" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743803 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743809 2562 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743812 2562 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743815 2562 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743818 2562 flags.go:64] FLAG: --port="10250" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743821 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743824 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e0ec7d6c83004532" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743829 2562 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743832 2562 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743835 2562 flags.go:64] FLAG: --register-node="true" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743838 2562 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:43:16.747466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743841 2562 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743844 2562 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743847 2562 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:43:16.748052 ip-10-0-131-5 
kubenswrapper[2562]: I0318 16:43:16.743850 2562 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743852 2562 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743857 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743859 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743862 2562 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743865 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743868 2562 flags.go:64] FLAG: --runonce="false" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743873 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743876 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743879 2562 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743882 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743885 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743888 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743892 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743895 2562 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 
16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743897 2562 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743900 2562 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743903 2562 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743906 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743909 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743912 2562 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743914 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:43:16.748052 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743920 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743922 2562 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743925 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743930 2562 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743933 2562 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743948 2562 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743951 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743954 2562 flags.go:64] FLAG: 
--topology-manager-scope="container" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743957 2562 flags.go:64] FLAG: --v="2" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743961 2562 flags.go:64] FLAG: --version="false" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743965 2562 flags.go:64] FLAG: --vmodule="" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743969 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.743972 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744063 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744067 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744070 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744073 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744078 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744082 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744084 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744088 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:43:16.748645 
ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744090 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:43:16.748645 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744093 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744096 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744098 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744101 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744104 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744106 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744109 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744112 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744114 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744117 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744120 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744122 2562 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744125 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744129 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744132 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744134 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744137 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744140 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744142 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744145 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:43:16.749176 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744148 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744150 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744154 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744158 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744160 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744163 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744165 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744169 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744171 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744174 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744176 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744179 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744182 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744184 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744187 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744190 2562 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744192 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744195 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744197 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:43:16.749689 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744199 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744202 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744205 2562 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744207 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744209 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744212 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744215 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744218 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744220 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: 
W0318 16:43:16.744223 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744225 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744228 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744232 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744235 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744238 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744240 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744243 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744246 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744248 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744250 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:43:16.750166 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744253 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744256 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 
16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744258 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744260 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744263 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744266 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744268 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744271 2562 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744274 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744276 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744279 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744281 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744284 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744287 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744289 2562 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744292 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744294 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:43:16.750665 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.744297 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:43:16.751313 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.745085 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:43:16.752307 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.752289 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 18 16:43:16.752307 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.752307 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752355 2562 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752360 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752363 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752367 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:43:16.752402 ip-10-0-131-5 
kubenswrapper[2562]: W0318 16:43:16.752369 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752372 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752375 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752378 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752381 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752385 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752387 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752390 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752392 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752395 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752398 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752400 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752403 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation 
Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752406 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752409 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:16.752402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752411 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752415 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752418 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752421 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752424 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752426 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752429 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752431 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752434 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752436 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752439 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752442 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752445 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752447 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752450 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752453 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752455 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752458 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752462 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:16.752885 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752466 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752469 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752472 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752476 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752480 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752483 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752485 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752488 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752490 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752493 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752496 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752498 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752501 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752503 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752506 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752509 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752512 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752515 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752518 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:16.753374 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752520 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752523 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752526 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752528 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752531 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752533 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752536 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752538 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752541 2562 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752543 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752546 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752549 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752551 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752554 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752557 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752559 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752562 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752564 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752567 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752570 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:16.753843 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752573 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752575 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752578 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752580 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752583 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752585 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752587 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752590 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752593 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.752599 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752690 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752694 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752697 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752701 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752703 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752706 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:16.754345 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752709 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752712 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752715 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752717 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752720 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752722 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752726 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752729 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752731 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752734 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752737 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752739 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752742 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752744 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752747 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752749 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752753 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752755 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752758 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:16.754738 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752760 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752763 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752765 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752768 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752771 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752773 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752776 2562 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752778 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752781 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752784 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752786 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752789 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752791 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752794 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752796 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752799 2562 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752802 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752804 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752807 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752809 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:16.755281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752812 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752814 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752817 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752820 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752822 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752824 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752827 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752829 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752832 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752834 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752837 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752840 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752842 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752845 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752847 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752850 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752852 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752856 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752860 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:16.755795 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752863 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752865 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752868 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752870 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752873 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752875 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752878 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752881 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752883 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752886 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752889 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752892 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752894 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752898 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752902 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752905 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752908 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752911 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752914 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752916 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:16.756262 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752919 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:16.756729 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:16.752922 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:16.756729 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.752926 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:16.756729 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.753684 2562 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:43:16.757960 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.757934 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:43:16.759281 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.759270 2562 server.go:1019] "Starting client certificate rotation"
Mar 18 16:43:16.759381 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.759365 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:16.759412 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.759405 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:16.795639 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.795624 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:16.798539 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.798522 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:16.819326 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.819307 2562 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:43:16.826510 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.826485 2562 log.go:25] "Validated CRI v1 image API"
Mar 18 16:43:16.828408 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.828391 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:43:16.830406 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.830389 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:43:16.831427 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.831411 2562 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7eb79029-458a-481d-a463-3dee0ea08cf9:/dev/nvme0n1p3 d91fd535-748e-4422-bf07-c1dead17650e:/dev/nvme0n1p4]
Mar 18 16:43:16.831475 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.831427 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:43:16.838865 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.838748 2562 manager.go:217] Machine: {Timestamp:2026-03-18 16:43:16.83618756 +0000 UTC m=+0.522309916 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108524 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2eb14d97b5df87cb291fb4fcfc4d69 SystemUUID:ec2eb14d-97b5-df87-cb29-1fb4fcfc4d69 BootID:81f64184-c940-43d0-8ff8-0154f41cfba0 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4b:d8:09:3f:21 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4b:d8:09:3f:21 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:d7:c9:a8:8c:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:43:16.838865 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.838855 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:43:16.839014 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.838960 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:43:16.842823 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.842801 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:43:16.842968 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.842826 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:43:16.843017 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.842977 2562 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:43:16.843017 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.842985 2562 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:43:16.843017 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.842997 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:43:16.844562 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.844551 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:43:16.845443 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.845433 2562 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:43:16.845667 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.845659 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:43:16.849916 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.849907 2562 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:43:16.849977 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.849920 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:43:16.849977 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.849930 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:43:16.850049 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.849995 2562 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:43:16.850049 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.850004 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 16:43:16.851553 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.851537 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:43:16.851625 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.851558 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:43:16.855924 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.855904 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 18 16:43:16.858010 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.857989 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 18 16:43:16.860072 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860058 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860076 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860082 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860087 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860093 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860098 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860104 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860121 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860127 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 16:43:16.860136 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860133 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 16:43:16.860369 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860144 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 16:43:16.860369 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.860153 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 16:43:16.861601 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.861589 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 16:43:16.861601 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.861602 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 18 16:43:16.863924 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.863897 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 18 16:43:16.864054 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.864028 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 18 16:43:16.865143 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.865128 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 18 16:43:16.865225 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.865163 2562 server.go:1295] "Started kubelet"
Mar 18 16:43:16.865291 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.865257 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 16:43:16.865815 ip-10-0-131-5 systemd[1]: Started Kubernetes Kubelet.
Mar 18 16:43:16.865907 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.865850 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 16:43:16.865983 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.865916 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 18 16:43:16.867326 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.867309 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 16:43:16.868680 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.868665 2562 server.go:317] "Adding debug handlers to kubelet server"
Mar 18 16:43:16.874258 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.874110 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 18 16:43:16.874400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.874385 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 16:43:16.875012 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.874997 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 16:43:16.875653 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.874276 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.189dfd2cbe316692 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-03-18 16:43:16.86514037 +0000 UTC m=+0.551262723,LastTimestamp:2026-03-18 16:43:16.86514037 +0000 UTC m=+0.551262723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}"
Mar 18 16:43:16.876291 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.876168 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:16.876291 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.876244 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 18 16:43:16.876291 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.876247 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 18 16:43:16.876291 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.876270 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 18 16:43:16.876513 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.876384 2562 reconstruct.go:97] "Volume reconstruction finished"
Mar 18 16:43:16.876513 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.876395 2562 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 16:43:16.877319 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.877287 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 18 16:43:16.877701 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.877675 2562 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Mar 18 16:43:16.878392 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878375 2562 factory.go:55] Registering systemd factory
Mar 18 16:43:16.878392 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878394 2562 factory.go:223] Registration of the systemd container factory successfully
Mar 18 16:43:16.878617 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878604 2562 factory.go:153] Registering CRI-O factory
Mar 18 16:43:16.878689 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878620 2562 factory.go:223] Registration of the crio container factory successfully
Mar 18 16:43:16.878740 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878707 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 16:43:16.878740 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878730 2562 factory.go:103] Registering Raw factory
Mar 18 16:43:16.878839 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.878745 2562 manager.go:1196] Started watching for new ooms in manager
Mar 18 16:43:16.879227 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.879211 2562 manager.go:319] Starting recovery of all containers
Mar 18 16:43:16.881451 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.881412 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 18 16:43:16.884913 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.884889 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g7km5"
Mar 18 16:43:16.891728 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.891710 2562 manager.go:324] Recovery completed
Mar 18 16:43:16.895474 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.895463 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:16.896250 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.896234 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g7km5"
Mar 18 16:43:16.898029 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898015 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:16.898104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898046 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:16.898104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898061 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:16.898479 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898464 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 18 16:43:16.898479 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898478 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 18 16:43:16.898573 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.898494 2562 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:43:16.900407 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.900344 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-5.ec2.internal.189dfd2cc02744b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-5.ec2.internal,UID:ip-10-0-131-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-5.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-5.ec2.internal,},FirstTimestamp:2026-03-18 16:43:16.89803077 +0000 UTC m=+0.584153126,LastTimestamp:2026-03-18 16:43:16.89803077 +0000 UTC m=+0.584153126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-5.ec2.internal,}"
Mar 18 16:43:16.900592 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.900580 2562 policy_none.go:49] "None policy: Start"
Mar 18 16:43:16.900628 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.900596 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 18 16:43:16.900628 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.900606 2562 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.937617 2562 manager.go:341] "Starting Device Plugin manager"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.937669 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.937680 2562 server.go:85] "Starting device plugin registration server"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.937879 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.937891 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.938006 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.938083 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.938093 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.938515 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 18 16:43:16.951319 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.938542 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:16.978469 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.978414 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 18 16:43:16.979491 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.979479 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 18 16:43:16.979549 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.979501 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 18 16:43:16.979549 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.979516 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 18 16:43:16.979549 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.979522 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 18 16:43:16.979674 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:16.979550 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 18 16:43:16.982189 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:16.982173 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:43:17.038519 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.038500 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:17.039317 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.039303 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:17.039375 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.039328 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:17.039375 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.039340 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:17.039375 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.039363 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.048723 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.048706 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.048776 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.048725 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-5.ec2.internal\": node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.075036 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.075015 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.080369 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.080347 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"]
Mar 18 16:43:17.080419 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.080410 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:17.082152 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.082141 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:17.082223 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.082163 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:17.082223 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.082172 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:17.084362 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.084351 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:17.084993 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.084980 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.085038 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085005 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:17.085152 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085139 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:17.085195 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085165 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:17.085195 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085178 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:17.085541 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085528 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:17.085609 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085565 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:17.085609 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.085584 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:17.087279 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.087263 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.087354 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.087287 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:43:17.087859 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.087842 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:43:17.087934 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.087872 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:43:17.087934 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.087887 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:43:17.115461 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.115439 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-5.ec2.internal\" not found" node="ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.119812 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.119798 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-5.ec2.internal\" not found" node="ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.175446 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.175430 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.178009 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.177993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.178078 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.178016 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6da92965ac9d0f6a332c1d675e2f540d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"6da92965ac9d0f6a332c1d675e2f540d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.178078 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.178034 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.275795 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.275735 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.279006 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.278993 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.279062 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.279015 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.279062 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.279032 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6da92965ac9d0f6a332c1d675e2f540d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"6da92965ac9d0f6a332c1d675e2f540d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.279134 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.279079 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.279134 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.279103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6da92965ac9d0f6a332c1d675e2f540d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-5.ec2.internal\" (UID: \"6da92965ac9d0f6a332c1d675e2f540d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.279134 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.279109 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc51fc8cf8e0664cd067f24564b101ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal\" (UID: \"cc51fc8cf8e0664cd067f24564b101ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.376497 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.376481 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.418751 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.418733 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.421270 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.421256 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal"
Mar 18 16:43:17.477198 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.477178 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.577708 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.577655 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.678183 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.678160 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.752782 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.752762 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:43:17.758915 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.758894 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:43:17.759042 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.759025 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:43:17.759085 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.759061 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:43:17.778222 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.778201 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.874981 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.874919 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:43:17.878793 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.878778 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:17.890783 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.890631 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:43:17.897834 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.897806 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:38:16 +0000 UTC" deadline="2027-10-03 03:49:04.945529851 +0000 UTC"
Mar 18 16:43:17.897834 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.897831 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13523h5m47.047701792s"
Mar 18 16:43:17.915218 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.915197 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fv58l"
Mar 18 16:43:17.922482 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:17.922464 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fv58l"
Mar 18 16:43:17.979733 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:17.979711 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found"
Mar 18 16:43:18.039446 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.039422 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:43:18.064684 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:18.064655 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da92965ac9d0f6a332c1d675e2f540d.slice/crio-6bda4b40a3290d9bacc184bbaba0bb0f9ade86d5629249bc73ce375ea3c8db21 WatchSource:0}: Error finding container 6bda4b40a3290d9bacc184bbaba0bb0f9ade86d5629249bc73ce375ea3c8db21: Status 404 returned error can't find the container with id 6bda4b40a3290d9bacc184bbaba0bb0f9ade86d5629249bc73ce375ea3c8db21
Mar 18 16:43:18.065077 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:18.065056 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc51fc8cf8e0664cd067f24564b101ef.slice/crio-7b8c77ed38e9ab3a66329b367a6af151b7fc7c39598da9d4d1d09e100648ed92 WatchSource:0}: Error finding container 7b8c77ed38e9ab3a66329b367a6af151b7fc7c39598da9d4d1d09e100648ed92: Status 404 returned error can't find the container with id 7b8c77ed38e9ab3a66329b367a6af151b7fc7c39598da9d4d1d09e100648ed92
Mar 18 16:43:18.069466 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.069452 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:43:18.079988 ip-10-0-131-5 kubenswrapper[2562]: E0318 
16:43:18.079970 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Mar 18 16:43:18.180488 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.180439 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Mar 18 16:43:18.281000 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.280977 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-5.ec2.internal\" not found" Mar 18 16:43:18.340794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.340771 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:18.375352 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.375327 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" Mar 18 16:43:18.387629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.387607 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:18.388915 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.388901 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" Mar 18 16:43:18.396919 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.396904 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:18.851446 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.851399 2562 apiserver.go:52] "Watching apiserver" Mar 18 16:43:18.858886 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.858476 2562 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:43:18.860225 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.860135 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-62kx2","openshift-cluster-node-tuning-operator/tuned-bhcbr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal","openshift-multus/network-metrics-daemon-dlnjg","openshift-network-diagnostics/network-check-target-mnsnh","openshift-network-operator/iptables-alerter-s4vkz","openshift-ovn-kubernetes/ovnkube-node-nsv52","kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp","openshift-dns/node-resolver-zd7qk","openshift-image-registry/node-ca-9t8zk","openshift-multus/multus-additional-cni-plugins-qhxzz","openshift-multus/multus-whxl2"] Mar 18 16:43:18.865961 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.865536 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.867648 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.867627 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wmhkw\"" Mar 18 16:43:18.867829 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.867814 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.868094 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.868077 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.868905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.868730 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:18.868905 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.868817 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:18.871652 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.871635 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.873604 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.873575 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.873784 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.873764 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.873953 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.873923 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rfqzx\"" Mar 18 16:43:18.874225 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.874209 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:43:18.874337 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.874308 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:43:18.874402 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.874359 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:43:18.874548 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.874532 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:43:18.874972 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.874878 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:18.874972 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.874953 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:18.877307 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.877286 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.877403 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.877388 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:18.883655 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883417 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjv5x\"" Mar 18 16:43:18.883655 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883447 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:43:18.883655 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883542 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6bb2\"" Mar 18 16:43:18.883655 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883655 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:43:18.883879 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883732 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.883879 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883797 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:43:18.885337 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.883420 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.885337 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.884657 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.885337 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.884750 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:18.887517 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.887234 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.888685 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.887747 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:43:18.888685 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.887813 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.888685 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888168 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.888685 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888291 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.888975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888790 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9rc74\"" Mar 18 16:43:18.888975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888802 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7177d49a-364c-4c44-9dd5-4fa101f99bdb-konnectivity-ca\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:18.888975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888845 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-systemd\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.888975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888878 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-ovn\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.888975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888953 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-node-log\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.888988 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-log-socket\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889014 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysconfig\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889043 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-sys\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889072 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-tmp\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889106 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-etc-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889139 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-script-lib\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889166 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w48m\" (UniqueName: \"kubernetes.io/projected/09466950-dea7-48b9-b4a4-b9b73d845973-kube-api-access-6w48m\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889234 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889207 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-run\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-host\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889265 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmr2\" (UniqueName: \"kubernetes.io/projected/ba39ce75-7939-4e19-9800-7072765d139b-kube-api-access-7mmr2\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-var-lib-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889332 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-netd\") pod \"ovnkube-node-nsv52\" (UID: 
\"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-config\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889407 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84sd2\"" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889426 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-kubernetes\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889574 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-conf\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.889658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889623 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-etc-tuned\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.890632 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889679 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889722 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-systemd-units\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889757 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-netns\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889821 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-env-overrides\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889957 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09466950-dea7-48b9-b4a4-b9b73d845973-ovn-node-metrics-cert\") pod \"ovnkube-node-nsv52\" (UID: 
\"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.889990 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/203ce5e3-84d7-4528-841c-52a57d9ccb6e-iptables-alerter-script\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890019 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55b5r\" (UniqueName: \"kubernetes.io/projected/203ce5e3-84d7-4528-841c-52a57d9ccb6e-kube-api-access-55b5r\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890054 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-bin\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890160 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-systemd\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890200 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-var-lib-kubelet\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890236 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890268 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7177d49a-364c-4c44-9dd5-4fa101f99bdb-agent-certs\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890298 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-kubelet\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 
kubenswrapper[2562]: I0318 16:43:18.890322 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-slash\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.890632 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-modprobe-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890386 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890415 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/203ce5e3-84d7-4528-841c-52a57d9ccb6e-host-slash\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890447 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890476 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890504 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-lib-modules\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.891400 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.890550 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f42k\" (UniqueName: \"kubernetes.io/projected/008308d5-c00b-40b2-a413-eb4caebc48c0-kube-api-access-6f42k\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:18.891739 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.891716 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894032 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894146 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7z5fg\"" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894187 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894293 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894314 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.894758 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.894401 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:43:18.896726 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.896321 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.896726 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.896615 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:18.898147 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.898122 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:43:18.898404 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.898380 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:43:18.898493 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.898196 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bnwwn\"" Mar 18 16:43:18.898706 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.898686 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:43:18.898829 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.898808 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lstmt\"" Mar 18 16:43:18.899054 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.899038 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:43:18.923114 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.923079 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:17 +0000 UTC" deadline="2027-11-14 08:07:24.633252559 +0000 UTC" Mar 18 16:43:18.923114 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.923108 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14535h24m5.710149632s" Mar 18 16:43:18.955794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.955742 2562 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:18.977806 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.977780 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:43:18.984881 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.984840 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" event={"ID":"6da92965ac9d0f6a332c1d675e2f540d","Type":"ContainerStarted","Data":"6bda4b40a3290d9bacc184bbaba0bb0f9ade86d5629249bc73ce375ea3c8db21"} Mar 18 16:43:18.986152 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.986084 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"cc51fc8cf8e0664cd067f24564b101ef","Type":"ContainerStarted","Data":"7b8c77ed38e9ab3a66329b367a6af151b7fc7c39598da9d4d1d09e100648ed92"} Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990777 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f42k\" (UniqueName: \"kubernetes.io/projected/008308d5-c00b-40b2-a413-eb4caebc48c0-kube-api-access-6f42k\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990811 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7177d49a-364c-4c44-9dd5-4fa101f99bdb-konnectivity-ca\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990844 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cni-binary-copy\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990880 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-sys-fs\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990906 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-systemd\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.990957 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990929 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-log-socket\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysconfig\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.990993 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-tmp\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991018 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cnibin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-k8s-cni-cncf-io\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991077 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-conf-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991103 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-etc-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991126 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-run\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991148 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-host\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991171 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmr2\" (UniqueName: \"kubernetes.io/projected/ba39ce75-7939-4e19-9800-7072765d139b-kube-api-access-7mmr2\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991198 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-system-cni-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-os-release\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991246 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-etc-kubernetes\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991271 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991270 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991297 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-kubernetes\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991319 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991351 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991986 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991376 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-netns\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991403 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-multus-certs\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991447 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-registration-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991474 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09466950-dea7-48b9-b4a4-b9b73d845973-ovn-node-metrics-cert\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991479 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991501 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/203ce5e3-84d7-4528-841c-52a57d9ccb6e-iptables-alerter-script\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991533 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55b5r\" (UniqueName: \"kubernetes.io/projected/203ce5e3-84d7-4528-841c-52a57d9ccb6e-kube-api-access-55b5r\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991556 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-host\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-binary-copy\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991599 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-daemon-config\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991622 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-bin\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991645 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-serviceca\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991671 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-run\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.991986 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991671 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cnibin\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991720 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9jxjl\" (UniqueName: \"kubernetes.io/projected/dcfc6f29-52f0-4f09-b50d-f044f9886e51-kube-api-access-9jxjl\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991746 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-host\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991756 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-multus\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991781 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-socket-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991805 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/184966df-7341-412a-aede-a32364efc520-hosts-file\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:43:18.991844 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-kubelet\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-systemd\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991915 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-etc-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991957 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-bin\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.991991 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-kubernetes\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992024 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysconfig\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992025 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-slash\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992059 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-kubelet\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.992062 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992067 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-slash\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992098 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/203ce5e3-84d7-4528-841c-52a57d9ccb6e-host-slash\") pod \"iptables-alerter-s4vkz\" 
(UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.992803 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992145 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/203ce5e3-84d7-4528-841c-52a57d9ccb6e-host-slash\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.992157 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:19.492110127 +0000 UTC m=+3.178232487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992179 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992222 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7v7m\" (UniqueName: \"kubernetes.io/projected/184966df-7341-412a-aede-a32364efc520-kube-api-access-n7v7m\") pod \"node-resolver-zd7qk\" (UID: 
\"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992458 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-hostroot\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992532 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992538 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992588 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-lib-modules\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992621 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-socket-dir-parent\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992645 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-device-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992672 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-ovn\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-node-log\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-sys\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992778 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-script-lib\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992789 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-log-socket\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.993602 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992804 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w48m\" (UniqueName: \"kubernetes.io/projected/09466950-dea7-48b9-b4a4-b9b73d845973-kube-api-access-6w48m\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992824 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-ovn\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992833 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992857 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992895 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-var-lib-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992920 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-netd\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992966 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-config\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.992993 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-conf\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993017 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-etc-tuned\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-system-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993057 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7177d49a-364c-4c44-9dd5-4fa101f99bdb-konnectivity-ca\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993093 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-sys\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993147 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-systemd-units\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993161 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/203ce5e3-84d7-4528-841c-52a57d9ccb6e-iptables-alerter-script\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-netns\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993181 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-var-lib-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993208 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-env-overrides\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.994382 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993221 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-cni-netd\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993238 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zr5\" (UniqueName: \"kubernetes.io/projected/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-kube-api-access-59zr5\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993268 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-host-run-netns\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-conf\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993369 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-lib-modules\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993381 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-systemd-units\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993412 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-node-log\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993442 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993502 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-systemd\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993529 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-var-lib-kubelet\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993570 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7177d49a-364c-4c44-9dd5-4fa101f99bdb-agent-certs\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sj48\" (UniqueName: \"kubernetes.io/projected/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-kube-api-access-9sj48\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993645 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09466950-dea7-48b9-b4a4-b9b73d845973-run-openvswitch\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993680 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993698 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-var-lib-kubelet\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993697 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-script-lib\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993710 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-modprobe-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993738 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993736 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-ovnkube-config\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993752 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09466950-dea7-48b9-b4a4-b9b73d845973-env-overrides\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993808 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-modprobe-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993821 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-systemd\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-kubelet\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993916 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxplp\" (UniqueName: \"kubernetes.io/projected/ce59883f-2102-41a1-af45-7415b5af228f-kube-api-access-kxplp\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993923 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ba39ce75-7939-4e19-9800-7072765d139b-etc-sysctl-d\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-os-release\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.993998 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-bin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.994052 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/184966df-7341-412a-aede-a32364efc520-tmp-dir\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk"
Mar 18 16:43:18.995819 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.994842 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-tmp\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.996459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.995834 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ba39ce75-7939-4e19-9800-7072765d139b-etc-tuned\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:18.996459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.996377 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7177d49a-364c-4c44-9dd5-4fa101f99bdb-agent-certs\") pod \"konnectivity-agent-62kx2\" (UID: \"7177d49a-364c-4c44-9dd5-4fa101f99bdb\") " pod="kube-system/konnectivity-agent-62kx2"
Mar 18 16:43:18.996657 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:18.996633 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09466950-dea7-48b9-b4a4-b9b73d845973-ovn-node-metrics-cert\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:18.999476 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.998865 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:18.999476 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.998884 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:18.999476 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.998896 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:18.999476 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:18.998970 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:19.498929608 +0000 UTC m=+3.185051954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:19.000169 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.000107 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f42k\" (UniqueName: \"kubernetes.io/projected/008308d5-c00b-40b2-a413-eb4caebc48c0-kube-api-access-6f42k\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:19.001428 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.001388 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmr2\" (UniqueName: \"kubernetes.io/projected/ba39ce75-7939-4e19-9800-7072765d139b-kube-api-access-7mmr2\") pod \"tuned-bhcbr\" (UID: \"ba39ce75-7939-4e19-9800-7072765d139b\") " pod="openshift-cluster-node-tuning-operator/tuned-bhcbr"
Mar 18 16:43:19.001540 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.001518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55b5r\" (UniqueName: \"kubernetes.io/projected/203ce5e3-84d7-4528-841c-52a57d9ccb6e-kube-api-access-55b5r\") pod \"iptables-alerter-s4vkz\" (UID: \"203ce5e3-84d7-4528-841c-52a57d9ccb6e\") " pod="openshift-network-operator/iptables-alerter-s4vkz"
Mar 18 16:43:19.002237 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.002218 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w48m\" (UniqueName: \"kubernetes.io/projected/09466950-dea7-48b9-b4a4-b9b73d845973-kube-api-access-6w48m\") pod \"ovnkube-node-nsv52\" (UID: \"09466950-dea7-48b9-b4a4-b9b73d845973\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsv52"
Mar 18 16:43:19.095039 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095013 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-system-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59zr5\" (UniqueName: \"kubernetes.io/projected/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-kube-api-access-59zr5\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095075 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sj48\" (UniqueName: \"kubernetes.io/projected/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-kube-api-access-9sj48\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095127 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-kubelet\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095144 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-system-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095198 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095228 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-kubelet\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxplp\" (UniqueName: \"kubernetes.io/projected/ce59883f-2102-41a1-af45-7415b5af228f-kube-api-access-kxplp\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095312 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-os-release\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095346 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-bin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095362 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/184966df-7341-412a-aede-a32364efc520-tmp-dir\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095390 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-os-release\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095408 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-bin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095436 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cni-binary-copy\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095458 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-sys-fs\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095488 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cnibin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095510 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-k8s-cni-cncf-io\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095534 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-conf-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095563 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-system-cni-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095565 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-sys-fs\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095578 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cnibin\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-os-release\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095618 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-conf-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095625 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-etc-kubernetes\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095637 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-os-release\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095658 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-k8s-cni-cncf-io\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-system-cni-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095656 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095689 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095666 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/184966df-7341-412a-aede-a32364efc520-tmp-dir\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095728 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095681 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-etc-kubernetes\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.095896 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095762 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-netns\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095797 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-cni-dir\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-multus-certs\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095842 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-netns\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095853 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095869 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-registration-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095875 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-run-multus-certs\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095919 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-registration-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp"
Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095926 2562 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-host\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095970 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-binary-copy\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095976 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-host\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.095997 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-daemon-config\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-serviceca\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096046 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-cni-binary-copy\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096108 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cnibin\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jxjl\" (UniqueName: \"kubernetes.io/projected/dcfc6f29-52f0-4f09-b50d-f044f9886e51-kube-api-access-9jxjl\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096164 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-multus\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.096547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-socket-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096205 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cnibin\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096215 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/184966df-7341-412a-aede-a32364efc520-hosts-file\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096250 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096275 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7v7m\" (UniqueName: \"kubernetes.io/projected/184966df-7341-412a-aede-a32364efc520-kube-api-access-n7v7m\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-hostroot\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096326 2562 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-socket-dir-parent\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096352 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-device-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096383 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-serviceca\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096383 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096452 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-cni-binary-copy\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:43:19.096465 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-daemon-config\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096474 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-socket-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096506 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/184966df-7341-412a-aede-a32364efc520-hosts-file\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096513 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-hostroot\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096510 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096250 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-host-var-lib-cni-multus\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097131 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096536 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-multus-socket-dir-parent\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.097681 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096558 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce59883f-2102-41a1-af45-7415b5af228f-device-dir\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.097681 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.096817 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dcfc6f29-52f0-4f09-b50d-f044f9886e51-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.104414 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.104345 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zr5\" (UniqueName: \"kubernetes.io/projected/e010756a-9f4e-48da-b759-b8f2ff9d1f1b-kube-api-access-59zr5\") pod \"multus-whxl2\" (UID: \"e010756a-9f4e-48da-b759-b8f2ff9d1f1b\") " pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.104414 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:43:19.104399 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7v7m\" (UniqueName: \"kubernetes.io/projected/184966df-7341-412a-aede-a32364efc520-kube-api-access-n7v7m\") pod \"node-resolver-zd7qk\" (UID: \"184966df-7341-412a-aede-a32364efc520\") " pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:19.104574 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.104486 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxplp\" (UniqueName: \"kubernetes.io/projected/ce59883f-2102-41a1-af45-7415b5af228f-kube-api-access-kxplp\") pod \"aws-ebs-csi-driver-node-dtmbp\" (UID: \"ce59883f-2102-41a1-af45-7415b5af228f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.105204 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.105188 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sj48\" (UniqueName: \"kubernetes.io/projected/abad6a1d-3e28-4f96-90e8-383ed7b5b8e1-kube-api-access-9sj48\") pod \"node-ca-9t8zk\" (UID: \"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1\") " pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.105293 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.105228 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jxjl\" (UniqueName: \"kubernetes.io/projected/dcfc6f29-52f0-4f09-b50d-f044f9886e51-kube-api-access-9jxjl\") pod \"multus-additional-cni-plugins-qhxzz\" (UID: \"dcfc6f29-52f0-4f09-b50d-f044f9886e51\") " pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.189205 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.189172 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" Mar 18 16:43:19.198974 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.198935 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:19.216592 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.216570 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s4vkz" Mar 18 16:43:19.221262 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.221235 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:19.228838 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.228817 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" Mar 18 16:43:19.236363 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.236346 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zd7qk" Mar 18 16:43:19.243830 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.243811 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" Mar 18 16:43:19.251341 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.251325 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-whxl2" Mar 18 16:43:19.256899 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.256879 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9t8zk" Mar 18 16:43:19.498751 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.498667 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:19.498902 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.498786 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:19.498902 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.498850 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:20.498831647 +0000 UTC m=+4.184954002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:19.598959 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.598919 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:19.599111 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.599053 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:19.599111 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.599075 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:19.599111 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.599089 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:19.599218 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:19.599144 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:43:20.599129143 +0000 UTC m=+4.285251502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:19.761110 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.761078 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce59883f_2102_41a1_af45_7415b5af228f.slice/crio-4689befc9b7ef920226f9871c28d117074a25ba5571de9171222b4498da2f130 WatchSource:0}: Error finding container 4689befc9b7ef920226f9871c28d117074a25ba5571de9171222b4498da2f130: Status 404 returned error can't find the container with id 4689befc9b7ef920226f9871c28d117074a25ba5571de9171222b4498da2f130 Mar 18 16:43:19.765179 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.765153 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7177d49a_364c_4c44_9dd5_4fa101f99bdb.slice/crio-c135eff02ee452725a2de541a9cc25298882c23c319bc5514d664151f5c7204c WatchSource:0}: Error finding container c135eff02ee452725a2de541a9cc25298882c23c319bc5514d664151f5c7204c: Status 404 returned error can't find the container with id c135eff02ee452725a2de541a9cc25298882c23c319bc5514d664151f5c7204c Mar 18 16:43:19.767100 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.766880 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09466950_dea7_48b9_b4a4_b9b73d845973.slice/crio-c7a75b7be85e7f97534861a493624ed1808b5d3b3e3c99a027a4a76e45d58987 WatchSource:0}: Error finding container 
c7a75b7be85e7f97534861a493624ed1808b5d3b3e3c99a027a4a76e45d58987: Status 404 returned error can't find the container with id c7a75b7be85e7f97534861a493624ed1808b5d3b3e3c99a027a4a76e45d58987 Mar 18 16:43:19.767730 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.767710 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203ce5e3_84d7_4528_841c_52a57d9ccb6e.slice/crio-4ea3e254bb16722b7f8534933c70e8c80e5400a64f06254ec3de60fce824de4f WatchSource:0}: Error finding container 4ea3e254bb16722b7f8534933c70e8c80e5400a64f06254ec3de60fce824de4f: Status 404 returned error can't find the container with id 4ea3e254bb16722b7f8534933c70e8c80e5400a64f06254ec3de60fce824de4f Mar 18 16:43:19.768541 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.768456 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba39ce75_7939_4e19_9800_7072765d139b.slice/crio-7caeda4fb86400f7690b5d26df41fa10b32251c247790a84915faa27491e5392 WatchSource:0}: Error finding container 7caeda4fb86400f7690b5d26df41fa10b32251c247790a84915faa27491e5392: Status 404 returned error can't find the container with id 7caeda4fb86400f7690b5d26df41fa10b32251c247790a84915faa27491e5392 Mar 18 16:43:19.769596 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.769553 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184966df_7341_412a_aede_a32364efc520.slice/crio-0f1eb2d97515b8782e5db394d85634a54b4cd5b3768af7c17de781b628c39e52 WatchSource:0}: Error finding container 0f1eb2d97515b8782e5db394d85634a54b4cd5b3768af7c17de781b628c39e52: Status 404 returned error can't find the container with id 0f1eb2d97515b8782e5db394d85634a54b4cd5b3768af7c17de781b628c39e52 Mar 18 16:43:19.770678 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.770540 2562 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcfc6f29_52f0_4f09_b50d_f044f9886e51.slice/crio-6619af8d6ebca1ab1dcb1d4394d941d40367757c6affbab46a2e4f18923f850d WatchSource:0}: Error finding container 6619af8d6ebca1ab1dcb1d4394d941d40367757c6affbab46a2e4f18923f850d: Status 404 returned error can't find the container with id 6619af8d6ebca1ab1dcb1d4394d941d40367757c6affbab46a2e4f18923f850d Mar 18 16:43:19.771370 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.771288 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode010756a_9f4e_48da_b759_b8f2ff9d1f1b.slice/crio-6d9afc4f92b7ac665b487becb542682b7c1bb22885dfbc4cc7b96c3543f2e0bd WatchSource:0}: Error finding container 6d9afc4f92b7ac665b487becb542682b7c1bb22885dfbc4cc7b96c3543f2e0bd: Status 404 returned error can't find the container with id 6d9afc4f92b7ac665b487becb542682b7c1bb22885dfbc4cc7b96c3543f2e0bd Mar 18 16:43:19.795719 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:19.795687 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabad6a1d_3e28_4f96_90e8_383ed7b5b8e1.slice/crio-6c9132e41f9c9a456f824ba33b87a749e81796c3fe698610e27def3f3efad9b0 WatchSource:0}: Error finding container 6c9132e41f9c9a456f824ba33b87a749e81796c3fe698610e27def3f3efad9b0: Status 404 returned error can't find the container with id 6c9132e41f9c9a456f824ba33b87a749e81796c3fe698610e27def3f3efad9b0 Mar 18 16:43:19.924240 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.924210 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:17 +0000 UTC" deadline="2027-11-13 06:27:35.869321074 +0000 UTC" Mar 18 16:43:19.924240 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.924237 2562 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14509h44m15.945086382s"
Mar 18 16:43:19.989150 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.989117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" event={"ID":"6da92965ac9d0f6a332c1d675e2f540d","Type":"ContainerStarted","Data":"b996aa1a5314f0df29cc1fa04d3fb3fef1369036b733dd1cd9a3ef2fb520bb06"}
Mar 18 16:43:19.990315 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.990294 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9t8zk" event={"ID":"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1","Type":"ContainerStarted","Data":"6c9132e41f9c9a456f824ba33b87a749e81796c3fe698610e27def3f3efad9b0"}
Mar 18 16:43:19.991293 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.991274 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-whxl2" event={"ID":"e010756a-9f4e-48da-b759-b8f2ff9d1f1b","Type":"ContainerStarted","Data":"6d9afc4f92b7ac665b487becb542682b7c1bb22885dfbc4cc7b96c3543f2e0bd"}
Mar 18 16:43:19.992196 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.992174 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerStarted","Data":"6619af8d6ebca1ab1dcb1d4394d941d40367757c6affbab46a2e4f18923f850d"}
Mar 18 16:43:19.993133 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.993115 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zd7qk" event={"ID":"184966df-7341-412a-aede-a32364efc520","Type":"ContainerStarted","Data":"0f1eb2d97515b8782e5db394d85634a54b4cd5b3768af7c17de781b628c39e52"}
Mar 18 16:43:19.994148 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.994113 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"c7a75b7be85e7f97534861a493624ed1808b5d3b3e3c99a027a4a76e45d58987"}
Mar 18 16:43:19.994985 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.994961 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-62kx2" event={"ID":"7177d49a-364c-4c44-9dd5-4fa101f99bdb","Type":"ContainerStarted","Data":"c135eff02ee452725a2de541a9cc25298882c23c319bc5514d664151f5c7204c"}
Mar 18 16:43:19.996651 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.996421 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" event={"ID":"ba39ce75-7939-4e19-9800-7072765d139b","Type":"ContainerStarted","Data":"7caeda4fb86400f7690b5d26df41fa10b32251c247790a84915faa27491e5392"}
Mar 18 16:43:19.998147 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.998108 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s4vkz" event={"ID":"203ce5e3-84d7-4528-841c-52a57d9ccb6e","Type":"ContainerStarted","Data":"4ea3e254bb16722b7f8534933c70e8c80e5400a64f06254ec3de60fce824de4f"}
Mar 18 16:43:19.998932 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:19.998914 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" event={"ID":"ce59883f-2102-41a1-af45-7415b5af228f","Type":"ContainerStarted","Data":"4689befc9b7ef920226f9871c28d117074a25ba5571de9171222b4498da2f130"}
Mar 18 16:43:20.003678 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.003642 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-5.ec2.internal" podStartSLOduration=2.00363165 podStartE2EDuration="2.00363165s" podCreationTimestamp="2026-03-18 16:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:20.003535485 +0000 UTC m=+3.689657846" watchObservedRunningTime="2026-03-18 16:43:20.00363165 +0000 UTC m=+3.689754028"
Mar 18 16:43:20.506521 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.506373 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:20.506682 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.506529 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:20.506682 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.506589 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:22.506572144 +0000 UTC m=+6.192694500 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:20.607126 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.607052 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:20.607231 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.607195 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:20.607231 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.607216 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:20.607231 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.607228 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:20.607371 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.607283 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:22.607263619 +0000 UTC m=+6.293385977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:20.807750 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.807666 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:43:20.982316 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.982291 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:20.982745 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.982399 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:20.982818 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:20.982798 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:20.982926 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:20.982901 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:21.025363 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:21.024350 2562 generic.go:358] "Generic (PLEG): container finished" podID="cc51fc8cf8e0664cd067f24564b101ef" containerID="7fa3f6707eaab83d7f1f50c25f6a7f9806f898e0ef1ed427722ea5cc04322c2f" exitCode=0
Mar 18 16:43:21.025363 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:21.025158 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"cc51fc8cf8e0664cd067f24564b101ef","Type":"ContainerDied","Data":"7fa3f6707eaab83d7f1f50c25f6a7f9806f898e0ef1ed427722ea5cc04322c2f"}
Mar 18 16:43:22.053967 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.053916 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" event={"ID":"cc51fc8cf8e0664cd067f24564b101ef","Type":"ContainerStarted","Data":"64e47ac56cb925a846392796dcf29720d52fac0da3a89032b60b9d464e8507c5"}
Mar 18 16:43:22.073662 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.072949 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-5.ec2.internal" podStartSLOduration=4.072920289 podStartE2EDuration="4.072920289s" podCreationTimestamp="2026-03-18 16:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:22.072489549 +0000 UTC m=+5.758611911" watchObservedRunningTime="2026-03-18 16:43:22.072920289 +0000 UTC m=+5.759042653"
Mar 18 16:43:22.521688 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.521613 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:22.521840 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.521763 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:22.521840 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.521830 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:26.521810521 +0000 UTC m=+10.207932860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:22.622490 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.622452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:22.622672 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.622639 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:22.622672 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.622656 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:22.622672 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.622668 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:22.622843 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.622721 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:26.622702194 +0000 UTC m=+10.308824587 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:22.982517 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.982444 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:22.982673 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:22.982444 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:22.982673 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.982587 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:22.982673 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:22.982645 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:24.980168 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:24.980129 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:24.980601 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:24.980243 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:24.980601 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:24.980140 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:24.980712 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:24.980674 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:26.552723 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:26.552620 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:26.553206 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.552753 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:26.553206 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.552834 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:34.552801258 +0000 UTC m=+18.238923611 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:26.653593 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:26.653553 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:26.653759 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.653746 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:26.653836 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.653766 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:26.653836 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.653779 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:26.654129 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.653838 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:34.653818449 +0000 UTC m=+18.339940805 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:26.980920 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:26.980813 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:26.981085 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.981037 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:26.981085 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:26.981065 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:26.981214 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:26.981155 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:28.980058 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:28.979725 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:28.980058 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:28.979733 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:28.980058 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:28.979872 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:28.980058 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:28.979918 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:29.713789 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.713758 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cck66"]
Mar 18 16:43:29.717263 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.717243 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.717395 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:29.717310 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192"
Mar 18 16:43:29.778616 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.778575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.778616 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.778623 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-dbus\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.778820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.778663 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-kubelet-config\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879200 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.879099 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-dbus\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879200 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.879159 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-kubelet-config\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879407 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.879230 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879407 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.879314 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-dbus\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879407 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:29.879322 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-kubelet-config\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:29.879407 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:29.879353 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:29.879584 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:29.879412 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:30.379393198 +0000 UTC m=+14.065515552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:30.382662 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:30.382628 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:30.383040 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:30.382752 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:30.383040 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:30.382806 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:31.382792893 +0000 UTC m=+15.068915237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:30.980247 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:30.980207 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:30.980405 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:30.980210 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:30.980405 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:30.980311 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192"
Mar 18 16:43:30.980486 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:30.980218 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:30.980486 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:30.980408 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:30.980560 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:30.980534 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:31.390450 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:31.390367 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:31.390836 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:31.390537 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:31.390836 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:31.390589 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:33.390575521 +0000 UTC m=+17.076697861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:32.980389 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:32.980351 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:32.980389 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:32.980398 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:32.980888 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:32.980426 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:32.980888 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:32.980506 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0"
Mar 18 16:43:32.980888 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:32.980553 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192"
Mar 18 16:43:32.980888 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:32.980636 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06"
Mar 18 16:43:33.406080 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:33.405993 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:33.406280 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:33.406144 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:33.406280 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:33.406214 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:37.406193686 +0000 UTC m=+21.092316028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:43:34.614569 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:34.614529 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:34.615043 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.614697 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:34.615043 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.614769 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:50.614749589 +0000 UTC m=+34.300871930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:34.715273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:34.715245 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:34.715410 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.715368 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:34.715410 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.715382 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:34.715410 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.715390 2562 projected.go:194] Error preparing data for projected volume kube-api-access-z259z for pod openshift-network-diagnostics/network-check-target-mnsnh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:34.715574 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.715443 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z podName:473b4404-da3f-4c35-92c4-a69465dc3f06 nodeName:}" failed.
No retries permitted until 2026-03-18 16:43:50.715429602 +0000 UTC m=+34.401551941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z259z" (UniqueName: "kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z") pod "network-check-target-mnsnh" (UID: "473b4404-da3f-4c35-92c4-a69465dc3f06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:34.979846 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:34.979773 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:34.979846 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:34.979781 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:34.980098 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.979917 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:34.980098 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:34.979960 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:34.980098 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.980082 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:34.980243 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:34.980179 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:36.980263 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:36.979963 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:36.983294 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:36.981122 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:36.983294 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:36.981238 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:36.983294 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:36.981288 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:36.983294 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:36.981352 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:36.983294 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:36.981726 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:37.080862 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.080834 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9t8zk" event={"ID":"abad6a1d-3e28-4f96-90e8-383ed7b5b8e1","Type":"ContainerStarted","Data":"e105eccec61c30b33ceb0339617e8363b733c5e47d580f6bae9772dfdaa0ad30"} Mar 18 16:43:37.082173 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.082149 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-whxl2" event={"ID":"e010756a-9f4e-48da-b759-b8f2ff9d1f1b","Type":"ContainerStarted","Data":"32496f1004f9ecdb4fe1346a17fdab9485a1b45127bffe7b22a5a2d5b156e60e"} Mar 18 16:43:37.083230 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.083207 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" 
event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerStarted","Data":"281900d74aeb2510207ebcb4680ee8a0e8f6226e68f993801de061d44b10a738"} Mar 18 16:43:37.084310 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.084293 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zd7qk" event={"ID":"184966df-7341-412a-aede-a32364efc520","Type":"ContainerStarted","Data":"1324c72a55f825720c0d6282a62abfc1348ad5e5b9c876dcecb035f4a7ca8c7d"} Mar 18 16:43:37.085866 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.085850 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"55b25b5b7ee29774418941eb628eb72edbc6f7f71f8812b1dfe2f9c7b94244cc"} Mar 18 16:43:37.086847 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.086832 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-62kx2" event={"ID":"7177d49a-364c-4c44-9dd5-4fa101f99bdb","Type":"ContainerStarted","Data":"ed3f0d151e0f40c93086f401c1ecdb7033a961cb5f06d86e20ab277359c138e1"} Mar 18 16:43:37.088157 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.088138 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" event={"ID":"ba39ce75-7939-4e19-9800-7072765d139b","Type":"ContainerStarted","Data":"7b95c2cdc892119c608dee0f416d9921f38bbad0dbe759b48ba6195431d849db"} Mar 18 16:43:37.089608 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.089592 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" event={"ID":"ce59883f-2102-41a1-af45-7415b5af228f","Type":"ContainerStarted","Data":"34e21b172e39c0887bad66d1504e3022ab34d5e7c297c545a8e62c252ae15cb2"} Mar 18 16:43:37.095334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.095302 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-9t8zk" podStartSLOduration=3.126909263 podStartE2EDuration="20.095292922s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.798016973 +0000 UTC m=+3.484139313" lastFinishedPulling="2026-03-18 16:43:36.766400627 +0000 UTC m=+20.452522972" observedRunningTime="2026-03-18 16:43:37.095143309 +0000 UTC m=+20.781265671" watchObservedRunningTime="2026-03-18 16:43:37.095292922 +0000 UTC m=+20.781415333" Mar 18 16:43:37.107470 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.107436 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-62kx2" podStartSLOduration=3.107182352 podStartE2EDuration="20.107425978s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.766751074 +0000 UTC m=+3.452873414" lastFinishedPulling="2026-03-18 16:43:36.766994694 +0000 UTC m=+20.453117040" observedRunningTime="2026-03-18 16:43:37.107185061 +0000 UTC m=+20.793307424" watchObservedRunningTime="2026-03-18 16:43:37.107425978 +0000 UTC m=+20.793548339" Mar 18 16:43:37.139694 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.139649 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zd7qk" podStartSLOduration=3.171886004 podStartE2EDuration="20.139638177s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.792173518 +0000 UTC m=+3.478295864" lastFinishedPulling="2026-03-18 16:43:36.759925693 +0000 UTC m=+20.446048037" observedRunningTime="2026-03-18 16:43:37.139060646 +0000 UTC m=+20.825183007" watchObservedRunningTime="2026-03-18 16:43:37.139638177 +0000 UTC m=+20.825760542" Mar 18 16:43:37.155761 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.155729 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-whxl2" podStartSLOduration=3.130574218 
podStartE2EDuration="20.155720595s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.792208721 +0000 UTC m=+3.478331073" lastFinishedPulling="2026-03-18 16:43:36.817355095 +0000 UTC m=+20.503477450" observedRunningTime="2026-03-18 16:43:37.155304403 +0000 UTC m=+20.841426765" watchObservedRunningTime="2026-03-18 16:43:37.155720595 +0000 UTC m=+20.841842956" Mar 18 16:43:37.170610 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.170577 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bhcbr" podStartSLOduration=3.181186845 podStartE2EDuration="20.170567821s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.77054208 +0000 UTC m=+3.456664423" lastFinishedPulling="2026-03-18 16:43:36.759923059 +0000 UTC m=+20.446045399" observedRunningTime="2026-03-18 16:43:37.170460224 +0000 UTC m=+20.856582587" watchObservedRunningTime="2026-03-18 16:43:37.170567821 +0000 UTC m=+20.856690211" Mar 18 16:43:37.320976 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.320850 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:37.436797 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.436770 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:37.436890 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:37.436878 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:37.436933 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:37.436925 2562 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:45.436910211 +0000 UTC m=+29.123032560 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:37.883759 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.883698 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:37.884308 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:37.884289 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:38.092105 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.092077 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="281900d74aeb2510207ebcb4680ee8a0e8f6226e68f993801de061d44b10a738" exitCode=0 Mar 18 16:43:38.092601 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.092157 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"281900d74aeb2510207ebcb4680ee8a0e8f6226e68f993801de061d44b10a738"} Mar 18 16:43:38.094426 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094407 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log" Mar 18 16:43:38.094709 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094684 2562 generic.go:358] "Generic (PLEG): 
container finished" podID="09466950-dea7-48b9-b4a4-b9b73d845973" containerID="b12853a709a05e8df84c7074883f9659463bc4c4756be185ee67c9c72d995dcb" exitCode=1 Mar 18 16:43:38.094794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094712 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"05fc324ce336fef3346aeff9c50419e34aac1191374d20b66931912562a26318"} Mar 18 16:43:38.094794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094735 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"7b1293f48cb6110ae6f4d3949acbe83d7edfc15656f991c7119593d77e5933fd"} Mar 18 16:43:38.094794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094755 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"d2650f23b6855df9853fdfbe78b23b02792e8c254927b70c2f1422b76f148c9c"} Mar 18 16:43:38.094794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094769 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"835dfbf96b54422e33efde72e661a617941d60ec8504c90f24483740cff28a7c"} Mar 18 16:43:38.094794 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.094780 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerDied","Data":"b12853a709a05e8df84c7074883f9659463bc4c4756be185ee67c9c72d995dcb"} Mar 18 16:43:38.095914 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.095897 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-s4vkz" event={"ID":"203ce5e3-84d7-4528-841c-52a57d9ccb6e","Type":"ContainerStarted","Data":"53fb25a430ccf8200baf66a6a721446e9a26fc2d3f8a058c9f8b68ae44b5c44f"} Mar 18 16:43:38.096678 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.096661 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-62kx2" Mar 18 16:43:38.513846 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.513815 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:43:38.952064 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.951954 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:43:38.513837359Z","UUID":"2e9f0255-d45f-41cd-89ea-a118fb842d8b","Handler":null,"Name":"","Endpoint":""} Mar 18 16:43:38.954531 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.954505 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:43:38.954654 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.954546 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:43:38.979974 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.979933 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:38.980087 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:38.980056 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:38.980087 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.979933 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:38.980195 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:38.980155 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:38.980244 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:38.980226 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:38.980313 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:38.980290 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:39.100132 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:39.099971 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" event={"ID":"ce59883f-2102-41a1-af45-7415b5af228f","Type":"ContainerStarted","Data":"694e52107da2c2c16c3c91fe05185e4c75d623ee78005cec4410197d33f5af39"} Mar 18 16:43:39.114137 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:39.114092 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s4vkz" podStartSLOduration=5.123322182 podStartE2EDuration="22.114078574s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.769268271 +0000 UTC m=+3.455390611" lastFinishedPulling="2026-03-18 16:43:36.760024663 +0000 UTC m=+20.446147003" observedRunningTime="2026-03-18 16:43:39.113931572 +0000 UTC m=+22.800053948" watchObservedRunningTime="2026-03-18 16:43:39.114078574 +0000 UTC m=+22.800200936" Mar 18 16:43:40.104969 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.104731 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log" Mar 18 16:43:40.105478 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.105368 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"5afaf7af27c1991f3d530af62eb401cf4d93372e3b89910d093b3c934da66fa2"} Mar 18 16:43:40.107347 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.107299 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" 
event={"ID":"ce59883f-2102-41a1-af45-7415b5af228f","Type":"ContainerStarted","Data":"138fa5d700f45d6b375b8e20a4cd8730febeb068e0146740e3d939e49b8e66e3"} Mar 18 16:43:40.126587 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.126542 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtmbp" podStartSLOduration=3.204934302 podStartE2EDuration="23.126526305s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.764011346 +0000 UTC m=+3.450133686" lastFinishedPulling="2026-03-18 16:43:39.685603346 +0000 UTC m=+23.371725689" observedRunningTime="2026-03-18 16:43:40.126100766 +0000 UTC m=+23.812223128" watchObservedRunningTime="2026-03-18 16:43:40.126526305 +0000 UTC m=+23.812648668" Mar 18 16:43:40.979693 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.979661 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:40.979693 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.979693 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:40.979925 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:40.979669 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:40.979925 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:40.979785 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:40.980397 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:40.980259 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:40.980397 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:40.980354 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:42.980788 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:42.980626 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:42.981537 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:42.980630 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:42.981537 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:42.980845 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:42.981537 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:42.980649 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:42.981537 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:42.980890 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:42.981537 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:42.980994 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:43.114772 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.114745 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log" Mar 18 16:43:43.115098 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.115077 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"e62017b7da377d3ca4b857529f1e8ac97dcabd6c2d34fb576ae4fe665b7ee720"} Mar 18 16:43:43.115416 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.115396 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:43.115542 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.115436 2562 scope.go:117] "RemoveContainer" containerID="b12853a709a05e8df84c7074883f9659463bc4c4756be185ee67c9c72d995dcb" Mar 18 16:43:43.115542 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.115478 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:43.116809 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.116786 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="0b66d92521f404229443b478da82b1b5205ca629549095422bb59fddd2dcb1ed" exitCode=0 Mar 18 16:43:43.116899 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.116811 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"0b66d92521f404229443b478da82b1b5205ca629549095422bb59fddd2dcb1ed"} Mar 18 16:43:43.130637 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.130607 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:43.131148 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:43.131135 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:44.111404 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.111338 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cck66"] Mar 18 16:43:44.111707 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.111442 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:44.111707 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:44.111515 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:44.114437 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.114415 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mnsnh"] Mar 18 16:43:44.114523 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.114502 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:44.114583 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:44.114567 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:44.117538 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.117518 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dlnjg"] Mar 18 16:43:44.117627 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.117606 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:44.117717 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:44.117698 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:44.122338 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.122322 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log" Mar 18 16:43:44.122747 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.122720 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" event={"ID":"09466950-dea7-48b9-b4a4-b9b73d845973","Type":"ContainerStarted","Data":"ec7cf9c512eb895fabb48cb667717d968da1fbee6fac071e1acbb9dda037805a"} Mar 18 16:43:44.122839 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.122801 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:43:44.124783 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.124758 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="defb12e9a05d9a102c8b5c1a7cca37abed152b109b970f8164ef0c2b3b8d7ac0" exitCode=0 
Mar 18 16:43:44.124856 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.124792 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"defb12e9a05d9a102c8b5c1a7cca37abed152b109b970f8164ef0c2b3b8d7ac0"} Mar 18 16:43:44.154344 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:44.154304 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" podStartSLOduration=10.08497879 podStartE2EDuration="27.154292481s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.76878323 +0000 UTC m=+3.454905574" lastFinishedPulling="2026-03-18 16:43:36.83809691 +0000 UTC m=+20.524219265" observedRunningTime="2026-03-18 16:43:44.151233928 +0000 UTC m=+27.837356301" watchObservedRunningTime="2026-03-18 16:43:44.154292481 +0000 UTC m=+27.840414843" Mar 18 16:43:45.128453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.128393 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="3687bc5d6d947a706e595cb603347982587929e00a78ef21fc427ffa103ef277" exitCode=0 Mar 18 16:43:45.128747 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.128479 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"3687bc5d6d947a706e595cb603347982587929e00a78ef21fc427ffa103ef277"} Mar 18 16:43:45.128747 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.128694 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:43:45.506715 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.506645 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:45.506830 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:45.506749 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:45.506830 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:45.506794 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret podName:9a6bdc88-7df5-45fd-98ac-2f967cc2f192 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:01.506781427 +0000 UTC m=+45.192903767 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret") pod "global-pull-secret-syncer-cck66" (UID: "9a6bdc88-7df5-45fd-98ac-2f967cc2f192") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:45.979903 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.979869 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:45.980021 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:45.979974 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:45.980141 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.980117 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:45.980257 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:45.980157 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:45.980257 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:45.980241 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:45.980365 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:45.980309 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:47.980916 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:47.980696 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:43:47.981357 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:47.980696 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:43:47.981357 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:47.980702 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:43:47.981357 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:47.981030 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cck66" podUID="9a6bdc88-7df5-45fd-98ac-2f967cc2f192" Mar 18 16:43:47.981357 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:47.981133 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mnsnh" podUID="473b4404-da3f-4c35-92c4-a69465dc3f06" Mar 18 16:43:47.981357 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:47.981227 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dlnjg" podUID="008308d5-c00b-40b2-a413-eb4caebc48c0" Mar 18 16:43:49.053420 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.053382 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:49.053960 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.053637 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 16:43:49.068206 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.068174 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nsv52" Mar 18 16:43:49.614685 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.614661 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-5.ec2.internal" event="NodeReady" Mar 18 16:43:49.614829 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.614778 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:43:49.649096 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.649068 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"] Mar 18 16:43:49.683298 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.683153 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"] Mar 18 16:43:49.683881 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.683550 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" Mar 18 16:43:49.685658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.685637 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Mar 18 16:43:49.685761 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.685706 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Mar 18 16:43:49.685838 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.685821 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Mar 18 16:43:49.685889 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.685836 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Mar 18 16:43:49.685952 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.685912 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-nfg9v\"" Mar 18 16:43:49.700955 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.700925 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"] Mar 18 16:43:49.701089 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.701069 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.702759 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.702739 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:43:49.702852 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.702777 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:43:49.703183 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.703148 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:43:49.703813 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.703544 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-27ljg\"" Mar 18 16:43:49.708303 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.708279 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:43:49.722128 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.722071 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"] Mar 18 16:43:49.722226 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.722207 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:43:49.724026 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.723979 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Mar 18 16:43:49.741017 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.740998 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"] Mar 18 16:43:49.741104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.741023 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"] Mar 18 16:43:49.741104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.741035 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"] Mar 18 16:43:49.741104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.741062 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"] Mar 18 16:43:49.741104 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.741080 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t8ccf"] Mar 18 16:43:49.741279 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.741150 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.743167 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.743147 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Mar 18 16:43:49.743247 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.743198 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Mar 18 16:43:49.743247 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.743239 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Mar 18 16:43:49.743349 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.743302 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Mar 18 16:43:49.759609 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.759594 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xwx7z"] Mar 18 16:43:49.759765 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.759750 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:43:49.761631 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.761611 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:43:49.761631 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.761630 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:43:49.761781 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.761754 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:43:49.762116 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.762098 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkmdp\"" Mar 18 16:43:49.774832 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.774816 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t8ccf"] Mar 18 16:43:49.774832 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.774835 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xwx7z"] Mar 18 16:43:49.774984 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.774912 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xwx7z" Mar 18 16:43:49.776744 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.776716 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:43:49.776825 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.776748 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bwt6\"" Mar 18 16:43:49.776825 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.776762 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:43:49.846522 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.846522 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846396 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:43:49.846522 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846437 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpx7\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 
16:43:49.846522 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846478 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vvd6\" (UniqueName: \"kubernetes.io/projected/90d8c52e-5d9f-4f3e-960d-eac042763564-kube-api-access-6vvd6\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.846522 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/736b2881-9edf-4b48-bf49-24dcdaf409a2-klusterlet-config\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846611 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736b2881-9edf-4b48-bf49-24dcdaf409a2-tmp\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846637 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846709 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846769 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.846820 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846808 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846902 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846966 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72d8d7bb-cf42-421f-94be-19b1c6f8778b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.846997 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhp6w\" (UniqueName: \"kubernetes.io/projected/72d8d7bb-cf42-421f-94be-19b1c6f8778b-kube-api-access-lhp6w\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") 
pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847100 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvqs2\" (UniqueName: \"kubernetes.io/projected/736b2881-9edf-4b48-bf49-24dcdaf409a2-kube-api-access-nvqs2\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:43:49.847163 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847134 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" Mar 18 16:43:49.847468 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847188 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/90d8c52e-5d9f-4f3e-960d-eac042763564-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.847468 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.847224 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlpr\" (UniqueName: \"kubernetes.io/projected/4e96119f-c454-4e76-a758-9be470a94be8-kube-api-access-mjlpr\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:49.947748 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947715 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.947883 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947768 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72d8d7bb-cf42-421f-94be-19b1c6f8778b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"
Mar 18 16:43:49.947883 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhp6w\" (UniqueName: \"kubernetes.io/projected/72d8d7bb-cf42-421f-94be-19b1c6f8778b-kube-api-access-lhp6w\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"
Mar 18 16:43:49.947883 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947822 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.947883 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947851 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.947883 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947873 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvqs2\" (UniqueName: \"kubernetes.io/projected/736b2881-9edf-4b48-bf49-24dcdaf409a2-kube-api-access-nvqs2\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947897 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947927 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a05d5d2-a005-4847-b2aa-0f78e327686a-tmp-dir\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/90d8c52e-5d9f-4f3e-960d-eac042763564-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.947996 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlpr\" (UniqueName: \"kubernetes.io/projected/4e96119f-c454-4e76-a758-9be470a94be8-kube-api-access-mjlpr\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948022 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p928\" (UniqueName: \"kubernetes.io/projected/3a05d5d2-a005-4847-b2aa-0f78e327686a-kube-api-access-4p928\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948043 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948058 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948087 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gpx7\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948118 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vvd6\" (UniqueName: \"kubernetes.io/projected/90d8c52e-5d9f-4f3e-960d-eac042763564-kube-api-access-6vvd6\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948147 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948172 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948245 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/736b2881-9edf-4b48-bf49-24dcdaf409a2-klusterlet-config\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948276 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736b2881-9edf-4b48-bf49-24dcdaf409a2-tmp\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948331 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a05d5d2-a005-4847-b2aa-0f78e327686a-config-volume\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948364 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948392 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948427 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.948606 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.949033 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.948875 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.949318 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:49.949295 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:43:49.949318 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:49.949317 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found
Mar 18 16:43:49.949469 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:49.949373 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:50.449354432 +0000 UTC m=+34.135476776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found
Mar 18 16:43:49.949469 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.949408 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.949856 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:49.949835 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:43:49.949974 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:49.949890 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:50.449872182 +0000 UTC m=+34.135994522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found
Mar 18 16:43:49.949974 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.949915 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.950327 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.950231 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/90d8c52e-5d9f-4f3e-960d-eac042763564-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.950582 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.950559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736b2881-9edf-4b48-bf49-24dcdaf409a2-tmp\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.952842 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.952818 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.953307 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.953280 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.953402 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.953355 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.953453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.953420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.953453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.953428 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/736b2881-9edf-4b48-bf49-24dcdaf409a2-klusterlet-config\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.953547 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.953494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72d8d7bb-cf42-421f-94be-19b1c6f8778b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"
Mar 18 16:43:49.961018 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.960975 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlpr\" (UniqueName: \"kubernetes.io/projected/4e96119f-c454-4e76-a758-9be470a94be8-kube-api-access-mjlpr\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:49.961921 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.961490 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vvd6\" (UniqueName: \"kubernetes.io/projected/90d8c52e-5d9f-4f3e-960d-eac042763564-kube-api-access-6vvd6\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.961921 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.961495 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpx7\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.961921 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.961581 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhp6w\" (UniqueName: \"kubernetes.io/projected/72d8d7bb-cf42-421f-94be-19b1c6f8778b-kube-api-access-lhp6w\") pod \"managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh\" (UID: \"72d8d7bb-cf42-421f-94be-19b1c6f8778b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"
Mar 18 16:43:49.961921 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.961884 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:49.962322 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.962299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvqs2\" (UniqueName: \"kubernetes.io/projected/736b2881-9edf-4b48-bf49-24dcdaf409a2-kube-api-access-nvqs2\") pod \"klusterlet-addon-workmgr-c66b889bd-d58k4\" (UID: \"736b2881-9edf-4b48-bf49-24dcdaf409a2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:49.963465 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.963422 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-ca\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.964322 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.964299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90d8c52e-5d9f-4f3e-960d-eac042763564-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7f99d8fb77-f6mvg\" (UID: \"90d8c52e-5d9f-4f3e-960d-eac042763564\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:49.980137 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.980084 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:49.980233 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.980213 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:49.980295 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.980253 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66"
Mar 18 16:43:49.982273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982249 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:43:49.982340 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982280 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:43:49.982340 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982319 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nk2x6\""
Mar 18 16:43:49.982453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982437 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:43:49.982560 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982540 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 18 16:43:49.982560 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:49.982552 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\""
Mar 18 16:43:50.004166 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.004141 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"
Mar 18 16:43:50.032353 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.032315 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"
Mar 18 16:43:50.049149 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049125 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a05d5d2-a005-4847-b2aa-0f78e327686a-tmp-dir\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.049255 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049160 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p928\" (UniqueName: \"kubernetes.io/projected/3a05d5d2-a005-4847-b2aa-0f78e327686a-kube-api-access-4p928\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.049375 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049350 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.049453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049413 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a05d5d2-a005-4847-b2aa-0f78e327686a-config-volume\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.049512 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049447 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a05d5d2-a005-4847-b2aa-0f78e327686a-tmp-dir\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.049512 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.049492 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:43:50.049583 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.049551 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. No retries permitted until 2026-03-18 16:43:50.549530761 +0000 UTC m=+34.235653127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found
Mar 18 16:43:50.049987 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.049963 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a05d5d2-a005-4847-b2aa-0f78e327686a-config-volume\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.051110 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.051093 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:43:50.060200 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.060182 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p928\" (UniqueName: \"kubernetes.io/projected/3a05d5d2-a005-4847-b2aa-0f78e327686a-kube-api-access-4p928\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.452433 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.452361 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:50.452433 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.452401 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:50.452623 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.452522 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:43:50.452623 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.452526 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:43:50.452623 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.452545 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found
Mar 18 16:43:50.452623 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.452584 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:51.452570163 +0000 UTC m=+35.138692503 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found
Mar 18 16:43:50.452623 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.452597 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:51.452591653 +0000 UTC m=+35.138713993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found
Mar 18 16:43:50.552719 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.552689 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:50.552893 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.552849 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:43:50.552964 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.552917 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. No retries permitted until 2026-03-18 16:43:51.552897856 +0000 UTC m=+35.239020196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found
Mar 18 16:43:50.653479 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.653444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg"
Mar 18 16:43:50.653660 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.653603 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:43:50.653704 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:50.653661 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs podName:008308d5-c00b-40b2-a413-eb4caebc48c0 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.653647716 +0000 UTC m=+66.339770056 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs") pod "network-metrics-daemon-dlnjg" (UID: "008308d5-c00b-40b2-a413-eb4caebc48c0") : secret "metrics-daemon-secret" not found
Mar 18 16:43:50.754750 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.754727 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:50.757216 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.757186 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z259z\" (UniqueName: \"kubernetes.io/projected/473b4404-da3f-4c35-92c4-a69465dc3f06-kube-api-access-z259z\") pod \"network-check-target-mnsnh\" (UID: \"473b4404-da3f-4c35-92c4-a69465dc3f06\") " pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:50.891408 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.891251 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mnsnh"
Mar 18 16:43:50.907308 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.907284 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4"]
Mar 18 16:43:50.925753 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.925727 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh"]
Mar 18 16:43:50.932047 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:50.931975 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"]
Mar 18 16:43:51.028383 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.028310 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mnsnh"]
Mar 18 16:43:51.034050 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:51.034017 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod736b2881_9edf_4b48_bf49_24dcdaf409a2.slice/crio-59ca5d4bb65965611b18832568d15968f975471994bcc9da1f70ad8fe3e61fde WatchSource:0}: Error finding container 59ca5d4bb65965611b18832568d15968f975471994bcc9da1f70ad8fe3e61fde: Status 404 returned error can't find the container with id 59ca5d4bb65965611b18832568d15968f975471994bcc9da1f70ad8fe3e61fde
Mar 18 16:43:51.034695 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:51.034671 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d8d7bb_cf42_421f_94be_19b1c6f8778b.slice/crio-afcfc6ef829f6d56e058a2340c394fecd2b3b45ffcacc6388a9e68f153919178 WatchSource:0}: Error finding container afcfc6ef829f6d56e058a2340c394fecd2b3b45ffcacc6388a9e68f153919178: Status 404 returned error can't find the container with id afcfc6ef829f6d56e058a2340c394fecd2b3b45ffcacc6388a9e68f153919178
Mar 18 16:43:51.035655 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:51.035634 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90d8c52e_5d9f_4f3e_960d_eac042763564.slice/crio-627adb3b9fb9b86c444856ff61b2592a84eb0c568d60720124c9aa8ede94be57 WatchSource:0}: Error finding container 627adb3b9fb9b86c444856ff61b2592a84eb0c568d60720124c9aa8ede94be57: Status 404 returned error can't find the container with id 627adb3b9fb9b86c444856ff61b2592a84eb0c568d60720124c9aa8ede94be57
Mar 18 16:43:51.036276 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:51.036258 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473b4404_da3f_4c35_92c4_a69465dc3f06.slice/crio-3c83a629b45c0766f38668ad23ae35fa578e740c21daee5ef0ce50a3fc604da8 WatchSource:0}: Error finding container 3c83a629b45c0766f38668ad23ae35fa578e740c21daee5ef0ce50a3fc604da8: Status 404 returned error can't find the container with id 3c83a629b45c0766f38668ad23ae35fa578e740c21daee5ef0ce50a3fc604da8
Mar 18 16:43:51.141477 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.141437 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" event={"ID":"736b2881-9edf-4b48-bf49-24dcdaf409a2","Type":"ContainerStarted","Data":"59ca5d4bb65965611b18832568d15968f975471994bcc9da1f70ad8fe3e61fde"}
Mar 18 16:43:51.142718 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.142691 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerStarted","Data":"627adb3b9fb9b86c444856ff61b2592a84eb0c568d60720124c9aa8ede94be57"}
Mar 18 16:43:51.143654 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.143627 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" event={"ID":"72d8d7bb-cf42-421f-94be-19b1c6f8778b","Type":"ContainerStarted","Data":"afcfc6ef829f6d56e058a2340c394fecd2b3b45ffcacc6388a9e68f153919178"}
Mar 18 16:43:51.144609 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.144590 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mnsnh" event={"ID":"473b4404-da3f-4c35-92c4-a69465dc3f06","Type":"ContainerStarted","Data":"3c83a629b45c0766f38668ad23ae35fa578e740c21daee5ef0ce50a3fc604da8"}
Mar 18 16:43:51.462004 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.461978 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:51.462116 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.462018 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:51.462157 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.462118 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:43:51.462157 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.462123 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:43:51.462157 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.462142 2562 projected.go:194] Error
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found Mar 18 16:43:51.462244 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.462166 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.462154004 +0000 UTC m=+37.148276348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found Mar 18 16:43:51.462244 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.462185 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.462171375 +0000 UTC m=+37.148293714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found Mar 18 16:43:51.562627 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.562562 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z" Mar 18 16:43:51.562713 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.562698 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:43:51.562768 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:51.562757 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.562739621 +0000 UTC m=+37.248861967 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found Mar 18 16:43:51.995991 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:51.994874 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"] Mar 18 16:43:52.025453 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.024563 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99"] Mar 18 16:43:52.040964 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.040530 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-745d85b678-r6ghs"] Mar 18 16:43:52.040964 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.040624 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.040964 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.040777 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.042915 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.042895 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.043295 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.043523 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.043702 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.043866 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.044071 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vwh7v\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.044512 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.044720 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.045760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.044964 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x49nm\"" Mar 18 16:43:52.065173 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.065103 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99"] Mar 18 16:43:52.065173 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.065134 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"] Mar 18 16:43:52.065173 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.065148 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-745d85b678-r6ghs"] Mar 18 16:43:52.065369 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.065223 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.068689 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.068153 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.068689 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.068198 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-ww6vh\"" Mar 18 16:43:52.068689 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.068253 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.068689 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.068529 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 18 16:43:52.069841 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.069823 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 18 16:43:52.070321 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.070101 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 18 16:43:52.071148 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.070984 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Mar 18 16:43:52.072150 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.071276 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn"] Mar 18 16:43:52.083733 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.083358 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" Mar 18 16:43:52.087372 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.087216 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qbmlt\"" Mar 18 16:43:52.090889 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.090855 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn"] Mar 18 16:43:52.093789 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.093252 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm"] Mar 18 16:43:52.108577 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.108008 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" Mar 18 16:43:52.111637 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.111319 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-kpr46"] Mar 18 16:43:52.119724 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.119703 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rftr8\"" Mar 18 16:43:52.123444 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.123087 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.123444 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.123184 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-nns88"] Mar 18 16:43:52.123444 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.123283 2562 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.124476 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.123750 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.137099 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.137078 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"] Mar 18 16:43:52.137877 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.137321 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.142170 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.141439 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Mar 18 16:43:52.142170 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.141695 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qfskh\"" Mar 18 16:43:52.146087 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.145934 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 18 16:43:52.147351 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.146201 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.147351 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.146284 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Mar 18 16:43:52.147351 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.146812 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"operator-dockercfg-d2gbv\"" Mar 18 16:43:52.147351 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.147029 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.147351 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.147203 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 18 16:43:52.150374 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.149215 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm"] Mar 18 16:43:52.150374 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.149240 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-nns88"] Mar 18 16:43:52.150374 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.149369 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.151459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.151323 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Mar 18 16:43:52.152066 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.151844 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.152066 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.151855 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Mar 18 16:43:52.152971 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.152646 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.153711 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.152153 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-chmrg\"" Mar 18 16:43:52.153711 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.153403 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-kpr46"] Mar 18 16:43:52.158070 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.158017 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"] Mar 18 16:43:52.167516 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.165383 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="01fa832febc0530ebf80b6fb70ae48b9899a0cb30688fa95dfbaa029e064ca03" exitCode=0 Mar 18 16:43:52.167516 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.165421 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"01fa832febc0530ebf80b6fb70ae48b9899a0cb30688fa95dfbaa029e064ca03"} Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.167826 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vnf\" (UniqueName: \"kubernetes.io/projected/355a3baf-f84e-464e-90ff-4cf4165ace30-kube-api-access-s8vnf\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.167899 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-stats-auth\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.167954 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355a3baf-f84e-464e-90ff-4cf4165ace30-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168029 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a3baf-f84e-464e-90ff-4cf4165ace30-config\") pod 
\"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168071 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5j2p\" (UniqueName: \"kubernetes.io/projected/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-kube-api-access-r5j2p\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168107 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168141 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " 
pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168271 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbj2\" (UniqueName: \"kubernetes.io/projected/fee6fee0-2356-450e-9cf1-5d0c7cd03239-kube-api-access-bbbj2\") pod \"network-check-source-cc88fdd44-h6jqn\" (UID: \"fee6fee0-2356-450e-9cf1-5d0c7cd03239\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168379 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-default-certificate\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.168626 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.168451 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7xz\" (UniqueName: \"kubernetes.io/projected/de591805-36d1-4a59-a89f-2f9aca624e2e-kube-api-access-xc7xz\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.171918 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.171890 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Mar 18 16:43:52.205907 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.205886 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"] Mar 18 16:43:52.245187 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.244069 2562 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console-operator/console-operator-76b8565867-jr9mc"] Mar 18 16:43:52.269743 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.269677 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.269743 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.269735 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-stats-auth\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.269902 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.269767 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355a3baf-f84e-464e-90ff-4cf4165ace30-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.269902 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.269824 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-service-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270030 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:43:52.269978 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5ffa71b2-a86e-4662-adad-d3882c534d0f-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.270030 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270011 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjwz\" (UniqueName: \"kubernetes.io/projected/5ffa71b2-a86e-4662-adad-d3882c534d0f-kube-api-access-kdjwz\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.270137 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270047 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a3baf-f84e-464e-90ff-4cf4165ace30-config\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.270137 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270094 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.270137 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270121 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-snapshots\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270143 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-serving-cert\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5j2p\" (UniqueName: \"kubernetes.io/projected/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-kube-api-access-r5j2p\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270228 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270286 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlxd\" (UniqueName: \"kubernetes.io/projected/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-kube-api-access-whlxd\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270334 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270331 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270361 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbj2\" (UniqueName: \"kubernetes.io/projected/fee6fee0-2356-450e-9cf1-5d0c7cd03239-kube-api-access-bbbj2\") pod \"network-check-source-cc88fdd44-h6jqn\" (UID: \"fee6fee0-2356-450e-9cf1-5d0c7cd03239\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270395 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-default-certificate\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270424 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lm7c\" (UniqueName: \"kubernetes.io/projected/70783292-1426-4298-9aae-22ccd0c24067-kube-api-access-7lm7c\") pod \"volume-data-source-validator-67fdcb5769-cwqdm\" (UID: \"70783292-1426-4298-9aae-22ccd0c24067\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270458 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e56ca22-2821-4212-ad13-a7f1080372b4-nginx-conf\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270486 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7xz\" (UniqueName: \"kubernetes.io/projected/de591805-36d1-4a59-a89f-2f9aca624e2e-kube-api-access-xc7xz\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270526 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270629 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270605 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-tmp\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.270975 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.270637 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vnf\" (UniqueName: \"kubernetes.io/projected/355a3baf-f84e-464e-90ff-4cf4165ace30-kube-api-access-s8vnf\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.271308 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"] Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.271335 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-jr9mc"] Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.271449 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.271797 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a3baf-f84e-464e-90ff-4cf4165ace30-config\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.271832 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.272419 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.772398179 +0000 UTC m=+36.458520542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.272695 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.272750 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.772735128 +0000 UTC m=+36.458857471 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found Mar 18 16:43:52.272905 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.272792 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:43:52.273910 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.272891 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.772872427 +0000 UTC m=+36.458994775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found Mar 18 16:43:52.274237 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.274196 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 18 16:43:52.274446 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.274429 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.274966 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.274918 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 18 16:43:52.285102 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.285079 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355a3baf-f84e-464e-90ff-4cf4165ace30-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.286817 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.286786 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 18 16:43:52.287130 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.287109 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.287390 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.287327 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4qrl8\"" Mar 18 16:43:52.287615 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.287599 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-cw5q6\"" Mar 18 16:43:52.287869 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.287833 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.288062 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.288044 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 18 16:43:52.288291 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.288249 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.293880 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.293772 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-default-certificate\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.295455 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.295437 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 18 16:43:52.297967 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.297903 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-stats-auth\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.300232 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.300208 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7xz\" (UniqueName: \"kubernetes.io/projected/de591805-36d1-4a59-a89f-2f9aca624e2e-kube-api-access-xc7xz\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:52.303483 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.303424 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5j2p\" (UniqueName: \"kubernetes.io/projected/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-kube-api-access-r5j2p\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:52.307747 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.307704 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbj2\" (UniqueName: 
\"kubernetes.io/projected/fee6fee0-2356-450e-9cf1-5d0c7cd03239-kube-api-access-bbbj2\") pod \"network-check-source-cc88fdd44-h6jqn\" (UID: \"fee6fee0-2356-450e-9cf1-5d0c7cd03239\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" Mar 18 16:43:52.312496 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.312453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vnf\" (UniqueName: \"kubernetes.io/projected/355a3baf-f84e-464e-90ff-4cf4165ace30-kube-api-access-s8vnf\") pod \"kube-storage-version-migrator-operator-866f46547-vtj99\" (UID: \"355a3baf-f84e-464e-90ff-4cf4165ace30\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.371575 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371538 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5ffa71b2-a86e-4662-adad-d3882c534d0f-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.371670 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371592 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjwz\" (UniqueName: \"kubernetes.io/projected/5ffa71b2-a86e-4662-adad-d3882c534d0f-kube-api-access-kdjwz\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.371670 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371624 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.371670 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371651 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-snapshots\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.371766 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371674 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-serving-cert\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.371766 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371741 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whlxd\" (UniqueName: \"kubernetes.io/projected/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-kube-api-access-whlxd\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.371828 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371788 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-trusted-ca\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.371828 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371813 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-config\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.371905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.371844 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lm7c\" (UniqueName: \"kubernetes.io/projected/70783292-1426-4298-9aae-22ccd0c24067-kube-api-access-7lm7c\") pod \"volume-data-source-validator-67fdcb5769-cwqdm\" (UID: \"70783292-1426-4298-9aae-22ccd0c24067\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" Mar 18 16:43:52.372487 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372464 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-snapshots\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.372599 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372571 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627f8af5-be81-426b-9540-c096c256323a-serving-cert\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.372658 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372629 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e56ca22-2821-4212-ad13-a7f1080372b4-nginx-conf\") pod 
\"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.372711 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372675 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.372756 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-tmp\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.372814 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372779 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199634f3-40e8-46c3-b2cc-89dff9148da4-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" Mar 18 16:43:52.372868 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372828 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.372918 ip-10-0-131-5 
kubenswrapper[2562]: I0318 16:43:52.372872 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5gs\" (UniqueName: \"kubernetes.io/projected/199634f3-40e8-46c3-b2cc-89dff9148da4-kube-api-access-rr5gs\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" Mar 18 16:43:52.372991 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.372921 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:43:52.372991 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.372927 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199634f3-40e8-46c3-b2cc-89dff9148da4-config\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" Mar 18 16:43:52.373095 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.373045 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.8729728 +0000 UTC m=+36.559095147 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found Mar 18 16:43:52.373095 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.373082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d566n\" (UniqueName: \"kubernetes.io/projected/627f8af5-be81-426b-9540-c096c256323a-kube-api-access-d566n\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.373192 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.373117 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-service-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.373510 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-tmp\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.373603 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.373659 2562 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.87364284 +0000 UTC m=+36.559765185 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.373693 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-service-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.374309 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3e56ca22-2821-4212-ad13-a7f1080372b4-nginx-conf\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:52.374702 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.374387 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" Mar 18 16:43:52.377354 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.377332 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-serving-cert\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.382790 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.382716 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lm7c\" (UniqueName: \"kubernetes.io/projected/70783292-1426-4298-9aae-22ccd0c24067-kube-api-access-7lm7c\") pod \"volume-data-source-validator-67fdcb5769-cwqdm\" (UID: \"70783292-1426-4298-9aae-22ccd0c24067\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" Mar 18 16:43:52.383525 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.383488 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjwz\" (UniqueName: \"kubernetes.io/projected/5ffa71b2-a86e-4662-adad-d3882c534d0f-kube-api-access-kdjwz\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.384352 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.384075 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5ffa71b2-a86e-4662-adad-d3882c534d0f-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:52.384352 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.384319 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlxd\" (UniqueName: \"kubernetes.io/projected/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-kube-api-access-whlxd\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.384754 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.384734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e9c12-8943-4e3a-b184-8b4e9b5c45c7-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-kpr46\" (UID: \"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7\") " pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.397096 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.396772 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" Mar 18 16:43:52.421514 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.420417 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" Mar 18 16:43:52.436978 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.436876 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d566n\" (UniqueName: \"kubernetes.io/projected/627f8af5-be81-426b-9540-c096c256323a-kube-api-access-d566n\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474494 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-trusted-ca\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474521 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-config\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627f8af5-be81-426b-9540-c096c256323a-serving-cert\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474642 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/199634f3-40e8-46c3-b2cc-89dff9148da4-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5gs\" (UniqueName: \"kubernetes.io/projected/199634f3-40e8-46c3-b2cc-89dff9148da4-kube-api-access-rr5gs\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.474733 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199634f3-40e8-46c3-b2cc-89dff9148da4-config\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.475341 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199634f3-40e8-46c3-b2cc-89dff9148da4-config\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.478732 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.476448 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-trusted-ca\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc"
Mar 18 16:43:52.481234 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.481178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627f8af5-be81-426b-9540-c096c256323a-config\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc"
Mar 18 16:43:52.498704 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.498629 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199634f3-40e8-46c3-b2cc-89dff9148da4-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.499031 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.498977 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/627f8af5-be81-426b-9540-c096c256323a-serving-cert\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc"
Mar 18 16:43:52.500781 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.500753 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5gs\" (UniqueName: \"kubernetes.io/projected/199634f3-40e8-46c3-b2cc-89dff9148da4-kube-api-access-rr5gs\") pod \"service-ca-operator-56f6f4cbcb-w2ghk\" (UID: \"199634f3-40e8-46c3-b2cc-89dff9148da4\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.503327 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.503283 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d566n\" (UniqueName: \"kubernetes.io/projected/627f8af5-be81-426b-9540-c096c256323a-kube-api-access-d566n\") pod \"console-operator-76b8565867-jr9mc\" (UID: \"627f8af5-be81-426b-9540-c096c256323a\") " pod="openshift-console-operator/console-operator-76b8565867-jr9mc"
Mar 18 16:43:52.606770 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:52.606648 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355a3baf_f84e_464e_90ff_4cf4165ace30.slice/crio-786aa03905dc7c80d38084c07f54ba5c0ab5d0a453f96be5512c3454119baebe WatchSource:0}: Error finding container 786aa03905dc7c80d38084c07f54ba5c0ab5d0a453f96be5512c3454119baebe: Status 404 returned error can't find the container with id 786aa03905dc7c80d38084c07f54ba5c0ab5d0a453f96be5512c3454119baebe
Mar 18 16:43:52.618892 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.618115 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99"]
Mar 18 16:43:52.631376 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.631301 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-jr9mc"
Mar 18 16:43:52.645968 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.642100 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"
Mar 18 16:43:52.680045 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.680014 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn"]
Mar 18 16:43:52.708284 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.708255 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm"]
Mar 18 16:43:52.721394 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.720433 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-kpr46"]
Mar 18 16:43:52.781349 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.781274 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:52.781349 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.781323 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"
Mar 18 16:43:52.781545 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.781368 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:52.781601 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.781554 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.781535796 +0000 UTC m=+37.467658151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt
Mar 18 16:43:52.781992 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.781972 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:43:52.782113 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.782029 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.782012255 +0000 UTC m=+37.468134611 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found
Mar 18 16:43:52.782113 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.782096 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:43:52.782229 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.782130 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.782117728 +0000 UTC m=+37.468240072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.883109 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.883286 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88"
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.883440 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.883502 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.883483068 +0000 UTC m=+37.569605414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.884050 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:52.884154 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:52.884137 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.884096877 +0000 UTC m=+37.570219221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:52.934805 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.934737 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk"]
Mar 18 16:43:52.940851 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:52.940819 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199634f3_40e8_46c3_b2cc_89dff9148da4.slice/crio-f6dc16d09c981eec153281c71d777d59f4506903e395e1f589a28e61fe32e449 WatchSource:0}: Error finding container f6dc16d09c981eec153281c71d777d59f4506903e395e1f589a28e61fe32e449: Status 404 returned error can't find the container with id f6dc16d09c981eec153281c71d777d59f4506903e395e1f589a28e61fe32e449
Mar 18 16:43:52.943730 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:52.943672 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-jr9mc"]
Mar 18 16:43:52.948281 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:43:52.948258 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod627f8af5_be81_426b_9540_c096c256323a.slice/crio-e08e68d95da3d630d2530c67b0fa6258002c11e6c8985cc8443bd8188b7d13f7 WatchSource:0}: Error finding container e08e68d95da3d630d2530c67b0fa6258002c11e6c8985cc8443bd8188b7d13f7: Status 404 returned error can't find the container with id e08e68d95da3d630d2530c67b0fa6258002c11e6c8985cc8443bd8188b7d13f7
Mar 18 16:43:53.172190 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.172129 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" event={"ID":"199634f3-40e8-46c3-b2cc-89dff9148da4","Type":"ContainerStarted","Data":"f6dc16d09c981eec153281c71d777d59f4506903e395e1f589a28e61fe32e449"}
Mar 18 16:43:53.175041 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.174980 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" event={"ID":"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7","Type":"ContainerStarted","Data":"d4093606618029532dc44845f029b85a3c287e47c5a08fcb4d9d36d8f64d51c9"}
Mar 18 16:43:53.177591 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.177529 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" event={"ID":"355a3baf-f84e-464e-90ff-4cf4165ace30","Type":"ContainerStarted","Data":"786aa03905dc7c80d38084c07f54ba5c0ab5d0a453f96be5512c3454119baebe"}
Mar 18 16:43:53.194904 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.194724 2562 generic.go:358] "Generic (PLEG): container finished" podID="dcfc6f29-52f0-4f09-b50d-f044f9886e51" containerID="e8edc013156ee71925338a38f5277514b5cdac681b08573baacf50106e598bf4" exitCode=0
Mar 18 16:43:53.194904 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.194827 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerDied","Data":"e8edc013156ee71925338a38f5277514b5cdac681b08573baacf50106e598bf4"}
Mar 18 16:43:53.200035 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.199970 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" event={"ID":"70783292-1426-4298-9aae-22ccd0c24067","Type":"ContainerStarted","Data":"ca97b19042adc79b9f3a4bec01dcc42679e0d6e503acb9a507b1edb471658ed6"}
Mar 18 16:43:53.201969 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.201913 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" event={"ID":"627f8af5-be81-426b-9540-c096c256323a","Type":"ContainerStarted","Data":"e08e68d95da3d630d2530c67b0fa6258002c11e6c8985cc8443bd8188b7d13f7"}
Mar 18 16:43:53.203666 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.203642 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" event={"ID":"fee6fee0-2356-450e-9cf1-5d0c7cd03239","Type":"ContainerStarted","Data":"7618c8da7bdbcb134c756ef416d31961251ab78cb2e1932e1ce1237f7726a846"}
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.488550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.488717 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.488990 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.489046 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:57.489028416 +0000 UTC m=+41.175150761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.489455 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.489471 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found
Mar 18 16:43:53.489542 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.489516 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:57.489500438 +0000 UTC m=+41.175622783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found
Mar 18 16:43:53.590593 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.590446 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:53.590754 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.590712 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:43:53.590835 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.590767 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. No retries permitted until 2026-03-18 16:43:57.590748882 +0000 UTC m=+41.276871225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found
Mar 18 16:43:53.792604 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.792495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:53.792604 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.792552 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"
Mar 18 16:43:53.792604 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.792599 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:53.792901 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.792875 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.792857521 +0000 UTC m=+39.478979888 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt
Mar 18 16:43:53.793314 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.793295 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:43:53.793406 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.793350 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.793335442 +0000 UTC m=+39.479457784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found
Mar 18 16:43:53.793471 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.793407 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:43:53.793471 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.793439 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.793428473 +0000 UTC m=+39.479550816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found
Mar 18 16:43:53.893959 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.893888 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88"
Mar 18 16:43:53.894139 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:53.894013 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"
Mar 18 16:43:53.894201 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.894159 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:53.894258 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.894221 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.894202773 +0000 UTC m=+39.580325128 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:53.894725 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.894632 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:43:53.894725 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:53.894692 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.894675583 +0000 UTC m=+39.580797930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found
Mar 18 16:43:54.245182 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:54.245142 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" event={"ID":"dcfc6f29-52f0-4f09-b50d-f044f9886e51","Type":"ContainerStarted","Data":"5de4279222b59d6dbab32e96160564965d2180d4f14e5c44c41094be51adbfc1"}
Mar 18 16:43:54.273970 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:54.270571 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qhxzz" podStartSLOduration=5.995898963 podStartE2EDuration="37.270546778s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:19.791695712 +0000 UTC m=+3.477818056" lastFinishedPulling="2026-03-18 16:43:51.06634353 +0000 UTC m=+34.752465871" observedRunningTime="2026-03-18 16:43:54.268257949 +0000 UTC m=+37.954380312" watchObservedRunningTime="2026-03-18 16:43:54.270546778 +0000 UTC m=+37.956669140"
Mar 18 16:43:55.814816 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:55.814777 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:55.814826 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:55.814869 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs"
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.814908 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.815003 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.814980651 +0000 UTC m=+43.501103007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.815065 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.815054501 +0000 UTC m=+43.501176846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.815121 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:43:55.815269 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.815162 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.815146998 +0000 UTC m=+43.501269341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found
Mar 18 16:43:55.916566 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:55.915815 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"
Mar 18 16:43:55.916566 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:55.916214 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88"
Mar 18 16:43:55.916566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.915981 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:55.916566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.916431 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.916412809 +0000 UTC m=+43.602535163 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:43:55.916890 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.916364 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:43:55.916890 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:55.916805 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.916788435 +0000 UTC m=+43.602910782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found
Mar 18 16:43:57.532076 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:57.532029 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:43:57.532076 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:57.532086 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf"
Mar 18 16:43:57.532588 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.532210 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:43:57.532588 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.532227 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found
Mar 18 16:43:57.532588 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.532300 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:43:57.532588 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.532308 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:05.532286717 +0000 UTC m=+49.218409068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found
Mar 18 16:43:57.532588 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.532389 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:05.532370124 +0000 UTC m=+49.218492464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found
Mar 18 16:43:57.633320 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:57.633290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z"
Mar 18 16:43:57.633483 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.633425 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:43:57.633529 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:57.633487 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. No retries permitted until 2026-03-18 16:44:05.633467417 +0000 UTC m=+49.319589759 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found Mar 18 16:43:59.852642 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:59.852602 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:59.852649 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:59.852698 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.852728 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.852789 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 
18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.852804 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.852783071 +0000 UTC m=+51.538905410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.852847 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.852834551 +0000 UTC m=+51.538956894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found Mar 18 16:43:59.853263 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.852869 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.852853007 +0000 UTC m=+51.538975356 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt Mar 18 16:43:59.953247 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:59.953207 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:43:59.953413 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.953348 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:43:59.953468 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.953421 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.953403506 +0000 UTC m=+51.639525851 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:43:59.953519 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:43:59.953496 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:43:59.953597 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.953583 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:43:59.953656 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:43:59.953619 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.953610844 +0000 UTC m=+51.639733184 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found Mar 18 16:44:01.568748 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:01.568715 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:44:01.571701 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:01.571680 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9a6bdc88-7df5-45fd-98ac-2f967cc2f192-original-pull-secret\") pod \"global-pull-secret-syncer-cck66\" (UID: \"9a6bdc88-7df5-45fd-98ac-2f967cc2f192\") " pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:44:01.704344 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:01.704319 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cck66" Mar 18 16:44:04.401684 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:04.401655 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cck66"] Mar 18 16:44:04.404384 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:04.404357 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6bdc88_7df5_45fd_98ac_2f967cc2f192.slice/crio-29bdaf7cbf3875865e53ce95aa1e545017eca440806ec31a9a3cd116d923fea8 WatchSource:0}: Error finding container 29bdaf7cbf3875865e53ce95aa1e545017eca440806ec31a9a3cd116d923fea8: Status 404 returned error can't find the container with id 29bdaf7cbf3875865e53ce95aa1e545017eca440806ec31a9a3cd116d923fea8 Mar 18 16:44:05.268873 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.268823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" event={"ID":"70783292-1426-4298-9aae-22ccd0c24067","Type":"ContainerStarted","Data":"ad189e8a8d79d85edfe010586eeb78982658894172cfd7f792567835a729e699"} Mar 18 16:44:05.270804 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.270725 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mnsnh" event={"ID":"473b4404-da3f-4c35-92c4-a69465dc3f06","Type":"ContainerStarted","Data":"8f21b7968202039f53cf709d0f611bb2beef5ac7fafc1a1a94d4145694aac472"} Mar 18 16:44:05.271088 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.271038 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:44:05.273279 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.273249 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/0.log" Mar 18 16:44:05.273380 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.273282 2562 generic.go:358] "Generic (PLEG): container finished" podID="627f8af5-be81-426b-9540-c096c256323a" containerID="ccd74b54b8acb6630ca55c7fd76ae10710c5efcbe936459eb20e9014ac233157" exitCode=255 Mar 18 16:44:05.273380 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.273335 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" event={"ID":"627f8af5-be81-426b-9540-c096c256323a","Type":"ContainerDied","Data":"ccd74b54b8acb6630ca55c7fd76ae10710c5efcbe936459eb20e9014ac233157"} Mar 18 16:44:05.273835 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.273599 2562 scope.go:117] "RemoveContainer" containerID="ccd74b54b8acb6630ca55c7fd76ae10710c5efcbe936459eb20e9014ac233157" Mar 18 16:44:05.275288 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.275262 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" event={"ID":"fee6fee0-2356-450e-9cf1-5d0c7cd03239","Type":"ContainerStarted","Data":"6a143de885d1ff20021161497cfb040ae0df362fd49908ff795fe707d052c92f"} Mar 18 16:44:05.276638 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.276615 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cck66" event={"ID":"9a6bdc88-7df5-45fd-98ac-2f967cc2f192","Type":"ContainerStarted","Data":"29bdaf7cbf3875865e53ce95aa1e545017eca440806ec31a9a3cd116d923fea8"} Mar 18 16:44:05.279141 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.279111 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" 
event={"ID":"199634f3-40e8-46c3-b2cc-89dff9148da4","Type":"ContainerStarted","Data":"b3c4b3fda8b7226bb323e24d9c6fac14a5b54260c1a49ee08e61a31da46c0638"} Mar 18 16:44:05.280845 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.280822 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" event={"ID":"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7","Type":"ContainerStarted","Data":"a59a2d21bef9c5439ac42ad5c93346e63dff1029c5747a41952930275912d195"} Mar 18 16:44:05.285397 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.283912 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" event={"ID":"736b2881-9edf-4b48-bf49-24dcdaf409a2","Type":"ContainerStarted","Data":"c98de3246abc9041656eabf7401880c2cbf9515633b5ff0c0bb5bd32c5d11789"} Mar 18 16:44:05.285397 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.283919 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-cwqdm" podStartSLOduration=1.7381378619999999 podStartE2EDuration="13.283905655s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.742309771 +0000 UTC m=+36.428432119" lastFinishedPulling="2026-03-18 16:44:04.288077563 +0000 UTC m=+47.974199912" observedRunningTime="2026-03-18 16:44:05.283475324 +0000 UTC m=+48.969597687" watchObservedRunningTime="2026-03-18 16:44:05.283905655 +0000 UTC m=+48.970028019" Mar 18 16:44:05.285397 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.284453 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:44:05.286184 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.286144 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" Mar 18 16:44:05.287216 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.287188 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" event={"ID":"355a3baf-f84e-464e-90ff-4cf4165ace30","Type":"ContainerStarted","Data":"150d03c9f60f0668187f02d27978d143193ec6376a9f3195308a321973583af1"} Mar 18 16:44:05.289302 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.289257 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerStarted","Data":"baa9628f6f7b61ed7937737405e7101291d6f72d6dcbe893d409696d1f5e32bb"} Mar 18 16:44:05.292109 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.292087 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" event={"ID":"72d8d7bb-cf42-421f-94be-19b1c6f8778b","Type":"ContainerStarted","Data":"aece125f825f79a24584d09ee660eb820549c57cef1162319570c40548842d0b"} Mar 18 16:44:05.304922 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.301831 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mnsnh" podStartSLOduration=35.058617892 podStartE2EDuration="48.301816883s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:43:51.044712342 +0000 UTC m=+34.730834698" lastFinishedPulling="2026-03-18 16:44:04.287911342 +0000 UTC m=+47.974033689" observedRunningTime="2026-03-18 16:44:05.30002333 +0000 UTC m=+48.986145695" watchObservedRunningTime="2026-03-18 16:44:05.301816883 +0000 UTC m=+48.987939249" Mar 18 16:44:05.320657 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.319577 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" podStartSLOduration=1.7752736470000001 podStartE2EDuration="13.319565248s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.742883125 +0000 UTC m=+36.429005470" lastFinishedPulling="2026-03-18 16:44:04.287174719 +0000 UTC m=+47.973297071" observedRunningTime="2026-03-18 16:44:05.318651802 +0000 UTC m=+49.004774165" watchObservedRunningTime="2026-03-18 16:44:05.319565248 +0000 UTC m=+49.005687611" Mar 18 16:44:05.370796 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.369541 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" podStartSLOduration=2.060710879 podStartE2EDuration="13.369526262s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.948713007 +0000 UTC m=+36.634835348" lastFinishedPulling="2026-03-18 16:44:04.25752837 +0000 UTC m=+47.943650731" observedRunningTime="2026-03-18 16:44:05.352181575 +0000 UTC m=+49.038303942" watchObservedRunningTime="2026-03-18 16:44:05.369526262 +0000 UTC m=+49.055648624" Mar 18 16:44:05.392078 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.391661 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c66b889bd-d58k4" podStartSLOduration=20.148811748 podStartE2EDuration="33.391646116s" podCreationTimestamp="2026-03-18 16:43:32 +0000 UTC" firstStartedPulling="2026-03-18 16:43:51.044726746 +0000 UTC m=+34.730849101" lastFinishedPulling="2026-03-18 16:44:04.287561112 +0000 UTC m=+47.973683469" observedRunningTime="2026-03-18 16:44:05.390209327 +0000 UTC m=+49.076331691" watchObservedRunningTime="2026-03-18 16:44:05.391646116 +0000 UTC m=+49.077768479" Mar 18 16:44:05.392078 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.391909 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-h6jqn" podStartSLOduration=1.8371941939999998 podStartE2EDuration="13.391902521s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.744447706 +0000 UTC m=+36.430570050" lastFinishedPulling="2026-03-18 16:44:04.299156021 +0000 UTC m=+47.985278377" observedRunningTime="2026-03-18 16:44:05.370999484 +0000 UTC m=+49.057121849" watchObservedRunningTime="2026-03-18 16:44:05.391902521 +0000 UTC m=+49.078024879" Mar 18 16:44:05.410709 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.410663 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" podStartSLOduration=2.733984998 podStartE2EDuration="14.410648168s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.611742676 +0000 UTC m=+36.297865016" lastFinishedPulling="2026-03-18 16:44:04.288405846 +0000 UTC m=+47.974528186" observedRunningTime="2026-03-18 16:44:05.409181559 +0000 UTC m=+49.095303922" watchObservedRunningTime="2026-03-18 16:44:05.410648168 +0000 UTC m=+49.096770530" Mar 18 16:44:05.428216 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.426669 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-85bd84d7cc-6q9bh" podStartSLOduration=20.18585751 podStartE2EDuration="33.426656904s" podCreationTimestamp="2026-03-18 16:43:32 +0000 UTC" firstStartedPulling="2026-03-18 16:43:51.044737478 +0000 UTC m=+34.730859834" lastFinishedPulling="2026-03-18 16:44:04.285536888 +0000 UTC m=+47.971659228" observedRunningTime="2026-03-18 16:44:05.426421223 +0000 UTC m=+49.112543585" watchObservedRunningTime="2026-03-18 16:44:05.426656904 +0000 UTC m=+49.112779268" Mar 18 16:44:05.604242 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.603824 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:44:05.604242 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.603874 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:44:05.604242 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.604068 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:05.604242 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.604084 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84f6fbc954-l59hf: secret "image-registry-tls" not found Mar 18 16:44:05.605106 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.604546 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls podName:678c1a16-a395-4d4a-b4f6-a2ac6e1870e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:21.604123404 +0000 UTC m=+65.290245761 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls") pod "image-registry-84f6fbc954-l59hf" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8") : secret "image-registry-tls" not found Mar 18 16:44:05.605106 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.605036 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:05.605106 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.605084 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert podName:4e96119f-c454-4e76-a758-9be470a94be8 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:21.605069598 +0000 UTC m=+65.291191944 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert") pod "ingress-canary-t8ccf" (UID: "4e96119f-c454-4e76-a758-9be470a94be8") : secret "canary-serving-cert" not found Mar 18 16:44:05.708956 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:05.706564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:05.708956 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.706790 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:05.708956 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:05.706841 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls podName:3a05d5d2-a005-4847-b2aa-0f78e327686a nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:21.706825316 +0000 UTC m=+65.392947660 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls") pod "dns-default-xwx7z" (UID: "3a05d5d2-a005-4847-b2aa-0f78e327686a") : secret "dns-default-metrics-tls" not found Mar 18 16:44:06.297273 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.297199 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log" Mar 18 16:44:06.297924 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.297902 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/0.log" Mar 18 16:44:06.298059 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.297965 2562 generic.go:358] "Generic (PLEG): container finished" podID="627f8af5-be81-426b-9540-c096c256323a" containerID="ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3" exitCode=255 Mar 18 16:44:06.299286 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.299259 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" event={"ID":"627f8af5-be81-426b-9540-c096c256323a","Type":"ContainerDied","Data":"ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3"} Mar 18 16:44:06.299410 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.299303 2562 scope.go:117] "RemoveContainer" containerID="ccd74b54b8acb6630ca55c7fd76ae10710c5efcbe936459eb20e9014ac233157" Mar 18 16:44:06.299613 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:06.299550 2562 scope.go:117] "RemoveContainer" containerID="ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3" Mar 18 16:44:06.299960 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:06.299735 
2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-jr9mc_openshift-console-operator(627f8af5-be81-426b-9540-c096c256323a)\"" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" podUID="627f8af5-be81-426b-9540-c096c256323a" Mar 18 16:44:07.301897 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.301876 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log" Mar 18 16:44:07.302570 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.302550 2562 scope.go:117] "RemoveContainer" containerID="ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3" Mar 18 16:44:07.302784 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.302760 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-jr9mc_openshift-console-operator(627f8af5-be81-426b-9540-c096c256323a)\"" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" podUID="627f8af5-be81-426b-9540-c096c256323a" Mar 18 16:44:07.928333 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.928299 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:07.928333 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.928335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:44:07.928566 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.928363 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:07.928566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.928451 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:44:07.928566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.928475 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.928462816 +0000 UTC m=+67.614585155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : configmap references non-existent config key: service-ca.crt Mar 18 16:44:07.928566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.928510 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs podName:bdf9649a-73e2-4d1b-9ba7-b03cc47f8426 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:23.928494952 +0000 UTC m=+67.614617316 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs") pod "router-default-745d85b678-r6ghs" (UID: "bdf9649a-73e2-4d1b-9ba7-b03cc47f8426") : secret "router-metrics-certs-default" not found Mar 18 16:44:07.928566 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.928554 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:44:07.928730 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:07.928621 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls podName:de591805-36d1-4a59-a89f-2f9aca624e2e nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.928609686 +0000 UTC m=+67.614732026 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-rn2zb" (UID: "de591805-36d1-4a59-a89f-2f9aca624e2e") : secret "samples-operator-tls" not found Mar 18 16:44:07.986022 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:07.985995 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zd7qk_184966df-7341-412a-aede-a32364efc520/dns-node-resolver/0.log" Mar 18 16:44:08.029738 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.029715 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:44:08.029878 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:08.029860 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:44:08.029878 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.029870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:44:08.030021 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:08.029916 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert podName:3e56ca22-2821-4212-ad13-a7f1080372b4 
nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.029900455 +0000 UTC m=+67.716022802 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-nns88" (UID: "3e56ca22-2821-4212-ad13-a7f1080372b4") : secret "networking-console-plugin-cert" not found Mar 18 16:44:08.030021 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:08.029987 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:44:08.030129 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:08.030038 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls podName:5ffa71b2-a86e-4662-adad-d3882c534d0f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.03002401 +0000 UTC m=+67.716146365 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-f6w67" (UID: "5ffa71b2-a86e-4662-adad-d3882c534d0f") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:44:08.306526 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.306453 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerStarted","Data":"62116d024046bab550f80dbb9366f9306c2a36bdba11c9ffa69795ab610bf088"} Mar 18 16:44:08.306526 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.306484 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerStarted","Data":"72e46fe91c02e9705c71be6207dd4f737a23057df1a5355046b69f5cb7bceef1"} Mar 18 16:44:08.326372 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.326330 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" podStartSLOduration=20.081727625 podStartE2EDuration="36.326318969s" podCreationTimestamp="2026-03-18 16:43:32 +0000 UTC" firstStartedPulling="2026-03-18 16:43:51.044711676 +0000 UTC m=+34.730834021" lastFinishedPulling="2026-03-18 16:44:07.289302992 +0000 UTC m=+50.975425365" observedRunningTime="2026-03-18 16:44:08.326118081 +0000 UTC m=+52.012240444" watchObservedRunningTime="2026-03-18 16:44:08.326318969 +0000 UTC m=+52.012441330" Mar 18 16:44:08.585893 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:08.585815 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9t8zk_abad6a1d-3e28-4f96-90e8-383ed7b5b8e1/node-ca/0.log" Mar 18 
16:44:11.315907 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:11.315863 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cck66" event={"ID":"9a6bdc88-7df5-45fd-98ac-2f967cc2f192","Type":"ContainerStarted","Data":"d89f598f1e619812ba354ff9574e06dfed1a140640090871e36f5990240d7afe"} Mar 18 16:44:11.331560 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:11.331517 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cck66" podStartSLOduration=36.568043642 podStartE2EDuration="42.331504414s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:44:04.406006699 +0000 UTC m=+48.092129040" lastFinishedPulling="2026-03-18 16:44:10.169467468 +0000 UTC m=+53.855589812" observedRunningTime="2026-03-18 16:44:11.330980189 +0000 UTC m=+55.017102553" watchObservedRunningTime="2026-03-18 16:44:11.331504414 +0000 UTC m=+55.017626816" Mar 18 16:44:12.632394 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:12.632363 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:44:12.632394 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:12.632399 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:44:12.632773 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:12.632708 2562 scope.go:117] "RemoveContainer" containerID="ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3" Mar 18 16:44:12.632869 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:44:12.632850 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-76b8565867-jr9mc_openshift-console-operator(627f8af5-be81-426b-9540-c096c256323a)\"" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" podUID="627f8af5-be81-426b-9540-c096c256323a" Mar 18 16:44:21.639149 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.639119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:44:21.639149 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.639154 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:44:21.641475 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.641450 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"image-registry-84f6fbc954-l59hf\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") " pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:44:21.641592 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.641455 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e96119f-c454-4e76-a758-9be470a94be8-cert\") pod \"ingress-canary-t8ccf\" (UID: \"4e96119f-c454-4e76-a758-9be470a94be8\") " pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:44:21.739636 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.739610 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:21.741465 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.741445 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a05d5d2-a005-4847-b2aa-0f78e327686a-metrics-tls\") pod \"dns-default-xwx7z\" (UID: \"3a05d5d2-a005-4847-b2aa-0f78e327686a\") " pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:21.814892 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.814872 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-27ljg\"" Mar 18 16:44:21.823329 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.823311 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:44:21.893996 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.888241 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkmdp\"" Mar 18 16:44:21.901219 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.899406 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t8ccf" Mar 18 16:44:21.901219 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.899960 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bwt6\"" Mar 18 16:44:21.901533 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.901294 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:21.961818 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:21.961784 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"] Mar 18 16:44:21.967402 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:21.967163 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678c1a16_a395_4d4a_b4f6_a2ac6e1870e8.slice/crio-ef347efd62d77357eb164186c6d09f909253a942f7ccc26ec7781d2dbff03038 WatchSource:0}: Error finding container ef347efd62d77357eb164186c6d09f909253a942f7ccc26ec7781d2dbff03038: Status 404 returned error can't find the container with id ef347efd62d77357eb164186c6d09f909253a942f7ccc26ec7781d2dbff03038 Mar 18 16:44:22.028674 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.028609 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xwx7z"] Mar 18 16:44:22.031605 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:22.031576 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a05d5d2_a005_4847_b2aa_0f78e327686a.slice/crio-cfc90de09b700f97fa032bc5a60322cdc6e076cb970d749172e1453bca911be9 WatchSource:0}: Error finding container cfc90de09b700f97fa032bc5a60322cdc6e076cb970d749172e1453bca911be9: Status 404 returned error can't find the container with id cfc90de09b700f97fa032bc5a60322cdc6e076cb970d749172e1453bca911be9 Mar 18 16:44:22.055344 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.055323 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t8ccf"] Mar 18 16:44:22.058487 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:22.058450 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e96119f_c454_4e76_a758_9be470a94be8.slice/crio-8da2e19fd27e73b368d87026b8e4d3a2038a3f4b9ed3c95778444e9d9b242803 WatchSource:0}: Error finding container 8da2e19fd27e73b368d87026b8e4d3a2038a3f4b9ed3c95778444e9d9b242803: Status 404 returned error can't find the container with id 8da2e19fd27e73b368d87026b8e4d3a2038a3f4b9ed3c95778444e9d9b242803 Mar 18 16:44:22.360414 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.360384 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" event={"ID":"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8","Type":"ContainerStarted","Data":"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"} Mar 18 16:44:22.360597 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.360423 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" event={"ID":"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8","Type":"ContainerStarted","Data":"ef347efd62d77357eb164186c6d09f909253a942f7ccc26ec7781d2dbff03038"} Mar 18 16:44:22.360597 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.360482 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" Mar 18 16:44:22.361516 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.361494 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xwx7z" event={"ID":"3a05d5d2-a005-4847-b2aa-0f78e327686a","Type":"ContainerStarted","Data":"cfc90de09b700f97fa032bc5a60322cdc6e076cb970d749172e1453bca911be9"} Mar 18 16:44:22.362457 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.362435 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t8ccf" event={"ID":"4e96119f-c454-4e76-a758-9be470a94be8","Type":"ContainerStarted","Data":"8da2e19fd27e73b368d87026b8e4d3a2038a3f4b9ed3c95778444e9d9b242803"} Mar 
18 16:44:22.381635 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.381601 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" podStartSLOduration=65.381590275 podStartE2EDuration="1m5.381590275s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:22.380352902 +0000 UTC m=+66.066475274" watchObservedRunningTime="2026-03-18 16:44:22.381590275 +0000 UTC m=+66.067712636" Mar 18 16:44:22.750034 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.749971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:44:22.752363 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:22.752339 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008308d5-c00b-40b2-a413-eb4caebc48c0-metrics-certs\") pod \"network-metrics-daemon-dlnjg\" (UID: \"008308d5-c00b-40b2-a413-eb4caebc48c0\") " pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:44:23.000430 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.000355 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\"" Mar 18 16:44:23.009007 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.008975 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dlnjg" Mar 18 16:44:23.177321 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.177290 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dlnjg"] Mar 18 16:44:23.181023 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:23.180927 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008308d5_c00b_40b2_a413_eb4caebc48c0.slice/crio-604a1091f78f7bf0dd7fcb3b4f94925db53f4993eb49b5539be25741043fd4ae WatchSource:0}: Error finding container 604a1091f78f7bf0dd7fcb3b4f94925db53f4993eb49b5539be25741043fd4ae: Status 404 returned error can't find the container with id 604a1091f78f7bf0dd7fcb3b4f94925db53f4993eb49b5539be25741043fd4ae Mar 18 16:44:23.368320 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.368277 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlnjg" event={"ID":"008308d5-c00b-40b2-a413-eb4caebc48c0","Type":"ContainerStarted","Data":"604a1091f78f7bf0dd7fcb3b4f94925db53f4993eb49b5539be25741043fd4ae"} Mar 18 16:44:23.959073 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.959032 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:23.959497 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.959078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:44:23.959497 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.959115 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:23.959830 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.959804 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-service-ca-bundle\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:23.961720 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.961692 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/de591805-36d1-4a59-a89f-2f9aca624e2e-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-rn2zb\" (UID: \"de591805-36d1-4a59-a89f-2f9aca624e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:44:23.961815 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:23.961743 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bdf9649a-73e2-4d1b-9ba7-b03cc47f8426-metrics-certs\") pod \"router-default-745d85b678-r6ghs\" (UID: \"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426\") " pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:24.060101 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.060069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:44:24.060251 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.060127 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:44:24.062773 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.062738 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ffa71b2-a86e-4662-adad-d3882c534d0f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-f6w67\" (UID: \"5ffa71b2-a86e-4662-adad-d3882c534d0f\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:44:24.062897 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.062853 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3e56ca22-2821-4212-ad13-a7f1080372b4-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-nns88\" (UID: \"3e56ca22-2821-4212-ad13-a7f1080372b4\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:44:24.163876 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.163841 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x49nm\"" Mar 18 16:44:24.172419 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:44:24.172387 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" Mar 18 16:44:24.185091 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.185072 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-ww6vh\"" Mar 18 16:44:24.193456 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.193431 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:24.254124 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.254037 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qfskh\"" Mar 18 16:44:24.262715 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.262688 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" Mar 18 16:44:24.273408 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.273380 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-chmrg\"" Mar 18 16:44:24.282188 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:24.282168 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" Mar 18 16:44:25.001789 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.001758 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb"] Mar 18 16:44:25.023540 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.023493 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-nns88"] Mar 18 16:44:25.053657 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.053635 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-745d85b678-r6ghs"] Mar 18 16:44:25.220648 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.220617 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67"] Mar 18 16:44:25.375395 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.375359 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-745d85b678-r6ghs" event={"ID":"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426","Type":"ContainerStarted","Data":"2c0546eb8d1a3e2b81419c0f3acd49dcb56faf78402be5222334929dbd76b091"} Mar 18 16:44:25.375575 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.375403 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-745d85b678-r6ghs" event={"ID":"bdf9649a-73e2-4d1b-9ba7-b03cc47f8426","Type":"ContainerStarted","Data":"1148d4cdc35103b584c2a9b9c3e5cd8fd8ab9cb889c88be635ce4081697d4676"} Mar 18 16:44:25.376906 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.376876 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" event={"ID":"3e56ca22-2821-4212-ad13-a7f1080372b4","Type":"ContainerStarted","Data":"6296641cbafe879026c756dd1fc6ddfb9f5d12660e0e490b41bd7474b7b00f0c"} Mar 18 16:44:25.378294 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.378271 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xwx7z" event={"ID":"3a05d5d2-a005-4847-b2aa-0f78e327686a","Type":"ContainerStarted","Data":"e8a3058e46ea8cdc60e484e61f79003bb7b19402674ab28d2da1f5456c074362"} Mar 18 16:44:25.379707 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.379684 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t8ccf" event={"ID":"4e96119f-c454-4e76-a758-9be470a94be8","Type":"ContainerStarted","Data":"f2e5e7e0ba2ec83fb1eb8310fa437bfa509089e73303da88929cd6837a4ec52e"} Mar 18 16:44:25.380763 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.380742 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" event={"ID":"de591805-36d1-4a59-a89f-2f9aca624e2e","Type":"ContainerStarted","Data":"25042d938cd00c8048b5be2910a7cda86c13d1df4a104a676056d3ab8e0ca395"} Mar 18 16:44:25.398773 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.398718 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-745d85b678-r6ghs" podStartSLOduration=34.398702899 podStartE2EDuration="34.398702899s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:25.396910013 +0000 UTC m=+69.083032395" watchObservedRunningTime="2026-03-18 16:44:25.398702899 +0000 UTC m=+69.084825262" Mar 18 16:44:25.412905 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:25.412858 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t8ccf" podStartSLOduration=33.641048739 podStartE2EDuration="36.412842313s" podCreationTimestamp="2026-03-18 16:43:49 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.060601511 +0000 UTC 
m=+65.746723851" lastFinishedPulling="2026-03-18 16:44:24.832395083 +0000 UTC m=+68.518517425" observedRunningTime="2026-03-18 16:44:25.411230411 +0000 UTC m=+69.097352775" watchObservedRunningTime="2026-03-18 16:44:25.412842313 +0000 UTC m=+69.098964741" Mar 18 16:44:25.463472 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:25.463448 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ffa71b2_a86e_4662_adad_d3882c534d0f.slice/crio-59a1e5c056e947a7ab7ee7c910b4c913f23304023eb5532508699fdcff2c3c48 WatchSource:0}: Error finding container 59a1e5c056e947a7ab7ee7c910b4c913f23304023eb5532508699fdcff2c3c48: Status 404 returned error can't find the container with id 59a1e5c056e947a7ab7ee7c910b4c913f23304023eb5532508699fdcff2c3c48 Mar 18 16:44:26.193723 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.193667 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:26.196493 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.196466 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:26.386058 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.385988 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xwx7z" event={"ID":"3a05d5d2-a005-4847-b2aa-0f78e327686a","Type":"ContainerStarted","Data":"ae66e0ac5ccda7885ba8cf18e8f8c8ddc91dd3826c0875c0cc58410e04687d30"} Mar 18 16:44:26.386058 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.386039 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:26.387074 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.387048 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" 
event={"ID":"5ffa71b2-a86e-4662-adad-d3882c534d0f","Type":"ContainerStarted","Data":"59a1e5c056e947a7ab7ee7c910b4c913f23304023eb5532508699fdcff2c3c48"} Mar 18 16:44:26.389151 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.389110 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlnjg" event={"ID":"008308d5-c00b-40b2-a413-eb4caebc48c0","Type":"ContainerStarted","Data":"58ca70ecd8b20abe84f400ca1d20a2b11de8932faa1ef41e6acc501ab36a618d"} Mar 18 16:44:26.389151 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.389146 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dlnjg" event={"ID":"008308d5-c00b-40b2-a413-eb4caebc48c0","Type":"ContainerStarted","Data":"33d4d190fc8c84687df9bafa8b5505997301db74d4dcd4264c2ab15b9144c77c"} Mar 18 16:44:26.389324 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.389309 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:26.390575 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.390557 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-745d85b678-r6ghs" Mar 18 16:44:26.402654 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.402609 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xwx7z" podStartSLOduration=34.606052126 podStartE2EDuration="37.402597693s" podCreationTimestamp="2026-03-18 16:43:49 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.033614215 +0000 UTC m=+65.719736561" lastFinishedPulling="2026-03-18 16:44:24.830159774 +0000 UTC m=+68.516282128" observedRunningTime="2026-03-18 16:44:26.402283203 +0000 UTC m=+70.088405565" watchObservedRunningTime="2026-03-18 16:44:26.402597693 +0000 UTC m=+70.088720054" Mar 18 16:44:26.417907 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.417859 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dlnjg" podStartSLOduration=67.036753976 podStartE2EDuration="1m9.417844466s" podCreationTimestamp="2026-03-18 16:43:17 +0000 UTC" firstStartedPulling="2026-03-18 16:44:23.183515077 +0000 UTC m=+66.869637423" lastFinishedPulling="2026-03-18 16:44:25.56460557 +0000 UTC m=+69.250727913" observedRunningTime="2026-03-18 16:44:26.416438789 +0000 UTC m=+70.102561151" watchObservedRunningTime="2026-03-18 16:44:26.417844466 +0000 UTC m=+70.103966830" Mar 18 16:44:26.983878 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:26.983848 2562 scope.go:117] "RemoveContainer" containerID="ddbdffeafd174bc7cce9b55dc22bfb29fb260d7a8d47d828a44061e251317aa3" Mar 18 16:44:27.393872 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:27.393834 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" event={"ID":"3e56ca22-2821-4212-ad13-a7f1080372b4","Type":"ContainerStarted","Data":"12aa3e0d0654aae68df4d63cc69f46c99a502030128ec3d378760aefeac375b8"} Mar 18 16:44:27.409219 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:27.409160 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-55b77584bb-nns88" podStartSLOduration=33.85908761 podStartE2EDuration="35.409142996s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:44:25.04275222 +0000 UTC m=+68.728874573" lastFinishedPulling="2026-03-18 16:44:26.59280762 +0000 UTC m=+70.278929959" observedRunningTime="2026-03-18 16:44:27.407989493 +0000 UTC m=+71.094111856" watchObservedRunningTime="2026-03-18 16:44:27.409142996 +0000 UTC m=+71.095265359" Mar 18 16:44:28.398445 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.398415 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" 
event={"ID":"5ffa71b2-a86e-4662-adad-d3882c534d0f","Type":"ContainerStarted","Data":"1a4e0c3540ac8504e341e0b6371eff25dd73e23a6f4512f9d3025a1f767c52b2"} Mar 18 16:44:28.408124 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.408096 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log" Mar 18 16:44:28.408227 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.408186 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" event={"ID":"627f8af5-be81-426b-9540-c096c256323a","Type":"ContainerStarted","Data":"480fc2b3cf2bd2cf141f254f8e45eb6399e2c537a5861e56030571e074471e4e"} Mar 18 16:44:28.408842 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.408558 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:44:28.410989 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.410481 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" event={"ID":"de591805-36d1-4a59-a89f-2f9aca624e2e","Type":"ContainerStarted","Data":"908a7881b4df83a105147b47bdf715a7b6c97a8a10f7f6a761cb79783f5bb84f"} Mar 18 16:44:28.410989 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.410509 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" event={"ID":"de591805-36d1-4a59-a89f-2f9aca624e2e","Type":"ContainerStarted","Data":"391eea353b00dc6da13793cf834dcddd1e4826c662d18065a4bf370eff686211"} Mar 18 16:44:28.420125 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.420093 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-f6w67" podStartSLOduration=33.762597629 
podStartE2EDuration="36.420082219s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:44:25.465345069 +0000 UTC m=+69.151467410" lastFinishedPulling="2026-03-18 16:44:28.122829659 +0000 UTC m=+71.808952000" observedRunningTime="2026-03-18 16:44:28.419405384 +0000 UTC m=+72.105527746" watchObservedRunningTime="2026-03-18 16:44:28.420082219 +0000 UTC m=+72.106204622" Mar 18 16:44:28.438562 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.438518 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" podStartSLOduration=25.09932415 podStartE2EDuration="36.438502679s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.951644181 +0000 UTC m=+36.637766530" lastFinishedPulling="2026-03-18 16:44:04.290822521 +0000 UTC m=+47.976945059" observedRunningTime="2026-03-18 16:44:28.43723002 +0000 UTC m=+72.123352418" watchObservedRunningTime="2026-03-18 16:44:28.438502679 +0000 UTC m=+72.124625043" Mar 18 16:44:28.456286 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.456247 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-rn2zb" podStartSLOduration=34.421800554 podStartE2EDuration="37.456234065s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:44:25.087308325 +0000 UTC m=+68.773430665" lastFinishedPulling="2026-03-18 16:44:28.121741798 +0000 UTC m=+71.807864176" observedRunningTime="2026-03-18 16:44:28.455687586 +0000 UTC m=+72.141809945" watchObservedRunningTime="2026-03-18 16:44:28.456234065 +0000 UTC m=+72.142356427" Mar 18 16:44:28.614367 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:28.614292 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b8565867-jr9mc" Mar 18 16:44:31.867851 ip-10-0-131-5 
kubenswrapper[2562]: I0318 16:44:31.867819 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5km25"] Mar 18 16:44:31.871301 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:31.871283 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:31.874610 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:31.874590 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:44:31.875128 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:31.875110 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2hh\"" Mar 18 16:44:31.876683 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:31.876667 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:44:31.889325 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:31.889306 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5km25"] Mar 18 16:44:32.022083 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.022058 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/593cc8a3-494f-4641-bca3-e7c9b8a54d76-crio-socket\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.022192 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.022100 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lpt\" (UniqueName: \"kubernetes.io/projected/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-api-access-99lpt\") pod \"insights-runtime-extractor-5km25\" 
(UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.022192 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.022122 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/593cc8a3-494f-4641-bca3-e7c9b8a54d76-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.022264 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.022198 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.022264 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.022240 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/593cc8a3-494f-4641-bca3-e7c9b8a54d76-data-volume\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123296 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123271 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/593cc8a3-494f-4641-bca3-e7c9b8a54d76-crio-socket\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123423 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123316 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-99lpt\" (UniqueName: \"kubernetes.io/projected/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-api-access-99lpt\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123423 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123339 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/593cc8a3-494f-4641-bca3-e7c9b8a54d76-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123423 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123376 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123423 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123400 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/593cc8a3-494f-4641-bca3-e7c9b8a54d76-data-volume\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123423 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123401 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/593cc8a3-494f-4641-bca3-e7c9b8a54d76-crio-socket\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " 
pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123799 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123781 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/593cc8a3-494f-4641-bca3-e7c9b8a54d76-data-volume\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.123891 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.123876 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.125760 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.125741 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/593cc8a3-494f-4641-bca3-e7c9b8a54d76-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.133750 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.133728 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lpt\" (UniqueName: \"kubernetes.io/projected/593cc8a3-494f-4641-bca3-e7c9b8a54d76-kube-api-access-99lpt\") pod \"insights-runtime-extractor-5km25\" (UID: \"593cc8a3-494f-4641-bca3-e7c9b8a54d76\") " pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.180972 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.180926 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5km25" Mar 18 16:44:32.300931 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.300901 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5km25"] Mar 18 16:44:32.303095 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:32.303067 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593cc8a3_494f_4641_bca3_e7c9b8a54d76.slice/crio-09c1c10248228ce06fa442508c37daf923488563717a88a3074ffdcb4a2658b5 WatchSource:0}: Error finding container 09c1c10248228ce06fa442508c37daf923488563717a88a3074ffdcb4a2658b5: Status 404 returned error can't find the container with id 09c1c10248228ce06fa442508c37daf923488563717a88a3074ffdcb4a2658b5 Mar 18 16:44:32.423482 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.423425 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5km25" event={"ID":"593cc8a3-494f-4641-bca3-e7c9b8a54d76","Type":"ContainerStarted","Data":"8c1bcd667b048ad5f7f8a376ea7f91c8e992b72a105945cf10619da910b5a73f"} Mar 18 16:44:32.423482 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:32.423458 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5km25" event={"ID":"593cc8a3-494f-4641-bca3-e7c9b8a54d76","Type":"ContainerStarted","Data":"09c1c10248228ce06fa442508c37daf923488563717a88a3074ffdcb4a2658b5"} Mar 18 16:44:33.427531 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:33.427501 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5km25" event={"ID":"593cc8a3-494f-4641-bca3-e7c9b8a54d76","Type":"ContainerStarted","Data":"5ef14ce7fd93e47db48975203b7387f193ec91f7dc72c015a7c65441108e6631"} Mar 18 16:44:35.435338 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:35.435297 2562 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-runtime-extractor-5km25" event={"ID":"593cc8a3-494f-4641-bca3-e7c9b8a54d76","Type":"ContainerStarted","Data":"ac7c16e21e5a281fe8a50e7b09c6c0d90cdbaf22d31093579f0f0fede2482265"} Mar 18 16:44:35.452998 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:35.452918 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5km25" podStartSLOduration=2.2716533070000002 podStartE2EDuration="4.452902837s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" firstStartedPulling="2026-03-18 16:44:32.365895212 +0000 UTC m=+76.052017552" lastFinishedPulling="2026-03-18 16:44:34.547144737 +0000 UTC m=+78.233267082" observedRunningTime="2026-03-18 16:44:35.45209716 +0000 UTC m=+79.138219560" watchObservedRunningTime="2026-03-18 16:44:35.452902837 +0000 UTC m=+79.139025200" Mar 18 16:44:36.301393 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:36.301366 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mnsnh" Mar 18 16:44:36.396177 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:36.396153 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xwx7z" Mar 18 16:44:41.827711 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:41.827671 2562 patch_prober.go:28] interesting pod/image-registry-84f6fbc954-l59hf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Mar 18 16:44:41.828117 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:41.827726 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerName="registry" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Mar 18 16:44:43.112147 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.112118 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-496n8"] Mar 18 16:44:43.117148 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.117123 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.121060 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.121038 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:44:43.121409 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.121386 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t8t9n\"" Mar 18 16:44:43.121516 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.121501 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:44:43.121663 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.121647 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:44:43.121907 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.121890 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:44:43.198209 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198184 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-root\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198306 ip-10-0-131-5 kubenswrapper[2562]: I0318 
16:44:43.198211 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198306 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198233 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-tls\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198306 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198251 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-textfile\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198405 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198319 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-sys\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198405 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-wtmp\") pod \"node-exporter-496n8\" (UID: 
\"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198405 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198373 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggp9\" (UniqueName: \"kubernetes.io/projected/4b48fdb0-ba43-4916-b399-e8b0ea55e494-kube-api-access-bggp9\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198499 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198415 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-accelerators-collector-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.198499 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.198446 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-metrics-client-ca\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299006 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.298982 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-accelerators-collector-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299019 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-metrics-client-ca\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299039 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-root\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299057 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299086 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-tls\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299094 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-root\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299104 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-textfile\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299164 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299154 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-sys\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299199 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-sys\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299260 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-wtmp\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bggp9\" (UniqueName: \"kubernetes.io/projected/4b48fdb0-ba43-4916-b399-e8b0ea55e494-kube-api-access-bggp9\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299404 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-textfile\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299459 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299446 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-wtmp\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299704 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299567 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-metrics-client-ca\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.299704 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.299650 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-accelerators-collector-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.301204 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.301180 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8" Mar 18 16:44:43.301329 
ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.301312 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b48fdb0-ba43-4916-b399-e8b0ea55e494-node-exporter-tls\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8"
Mar 18 16:44:43.309653 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.309632 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggp9\" (UniqueName: \"kubernetes.io/projected/4b48fdb0-ba43-4916-b399-e8b0ea55e494-kube-api-access-bggp9\") pod \"node-exporter-496n8\" (UID: \"4b48fdb0-ba43-4916-b399-e8b0ea55e494\") " pod="openshift-monitoring/node-exporter-496n8"
Mar 18 16:44:43.373245 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.372906 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:44:43.427395 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.427372 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-496n8"
Mar 18 16:44:43.437192 ip-10-0-131-5 kubenswrapper[2562]: W0318 16:44:43.437163 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b48fdb0_ba43_4916_b399_e8b0ea55e494.slice/crio-0ae5e37fcdaa2d0cb2b654fdf0433822ddc67b1e2b5b40ec75551bec5c84b84e WatchSource:0}: Error finding container 0ae5e37fcdaa2d0cb2b654fdf0433822ddc67b1e2b5b40ec75551bec5c84b84e: Status 404 returned error can't find the container with id 0ae5e37fcdaa2d0cb2b654fdf0433822ddc67b1e2b5b40ec75551bec5c84b84e
Mar 18 16:44:43.457504 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:43.457473 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-496n8" event={"ID":"4b48fdb0-ba43-4916-b399-e8b0ea55e494","Type":"ContainerStarted","Data":"0ae5e37fcdaa2d0cb2b654fdf0433822ddc67b1e2b5b40ec75551bec5c84b84e"}
Mar 18 16:44:44.461433 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:44.461403 2562 generic.go:358] "Generic (PLEG): container finished" podID="4b48fdb0-ba43-4916-b399-e8b0ea55e494" containerID="f648f2a058751e08781760cd1b225efbeaf00c2b1e4dcac71aabf0f73e0ef401" exitCode=0
Mar 18 16:44:44.461814 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:44.461447 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-496n8" event={"ID":"4b48fdb0-ba43-4916-b399-e8b0ea55e494","Type":"ContainerDied","Data":"f648f2a058751e08781760cd1b225efbeaf00c2b1e4dcac71aabf0f73e0ef401"}
Mar 18 16:44:45.466660 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:45.466625 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-496n8" event={"ID":"4b48fdb0-ba43-4916-b399-e8b0ea55e494","Type":"ContainerStarted","Data":"c572db3f43b62560c880ff0a6a9e067c22f9d67008442c09a805e1b922d5680b"}
Mar 18 16:44:45.466660 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:45.466656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-496n8" event={"ID":"4b48fdb0-ba43-4916-b399-e8b0ea55e494","Type":"ContainerStarted","Data":"04b67660692a5aba863523cf4ce9f56cc2a7714d403c132ed145798ea2d764e7"}
Mar 18 16:44:45.489188 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:45.489144 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-496n8" podStartSLOduration=1.68502482 podStartE2EDuration="2.489129268s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="2026-03-18 16:44:43.439554769 +0000 UTC m=+87.125677116" lastFinishedPulling="2026-03-18 16:44:44.243659218 +0000 UTC m=+87.929781564" observedRunningTime="2026-03-18 16:44:45.488402671 +0000 UTC m=+89.174525032" watchObservedRunningTime="2026-03-18 16:44:45.489129268 +0000 UTC m=+89.175251612"
Mar 18 16:44:53.983203 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:44:53.983130 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"]
Mar 18 16:45:16.557288 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:16.557253 2562 generic.go:358] "Generic (PLEG): container finished" podID="355a3baf-f84e-464e-90ff-4cf4165ace30" containerID="150d03c9f60f0668187f02d27978d143193ec6376a9f3195308a321973583af1" exitCode=0
Mar 18 16:45:16.557682 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:16.557326 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" event={"ID":"355a3baf-f84e-464e-90ff-4cf4165ace30","Type":"ContainerDied","Data":"150d03c9f60f0668187f02d27978d143193ec6376a9f3195308a321973583af1"}
Mar 18 16:45:16.557682 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:16.557611 2562 scope.go:117] "RemoveContainer" containerID="150d03c9f60f0668187f02d27978d143193ec6376a9f3195308a321973583af1"
Mar 18 16:45:17.561756 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:17.561728 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-vtj99" event={"ID":"355a3baf-f84e-464e-90ff-4cf4165ace30","Type":"ContainerStarted","Data":"3cf77b51c9b2d955b8ff6c7c52ebabc1aaa0c178d47472b6f172167ea213b9e4"}
Mar 18 16:45:19.002418 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.002341 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerName="registry" containerID="cri-o://2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b" gracePeriod=30
Mar 18 16:45:19.240525 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.240502 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:45:19.350743 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350689 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.350856 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350758 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.350904 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350852 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.350904 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350882 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.351037 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350914 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.351037 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350966 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.351037 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.350985 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.351037 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.351021 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gpx7\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7\") pod \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\" (UID: \"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8\") "
Mar 18 16:45:19.351401 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.351339 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:45:19.351539 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.351509 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:45:19.353304 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.353236 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:45:19.353304 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.353244 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:45:19.353476 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.353455 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:45:19.353549 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.353531 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:45:19.353715 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.353690 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7" (OuterVolumeSpecName: "kube-api-access-9gpx7") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "kube-api-access-9gpx7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:45:19.359841 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.359813 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" (UID: "678c1a16-a395-4d4a-b4f6-a2ac6e1870e8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:45:19.452371 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452347 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gpx7\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-kube-api-access-9gpx7\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452371 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452371 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-image-registry-private-configuration\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452381 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-installation-pull-secrets\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452391 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-certificates\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452401 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-bound-sa-token\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452410 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-ca-trust-extracted\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452419 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-registry-tls\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.452515 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.452428 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8-trusted-ca\") on node \"ip-10-0-131-5.ec2.internal\" DevicePath \"\""
Mar 18 16:45:19.569066 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.569034 2562 generic.go:358] "Generic (PLEG): container finished" podID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerID="2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b" exitCode=0
Mar 18 16:45:19.569190 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.569105 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf"
Mar 18 16:45:19.569190 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.569119 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" event={"ID":"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8","Type":"ContainerDied","Data":"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"}
Mar 18 16:45:19.569190 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.569159 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84f6fbc954-l59hf" event={"ID":"678c1a16-a395-4d4a-b4f6-a2ac6e1870e8","Type":"ContainerDied","Data":"ef347efd62d77357eb164186c6d09f909253a942f7ccc26ec7781d2dbff03038"}
Mar 18 16:45:19.569190 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.569175 2562 scope.go:117] "RemoveContainer" containerID="2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"
Mar 18 16:45:19.577858 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.577839 2562 scope.go:117] "RemoveContainer" containerID="2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"
Mar 18 16:45:19.578130 ip-10-0-131-5 kubenswrapper[2562]: E0318 16:45:19.578107 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b\": container with ID starting with 2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b not found: ID does not exist" containerID="2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"
Mar 18 16:45:19.578243 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.578135 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b"} err="failed to get container status \"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b\": rpc error: code = NotFound desc = could not find container \"2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b\": container with ID starting with 2dde2125c4c38660c0da159a511564ec0fed893ff78174e63b53049326c0d92b not found: ID does not exist"
Mar 18 16:45:19.608284 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.608261 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"]
Mar 18 16:45:19.613457 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:19.613437 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84f6fbc954-l59hf"]
Mar 18 16:45:20.984008 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:20.983968 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" path="/var/lib/kubelet/pods/678c1a16-a395-4d4a-b4f6-a2ac6e1870e8/volumes"
Mar 18 16:45:25.589808 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:25.589733 2562 generic.go:358] "Generic (PLEG): container finished" podID="199634f3-40e8-46c3-b2cc-89dff9148da4" containerID="b3c4b3fda8b7226bb323e24d9c6fac14a5b54260c1a49ee08e61a31da46c0638" exitCode=0
Mar 18 16:45:25.589808 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:25.589786 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" event={"ID":"199634f3-40e8-46c3-b2cc-89dff9148da4","Type":"ContainerDied","Data":"b3c4b3fda8b7226bb323e24d9c6fac14a5b54260c1a49ee08e61a31da46c0638"}
Mar 18 16:45:25.590387 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:25.590144 2562 scope.go:117] "RemoveContainer" containerID="b3c4b3fda8b7226bb323e24d9c6fac14a5b54260c1a49ee08e61a31da46c0638"
Mar 18 16:45:26.594724 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:26.594686 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-w2ghk" event={"ID":"199634f3-40e8-46c3-b2cc-89dff9148da4","Type":"ContainerStarted","Data":"acf4da6409ae5473c4ef60a064a0bb01fb13e168e44c98519ec9d3ae73b98947"}
Mar 18 16:45:31.612857 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:31.612824 2562 generic.go:358] "Generic (PLEG): container finished" podID="a69e9c12-8943-4e3a-b184-8b4e9b5c45c7" containerID="a59a2d21bef9c5439ac42ad5c93346e63dff1029c5747a41952930275912d195" exitCode=0
Mar 18 16:45:31.613270 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:31.612893 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" event={"ID":"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7","Type":"ContainerDied","Data":"a59a2d21bef9c5439ac42ad5c93346e63dff1029c5747a41952930275912d195"}
Mar 18 16:45:31.613270 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:31.613207 2562 scope.go:117] "RemoveContainer" containerID="a59a2d21bef9c5439ac42ad5c93346e63dff1029c5747a41952930275912d195"
Mar 18 16:45:32.621886 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:32.621850 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-kpr46" event={"ID":"a69e9c12-8943-4e3a-b184-8b4e9b5c45c7","Type":"ContainerStarted","Data":"c039a9ff2d2bf988e3cbd62fdb058911011e34d8f335fda7e3179316a486005e"}
Mar 18 16:45:40.052362 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:40.052322 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" podUID="90d8c52e-5d9f-4f3e-960d-eac042763564" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 16:45:50.052479 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:45:50.052443 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" podUID="90d8c52e-5d9f-4f3e-960d-eac042763564" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 16:46:00.052404 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.052322 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" podUID="90d8c52e-5d9f-4f3e-960d-eac042763564" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 18 16:46:00.052404 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.052390 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg"
Mar 18 16:46:00.052913 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.052893 2562 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"62116d024046bab550f80dbb9366f9306c2a36bdba11c9ffa69795ab610bf088"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Mar 18 16:46:00.052980 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.052958 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" podUID="90d8c52e-5d9f-4f3e-960d-eac042763564" containerName="service-proxy" containerID="cri-o://62116d024046bab550f80dbb9366f9306c2a36bdba11c9ffa69795ab610bf088" gracePeriod=30
Mar 18 16:46:00.706692 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.706651 2562 generic.go:358] "Generic (PLEG): container finished" podID="90d8c52e-5d9f-4f3e-960d-eac042763564" containerID="62116d024046bab550f80dbb9366f9306c2a36bdba11c9ffa69795ab610bf088" exitCode=2
Mar 18 16:46:00.706869 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.706708 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerDied","Data":"62116d024046bab550f80dbb9366f9306c2a36bdba11c9ffa69795ab610bf088"}
Mar 18 16:46:00.706869 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:46:00.706737 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7f99d8fb77-f6mvg" event={"ID":"90d8c52e-5d9f-4f3e-960d-eac042763564","Type":"ContainerStarted","Data":"f56c55d6cb2e9be55ef8857c1d53b30b3d07782e5902ca7d4b78a06ccea4e028"}
Mar 18 16:48:16.891165 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:48:16.891134 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:48:16.891165 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:48:16.891167 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:48:16.899469 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:48:16.899414 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 16:48:16.899617 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:48:16.899471 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 16:48:16.903349 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:48:16.903332 2562 kubelet.go:1628] "Image garbage collection succeeded"
Mar 18 16:53:16.911868 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:53:16.911833 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:53:16.912895 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:53:16.912876 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:53:16.927480 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:53:16.927443 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 16:53:16.928063 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:53:16.928044 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 16:58:16.940125 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:58:16.940089 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:58:16.942840 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:58:16.942807 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 16:58:16.948885 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:58:16.948864 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 16:58:16.951655 ip-10-0-131-5 kubenswrapper[2562]: I0318 16:58:16.951637 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 17:03:16.960572 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:03:16.960535 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 17:03:16.963901 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:03:16.963875 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 17:03:16.968789 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:03:16.968766 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 17:03:16.972170 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:03:16.972152 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log"
Mar 18 17:07:44.096757 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:44.096698 2562 ???:1] "http: TLS handshake error from 10.0.133.190:35448: EOF"
Mar 18 17:07:44.100992 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:44.100964 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cck66_9a6bdc88-7df5-45fd-98ac-2f967cc2f192/global-pull-secret-syncer/0.log"
Mar 18 17:07:44.188534 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:44.188504 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-62kx2_7177d49a-364c-4c44-9dd5-4fa101f99bdb/konnectivity-agent/0.log"
Mar 18 17:07:44.257085 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:44.257054 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-5.ec2.internal_6da92965ac9d0f6a332c1d675e2f540d/haproxy/0.log"
Mar 18 17:07:48.041445 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:48.041353 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-f6w67_5ffa71b2-a86e-4662-adad-d3882c534d0f/cluster-monitoring-operator/0.log"
Mar 18 17:07:48.192353 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:48.192313 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-496n8_4b48fdb0-ba43-4916-b399-e8b0ea55e494/node-exporter/0.log"
Mar 18 17:07:48.212747 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:48.212714 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-496n8_4b48fdb0-ba43-4916-b399-e8b0ea55e494/kube-rbac-proxy/0.log"
Mar 18 17:07:48.237582 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:48.237558 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-496n8_4b48fdb0-ba43-4916-b399-e8b0ea55e494/init-textfile/0.log"
Mar 18 17:07:50.076802 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:50.076774 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-nns88_3e56ca22-2821-4212-ad13-a7f1080372b4/networking-console-plugin/0.log"
Mar 18 17:07:50.551712 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:50.551679 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/1.log"
Mar 18 17:07:50.559342 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:50.559294 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-jr9mc_627f8af5-be81-426b-9540-c096c256323a/console-operator/2.log"
Mar 18 17:07:51.333707 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.333671 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"]
Mar 18 17:07:51.334138 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.333955 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerName="registry"
Mar 18 17:07:51.334138 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.333967 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerName="registry"
Mar 18 17:07:51.334138 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.334025 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="678c1a16-a395-4d4a-b4f6-a2ac6e1870e8" containerName="registry"
Mar 18 17:07:51.336784 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.336765 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.338627 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.338609 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"kube-root-ca.crt\""
Mar 18 17:07:51.339101 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.339082 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"openshift-service-ca.crt\""
Mar 18 17:07:51.339190 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.339083 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vwfl5\"/\"default-dockercfg-cfrxv\""
Mar 18 17:07:51.342749 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.342726 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-67fdcb5769-cwqdm_70783292-1426-4298-9aae-22ccd0c24067/volume-data-source-validator/0.log"
Mar 18 17:07:51.346305 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.346285 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"]
Mar 18 17:07:51.384924 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.384897 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvph9\" (UniqueName: \"kubernetes.io/projected/f947a52b-b435-4c49-b494-84fa91d92404-kube-api-access-bvph9\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.384924 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.384928 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-podres\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.385118 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.385009 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-lib-modules\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.385118 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.385060 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-sys\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.385118 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.385077 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-proc\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486197 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486166 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvph9\" (UniqueName: \"kubernetes.io/projected/f947a52b-b435-4c49-b494-84fa91d92404-kube-api-access-bvph9\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486197 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486201 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-podres\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486398 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-lib-modules\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486398 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486270 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-sys\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486398 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-proc\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486398 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486347 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-podres\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486398 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486380 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-sys\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486555 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486414 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-lib-modules\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.486555 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.486421 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f947a52b-b435-4c49-b494-84fa91d92404-proc\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"
Mar 18 17:07:51.493765 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.493738 2562
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvph9\" (UniqueName: \"kubernetes.io/projected/f947a52b-b435-4c49-b494-84fa91d92404-kube-api-access-bvph9\") pod \"perf-node-gather-daemonset-67zcr\" (UID: \"f947a52b-b435-4c49-b494-84fa91d92404\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" Mar 18 17:07:51.647494 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.647466 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" Mar 18 17:07:51.765098 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.765074 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr"] Mar 18 17:07:51.767451 ip-10-0-131-5 kubenswrapper[2562]: W0318 17:07:51.767423 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf947a52b_b435_4c49_b494_84fa91d92404.slice/crio-bac76f25f1e26dcf54beb43c8b6655e1b0873da73a26483111d11f255e1c4572 WatchSource:0}: Error finding container bac76f25f1e26dcf54beb43c8b6655e1b0873da73a26483111d11f255e1c4572: Status 404 returned error can't find the container with id bac76f25f1e26dcf54beb43c8b6655e1b0873da73a26483111d11f255e1c4572 Mar 18 17:07:51.769048 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:51.769028 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:07:52.034705 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.034615 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xwx7z_3a05d5d2-a005-4847-b2aa-0f78e327686a/dns/0.log" Mar 18 17:07:52.053843 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.053819 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xwx7z_3a05d5d2-a005-4847-b2aa-0f78e327686a/kube-rbac-proxy/0.log" Mar 18 17:07:52.160363 ip-10-0-131-5 kubenswrapper[2562]: 
I0318 17:07:52.160339 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zd7qk_184966df-7341-412a-aede-a32364efc520/dns-node-resolver/0.log" Mar 18 17:07:52.358184 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.358148 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" event={"ID":"f947a52b-b435-4c49-b494-84fa91d92404","Type":"ContainerStarted","Data":"bec8838aad05edbbdaf8a31ff4a1af9e1e9da12758073f702b4daf036b28e731"} Mar 18 17:07:52.358549 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.358189 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" event={"ID":"f947a52b-b435-4c49-b494-84fa91d92404","Type":"ContainerStarted","Data":"bac76f25f1e26dcf54beb43c8b6655e1b0873da73a26483111d11f255e1c4572"} Mar 18 17:07:52.358549 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.358218 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" Mar 18 17:07:52.373914 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.373863 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" podStartSLOduration=1.373848429 podStartE2EDuration="1.373848429s" podCreationTimestamp="2026-03-18 17:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:52.371957537 +0000 UTC m=+1476.058079889" watchObservedRunningTime="2026-03-18 17:07:52.373848429 +0000 UTC m=+1476.059970791" Mar 18 17:07:52.572685 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:52.572657 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9t8zk_abad6a1d-3e28-4f96-90e8-383ed7b5b8e1/node-ca/0.log" Mar 18 17:07:53.320396 ip-10-0-131-5 
kubenswrapper[2562]: I0318 17:07:53.320359 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-745d85b678-r6ghs_bdf9649a-73e2-4d1b-9ba7-b03cc47f8426/router/0.log" Mar 18 17:07:53.672856 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:53.672827 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t8ccf_4e96119f-c454-4e76-a758-9be470a94be8/serve-healthcheck-canary/0.log" Mar 18 17:07:54.024733 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:54.024657 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-kpr46_a69e9c12-8943-4e3a-b184-8b4e9b5c45c7/insights-operator/0.log" Mar 18 17:07:54.024733 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:54.024685 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-kpr46_a69e9c12-8943-4e3a-b184-8b4e9b5c45c7/insights-operator/1.log" Mar 18 17:07:54.045532 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:54.045501 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5km25_593cc8a3-494f-4641-bca3-e7c9b8a54d76/kube-rbac-proxy/0.log" Mar 18 17:07:54.063954 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:54.063920 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5km25_593cc8a3-494f-4641-bca3-e7c9b8a54d76/exporter/0.log" Mar 18 17:07:54.085080 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:54.085060 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5km25_593cc8a3-494f-4641-bca3-e7c9b8a54d76/extractor/0.log" Mar 18 17:07:58.369689 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:07:58.369663 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-67zcr" Mar 18 17:08:00.093048 ip-10-0-131-5 
kubenswrapper[2562]: I0318 17:08:00.092959 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-vtj99_355a3baf-f84e-464e-90ff-4cf4165ace30/kube-storage-version-migrator-operator/1.log" Mar 18 17:08:00.094916 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:00.094886 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-vtj99_355a3baf-f84e-464e-90ff-4cf4165ace30/kube-storage-version-migrator-operator/0.log" Mar 18 17:08:01.291400 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.291329 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/kube-multus-additional-cni-plugins/0.log" Mar 18 17:08:01.315081 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.315043 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/egress-router-binary-copy/0.log" Mar 18 17:08:01.341126 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.341097 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/cni-plugins/0.log" Mar 18 17:08:01.364656 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.364633 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/bond-cni-plugin/0.log" Mar 18 17:08:01.384725 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.384696 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/routeoverride-cni/0.log" Mar 18 17:08:01.403745 ip-10-0-131-5 kubenswrapper[2562]: 
I0318 17:08:01.403721 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/whereabouts-cni-bincopy/0.log" Mar 18 17:08:01.422045 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.422022 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qhxzz_dcfc6f29-52f0-4f09-b50d-f044f9886e51/whereabouts-cni/0.log" Mar 18 17:08:01.598630 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.598546 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-whxl2_e010756a-9f4e-48da-b759-b8f2ff9d1f1b/kube-multus/0.log" Mar 18 17:08:01.721026 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.720997 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dlnjg_008308d5-c00b-40b2-a413-eb4caebc48c0/network-metrics-daemon/0.log" Mar 18 17:08:01.738099 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:01.738071 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dlnjg_008308d5-c00b-40b2-a413-eb4caebc48c0/kube-rbac-proxy/0.log" Mar 18 17:08:03.075124 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.075029 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-controller/0.log" Mar 18 17:08:03.090305 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.090266 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/0.log" Mar 18 17:08:03.103093 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.103057 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovn-acl-logging/1.log" Mar 18 17:08:03.123417 ip-10-0-131-5 
kubenswrapper[2562]: I0318 17:08:03.123393 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/kube-rbac-proxy-node/0.log" Mar 18 17:08:03.145406 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.145379 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/kube-rbac-proxy-ovn-metrics/0.log" Mar 18 17:08:03.164273 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.164242 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/northd/0.log" Mar 18 17:08:03.184738 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.184704 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/nbdb/0.log" Mar 18 17:08:03.206609 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.206582 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/sbdb/0.log" Mar 18 17:08:03.365197 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:03.365170 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsv52_09466950-dea7-48b9-b4a4-b9b73d845973/ovnkube-controller/0.log" Mar 18 17:08:04.420210 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:04.420176 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-cc88fdd44-h6jqn_fee6fee0-2356-450e-9cf1-5d0c7cd03239/check-endpoints/0.log" Mar 18 17:08:04.656681 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:04.656650 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mnsnh_473b4404-da3f-4c35-92c4-a69465dc3f06/network-check-target-container/0.log" Mar 18 
17:08:05.602121 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:05.602084 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s4vkz_203ce5e3-84d7-4528-841c-52a57d9ccb6e/iptables-alerter/0.log" Mar 18 17:08:06.219596 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:06.219561 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bhcbr_ba39ce75-7939-4e19-9800-7072765d139b/tuned/0.log" Mar 18 17:08:07.946332 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:07.946301 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-d5df4776c-rn2zb_de591805-36d1-4a59-a89f-2f9aca624e2e/cluster-samples-operator/0.log" Mar 18 17:08:07.960078 ip-10-0-131-5 kubenswrapper[2562]: I0318 17:08:07.960051 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-d5df4776c-rn2zb_de591805-36d1-4a59-a89f-2f9aca624e2e/cluster-samples-operator-watch/0.log"