Apr 17 07:48:39.161810 ip-10-0-134-176 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:48:39.161824 ip-10-0-134-176 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:48:39.161833 ip-10-0-134-176 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:48:39.162179 ip-10-0-134-176 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:48:49.201216 ip-10-0-134-176 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:48:49.201232 ip-10-0-134-176 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bc1bb6fbfb9a4516a85d862d99342cde --
Apr 17 07:51:19.935415 ip-10-0-134-176 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:51:20.296176 ip-10-0-134-176 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:20.296176 ip-10-0-134-176 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:51:20.296176 ip-10-0-134-176 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:20.296176 ip-10-0-134-176 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:51:20.296176 ip-10-0-134-176 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:51:20.298842 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.298759    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:51:20.301937 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301923    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:20.301937 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301938    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301942    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301946    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301949    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301954    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301958    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301961    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301965    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301968    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301976    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301980    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301983    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301986    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301988    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301991    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301994    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.301997    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302000    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302002    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:20.302005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302005    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302008    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302011    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302014    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302017    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302019    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302022    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302025    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302027    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302030    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302034    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302038    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302041    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302043    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302046    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302048    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302051    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302053    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302056    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302058    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:20.302466 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302061    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302064    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302067    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302069    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302072    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302074    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302077    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302079    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302081    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302084    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302086    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302089    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302091    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302094    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302098    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302100    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302103    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302105    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302108    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302110    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:20.302949 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302113    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302116    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302118    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302121    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302123    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302126    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302128    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302131    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302133    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302136    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302150    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302154    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302157    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302160    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302163    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302166    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302169    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302171    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302174    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:20.303481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302177    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302180    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302182    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302185    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302187    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302190    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:20.303939 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.302192    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:20.304435 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304423    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:20.304435 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304434    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304437    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304440    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304444    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304447    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304450    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304453    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304456    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304459    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304462    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304465    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304468    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304471    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304473    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304476    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304478    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304481    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304484    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304486    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304489    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:20.304498 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304491    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304534    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304574    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304578    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304581    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304584    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304586    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304628    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304634    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304638    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304643    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304648    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304653    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304657    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304664    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304678    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304685    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304690    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304695    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:20.305243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304699    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304704    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304708    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304713    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304717    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304721    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304725    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304734    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304738    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304742    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304746    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304750    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304754    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304759    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304763    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304778    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304783    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304787    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304792    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304797    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:20.305809 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304806    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304810    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304814    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304819    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304823    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304827    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304832    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304836    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304840    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304844    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304848    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304852    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304856    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304866    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304870    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304874    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304878    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304882    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304886    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:20.306401 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304890    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304894    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304900    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304904    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304908    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304917    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.304922    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305068    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305079    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305107    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305114    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305126    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305132    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305154    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305162    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305167    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305172    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305178    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305183    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305193    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305198    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305203    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305208    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:51:20.306858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305213    2578 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305218    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305223    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305230    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305236    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305246    2578 flags.go:64] FLAG: --config-dir=""
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305251    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305257    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305264    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305269    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305274    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305279    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305285    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305294    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305299    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305309    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305314    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305320    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305325    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305331    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305336    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305341    2578 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305350    2578 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305357 2578 flags.go:64] FLAG: --event-burst="100" Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305362 2578 flags.go:64] FLAG: --event-qps="50" Apr 17 07:51:20.307422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305367 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305372 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305377 2578 flags.go:64] FLAG: --eviction-hard="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305383 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305387 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305398 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305403 2578 flags.go:64] FLAG: --eviction-soft="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305407 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305412 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305469 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305474 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305478 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 07:51:20.308029 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:51:20.305482 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305485 2578 flags.go:64] FLAG: --feature-gates="" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305491 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305497 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305510 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305609 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305615 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305620 2578 flags.go:64] FLAG: --help="false" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305625 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305630 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305637 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 07:51:20.308029 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305642 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305647 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305652 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305657 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305661 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305666 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305670 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305675 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305680 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305685 2578 flags.go:64] FLAG: --kube-reserved="" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305690 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305694 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305699 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305703 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305708 2578 flags.go:64] FLAG: --lock-file="" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305713 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305717 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305722 2578 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305730 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305735 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305740 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305744 2578 flags.go:64] FLAG: --logging-format="text" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305749 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305754 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 07:51:20.308603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305759 2578 flags.go:64] FLAG: --manifest-url="" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305765 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305771 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305777 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305781 2578 flags.go:64] FLAG: --max-pods="110" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305784 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305787 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305791 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 
07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305794 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305797 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305800 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305804 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305812 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305815 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305818 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305821 2578 flags.go:64] FLAG: --pod-cidr="" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305825 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305830 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305833 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305836 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305839 2578 flags.go:64] FLAG: --port="10250" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305842 2578 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305845 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-060f0a4e5adf88674" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305848 2578 flags.go:64] FLAG: --qos-reserved="" Apr 17 07:51:20.309226 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305851 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305854 2578 flags.go:64] FLAG: --register-node="true" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305857 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305859 2578 flags.go:64] FLAG: --register-with-taints="" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305863 2578 flags.go:64] FLAG: --registry-burst="10" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305866 2578 flags.go:64] FLAG: --registry-qps="5" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305869 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305872 2578 flags.go:64] FLAG: --reserved-memory="" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305876 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305879 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305882 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305884 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305888 2578 
flags.go:64] FLAG: --runonce="false" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305891 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305894 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305897 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305901 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305904 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305907 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305910 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305913 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305916 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305919 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305922 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305926 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305929 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 07:51:20.309823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305932 2578 flags.go:64] FLAG: --system-cgroups="" Apr 17 
07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305935 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305941 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305944 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305946 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305951 2578 flags.go:64] FLAG: --tls-min-version="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305954 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305956 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305967 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305970 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305973 2578 flags.go:64] FLAG: --v="2" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305978 2578 flags.go:64] FLAG: --version="false" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305981 2578 flags.go:64] FLAG: --vmodule="" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305986 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.305989 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306078 2578 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306082 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306085 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306088 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306090 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306093 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306096 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306099 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 07:51:20.310481 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306102 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306105 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306108 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306110 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306113 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 
07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306115 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306119 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306123 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306125 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306128 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306130 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306133 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306136 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306152 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306155 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306158 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306160 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306163 2578 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306172 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306175 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 07:51:20.311025 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306177 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306180 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306182 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306185 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306188 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306190 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306192 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306195 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306198 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306200 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 
07:51:20.306203 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306205 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306208 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306210 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306213 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306215 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306218 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306221 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306223 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306227 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 07:51:20.311532 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306230 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306232 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306235 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 07:51:20.312045 ip-10-0-134-176 
kubenswrapper[2578]: W0417 07:51:20.306237 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306240 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306242 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306245 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306247 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306250 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306252 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306254 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306262 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306265 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306268 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306270 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306272 2578 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306275 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306278 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306280 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 07:51:20.312045 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306283 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306285 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306288 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306290 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306293 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306296 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306298 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306300 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306303 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:51:20.312530 ip-10-0-134-176 
kubenswrapper[2578]: W0417 07:51:20.306305 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306308 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306310 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306316 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306319 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306323 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306326 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306329 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306331 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:20.312530 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.306333 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:20.313008 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.307046 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:20.313937 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.313920 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:51:20.313973 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.313938 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313983 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313988 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313991 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313994 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313997 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.313999 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314002 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314005 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:20.314005 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314008 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314011 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314013 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314016 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314019 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314021 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314024 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314027 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314029 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314032 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314035 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314037 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314040 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314042 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314045 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314047 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314050 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314052 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314055 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314057 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:20.314233 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314060 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314063 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314065 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314069 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314072 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314074 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314077 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314079 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314082 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314084 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314086 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314089 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314092 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314094 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314097 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314099 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314102 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314104 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314107 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314109 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:20.314715 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314112 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314114 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314117 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314119 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314123 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314127 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314130 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314132 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314135 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314151 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314154 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314157 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314160 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314163 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314165 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314168 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314171 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314174 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314177 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314180 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:20.315243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314182 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314185 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314187 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314190 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314193 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314195 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314198 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314201 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314203 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314206 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314208 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314211 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314214 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314218 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314222 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314225 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314228 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:20.315723 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314230 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.314235 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314343 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314349 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314352 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314355 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314358 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314360 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314363 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314366 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314370 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314375 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314378 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314381 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314384 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:51:20.316148 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314386 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314389 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314392 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314394 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314397 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314399 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314402 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314405 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314407 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314409 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314412 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314414 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314417 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314419 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314422 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314424 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314427 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314429 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314432 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:51:20.316524 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314434 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314436 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314439 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314441 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314444 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314447 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314449 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314452 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314454 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314457 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314459 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314462 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314465 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314467 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314470 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314472 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314475 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314477 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314480 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314482 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:51:20.316976 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314485 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314487 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314490 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314492 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314495 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314498 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314500 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314502 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314505 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314507 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314510 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314512 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314515 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314518 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314522 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314525 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314528 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314531 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314534 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314536 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:51:20.317471 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314539 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314542 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314545 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314547 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314550 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314553 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314556 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314558 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314561 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314563 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314566 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314568 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314570 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:20.314573 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.314577 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:51:20.317952 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.315292 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:51:20.318456 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.317322 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:51:20.318456 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.318188 2578 server.go:1019] "Starting client certificate rotation"
Apr 17 07:51:20.318456 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.318280 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:20.318456 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.318318 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:51:20.339761 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.339741 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:20.342655 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.342631 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:51:20.359488 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.359465 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:51:20.364604 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.364583 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:51:20.364935 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.364922 2578 log.go:25] "Validated CRI v1 image API"
Apr 17 07:51:20.366320 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.366306 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:51:20.373157 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.373121 2578 fs.go:135] Filesystem UUIDs: map[2b6a0229-36bb-4af8-a1e3-59aba1f3d84c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c50e29fe-5ca9-46b1-8360-cacb09ee1ec6:/dev/nvme0n1p3]
Apr 17 07:51:20.373240 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.373159 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:51:20.377892 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.377783 2578 manager.go:217] Machine: {Timestamp:2026-04-17 07:51:20.376761471 +0000 UTC m=+0.342798251 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094994 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2927a6a02148aa411de1e4844f1d4b SystemUUID:ec2927a6-a021-48aa-411d-e1e4844f1d4b BootID:bc1bb6fb-fb9a-4516-a85d-862d99342cde Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:06:f4:44:bb:13 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:06:f4:44:bb:13 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:c7:63:e1:ff:5b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:51:20.377892 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.377880 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:51:20.378053 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.377979 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:51:20.379634 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.379606 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:51:20.379793 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.379636 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-176.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 07:51:20.379875 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.379809 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 07:51:20.379875 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.379824 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 07:51:20.379875 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.379847 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:20.380474 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.380462 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 07:51:20.381385 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.381373 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:20.381502 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.381492 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 07:51:20.383468 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383455 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 17 07:51:20.383528 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383483 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 07:51:20.383528 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383500 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 07:51:20.383528 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383513 2578 kubelet.go:397] "Adding apiserver pod source" Apr 17 07:51:20.383641 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383535 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 07:51:20.383641 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.383567 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bwlvx" Apr 17 07:51:20.384516 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.384503 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:20.384595 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.384525 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 07:51:20.387083 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.387059 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 07:51:20.388409 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.388396 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 07:51:20.390108 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390096 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390114 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390120 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390126 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390131 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 
07:51:20.390137 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390158 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390164 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390172 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 07:51:20.390176 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390177 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 07:51:20.390416 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390194 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 07:51:20.390416 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.390203 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 07:51:20.391621 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.391602 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 07:51:20.391621 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.391618 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 07:51:20.391719 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.391660 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bwlvx" Apr 17 07:51:20.393368 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.393343 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-176.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 
07:51:20.393619 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.393403 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 07:51:20.395965 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.395949 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 07:51:20.396037 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.395986 2578 server.go:1295] "Started kubelet" Apr 17 07:51:20.396199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.396132 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 07:51:20.396268 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.396222 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 07:51:20.396323 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.396112 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 07:51:20.396823 ip-10-0-134-176 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 07:51:20.397883 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.397869 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 07:51:20.398591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.398575 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 17 07:51:20.402432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.402416 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-176.ec2.internal" not found Apr 17 07:51:20.403076 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.403059 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:20.403547 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.403530 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 07:51:20.404386 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.404223 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 07:51:20.404386 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.404224 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 07:51:20.404386 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.404381 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.404526 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.404395 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 07:51:20.404526 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.404492 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 17 07:51:20.404526 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.404503 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 17 07:51:20.405554 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405048 2578 factory.go:55] Registering systemd factory Apr 17 07:51:20.405554 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:51:20.405087 2578 factory.go:223] Registration of the systemd container factory successfully Apr 17 07:51:20.405554 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405490 2578 factory.go:153] Registering CRI-O factory Apr 17 07:51:20.405554 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405506 2578 factory.go:223] Registration of the crio container factory successfully Apr 17 07:51:20.405771 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405619 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 07:51:20.405771 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405669 2578 factory.go:103] Registering Raw factory Apr 17 07:51:20.405771 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405687 2578 manager.go:1196] Started watching for new ooms in manager Apr 17 07:51:20.405771 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.405704 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:20.406057 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.406042 2578 manager.go:319] Starting recovery of all containers Apr 17 07:51:20.408257 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.408225 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 07:51:20.408520 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.408498 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-176.ec2.internal\" not found" node="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.417683 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.417662 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-176.ec2.internal" not found Apr 17 07:51:20.418681 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.418658 2578 manager.go:324] Recovery completed Apr 17 07:51:20.420276 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.420256 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 07:51:20.423129 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.423113 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.425440 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425416 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.425526 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425443 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.425526 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425455 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.425949 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425929 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 07:51:20.425949 
ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425948 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 07:51:20.426050 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.425966 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 17 07:51:20.429269 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.429258 2578 policy_none.go:49] "None policy: Start" Apr 17 07:51:20.429314 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.429275 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 07:51:20.429314 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.429284 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.474655 2578 manager.go:341] "Starting Device Plugin manager" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.474681 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.474690 2578 server.go:85] "Starting device plugin registration server" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.474917 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.474928 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.475038 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.475121 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.475132 2578 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.475596 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.475632 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.480494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.475967 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-176.ec2.internal" not found Apr 17 07:51:20.527705 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.527677 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 07:51:20.528844 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.528828 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 07:51:20.528905 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.528855 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 07:51:20.528905 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.528872 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 07:51:20.528905 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.528880 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 07:51:20.529007 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.528916 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 07:51:20.531849 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.531830 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:20.575803 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.575752 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.576645 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.576629 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.576731 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.576656 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.576731 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.576667 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.576731 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.576691 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.582808 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.582795 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.582856 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.582816 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-176.ec2.internal\": node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 
07:51:20.593345 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.593326 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.629918 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.629888 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal"] Apr 17 07:51:20.629975 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.629966 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.630683 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.630662 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.630789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.630694 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.630789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.630709 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.633083 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633068 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.633293 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633264 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.633293 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633290 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.633776 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633759 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.633855 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633784 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.633855 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633764 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.633855 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633816 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.633855 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633830 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.633855 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.633793 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.635995 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.635980 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.636079 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.636002 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 07:51:20.636648 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.636626 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientMemory" Apr 17 07:51:20.636715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.636665 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 07:51:20.636715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.636680 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeHasSufficientPID" Apr 17 07:51:20.653674 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.653657 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-176.ec2.internal\" not found" node="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.657067 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.657050 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-176.ec2.internal\" not found" node="ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.694055 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.694035 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.706763 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.706742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.706836 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.706769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.706836 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.706789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5dc70df488813169343e62031a0c95f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-176.ec2.internal\" (UID: \"f5dc70df488813169343e62031a0c95f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.794808 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.794787 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.807198 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5dc70df488813169343e62031a0c95f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-176.ec2.internal\" (UID: \"f5dc70df488813169343e62031a0c95f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.807250 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.807250 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.807344 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.807344 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f5dc70df488813169343e62031a0c95f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-176.ec2.internal\" (UID: \"f5dc70df488813169343e62031a0c95f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.807344 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.807311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/660b5cf6d517612ce055ec178beb9c3f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal\" (UID: \"660b5cf6d517612ce055ec178beb9c3f\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.895629 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.895558 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:20.956016 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.955984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.959604 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:20.959589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:20.996373 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:20.996346 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.096906 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.096855 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.197408 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.197385 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.297962 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.297936 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.318448 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.318429 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 07:51:21.318565 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.318549 2578 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:51:21.318612 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.318592 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:51:21.394022 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.393980 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:46:20 +0000 UTC" deadline="2027-10-24 04:56:34.979958564 +0000 UTC" Apr 17 07:51:21.394022 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.394024 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13317h5m13.585943458s" Apr 17 07:51:21.398025 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.398004 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.404153 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.404125 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 07:51:21.424614 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.424591 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:51:21.448877 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.448813 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-6zcpc" Apr 17 07:51:21.455943 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.455927 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6zcpc" Apr 17 07:51:21.489042 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:21.489011 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5dc70df488813169343e62031a0c95f.slice/crio-c9e18c749a49c46367d2341153b6c738fdf5e1e175f83c39fe2e1cdf530560ed WatchSource:0}: Error finding container c9e18c749a49c46367d2341153b6c738fdf5e1e175f83c39fe2e1cdf530560ed: Status 404 returned error can't find the container with id c9e18c749a49c46367d2341153b6c738fdf5e1e175f83c39fe2e1cdf530560ed Apr 17 07:51:21.489299 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:21.489276 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660b5cf6d517612ce055ec178beb9c3f.slice/crio-d8986bfbac5eb033504854f253f63c56d004fc1b2c6a7f9aeccd314338cb5073 WatchSource:0}: Error finding container d8986bfbac5eb033504854f253f63c56d004fc1b2c6a7f9aeccd314338cb5073: Status 404 returned error can't find the container with id d8986bfbac5eb033504854f253f63c56d004fc1b2c6a7f9aeccd314338cb5073 Apr 17 07:51:21.494206 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.494190 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:51:21.498795 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.498776 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.532291 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.532212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" 
event={"ID":"660b5cf6d517612ce055ec178beb9c3f","Type":"ContainerStarted","Data":"d8986bfbac5eb033504854f253f63c56d004fc1b2c6a7f9aeccd314338cb5073"} Apr 17 07:51:21.533064 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.533031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" event={"ID":"f5dc70df488813169343e62031a0c95f","Type":"ContainerStarted","Data":"c9e18c749a49c46367d2341153b6c738fdf5e1e175f83c39fe2e1cdf530560ed"} Apr 17 07:51:21.599040 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:21.599010 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-176.ec2.internal\" not found" Apr 17 07:51:21.600823 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.600806 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:21.604641 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.604621 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" Apr 17 07:51:21.615708 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.615687 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:21.617363 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.617350 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" Apr 17 07:51:21.623378 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.623361 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 07:51:21.910341 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:21.910274 2578 reflector.go:430] "Caches populated" 
type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:22.384631 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.384555 2578 apiserver.go:52] "Watching apiserver" Apr 17 07:51:22.386749 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.386719 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:22.390391 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.390369 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 07:51:22.391593 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.391567 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6vh7t","kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal","openshift-multus/multus-additional-cni-plugins-mk6gb","openshift-multus/multus-g768l","openshift-network-diagnostics/network-check-target-s895b","openshift-network-operator/iptables-alerter-l64cj","openshift-ovn-kubernetes/ovnkube-node-rnznv","kube-system/global-pull-secret-syncer-k9qmk","kube-system/konnectivity-agent-wwr4f","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq","openshift-cluster-node-tuning-operator/tuned-h8mxt","openshift-dns/node-resolver-cfdh5","openshift-image-registry/node-ca-jf4kz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal"] Apr 17 07:51:22.394569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.394551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.396747 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.396711 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.397162 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.396903 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 07:51:22.397162 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.396982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 07:51:22.397162 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.396906 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.397344 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.397266 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.397493 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.397476 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:51:22.397570 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.397523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gp54w\"" Apr 17 07:51:22.397657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.397627 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 07:51:22.398745 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.398724 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.399098 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.399076 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.399202 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.399176 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kg88g\"" Apr 17 07:51:22.399202 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.399188 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 07:51:22.401603 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.401581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.401870 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.401851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g768l" Apr 17 07:51:22.403735 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.403715 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.403947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gfkzv\"" Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.403963 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.403978 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.404018 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" 
Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.404031 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 07:51:22.404094 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.403954 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.404408 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.404286 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fv5cs\"" Apr 17 07:51:22.405993 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.405974 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:22.406085 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.406047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l64cj" Apr 17 07:51:22.406312 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.406040 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:22.408102 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.408067 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xrt5x\"" Apr 17 07:51:22.408230 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.408109 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.408230 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.408119 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.408230 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.408209 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 07:51:22.408508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.408488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.410748 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.410523 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:51:22.410748 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.410574 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hrtps\"" Apr 17 07:51:22.411289 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.410975 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:51:22.411456 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.411438 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.411532 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.411512 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:22.415166 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415127 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.415429 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-env-overrides\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.415531 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt6d\" (UniqueName: \"kubernetes.io/projected/e22e30ae-58f8-41be-9023-53dbea7c6e98-kube-api-access-ppt6d\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.415531 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-host\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " 
pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.415654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-system-cni-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.415654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-cnibin\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-os-release\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-k8s-cni-cncf-io\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-etc-kubernetes\") pod 
\"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-netd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-cni-binary-copy\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-socket-dir-parent\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc13c055-cb05-4a25-a5e5-93cef3f0760b-host-slash\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj" Apr 17 07:51:22.415850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415823 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-systemd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-serviceca\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415940 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxbl\" (UniqueName: \"kubernetes.io/projected/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-kube-api-access-dgxbl\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 
07:51:22.415965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-multus-certs\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.415990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-bin\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cnibin\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416037 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-bin\") pod \"multus-g768l\" (UID: 
\"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-hostroot\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416108 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fc13c055-cb05-4a25-a5e5-93cef3f0760b-iptables-alerter-script\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj" Apr 17 07:51:22.416199 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-var-lib-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-node-log\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovn-node-metrics-cert\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-kubelet\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-conf-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-daemon-config\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416390 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-ovn\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-script-lib\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-kubelet\") pod \"ovnkube-node-rnznv\" (UID: 
\"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-systemd-units\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-slash\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-os-release\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.416647 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416592 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jn4\" (UniqueName: \"kubernetes.io/projected/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-kube-api-access-t6jn4\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-system-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-netns\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-multus\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " 
pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mt89\" (UniqueName: \"kubernetes.io/projected/fc13c055-cb05-4a25-a5e5-93cef3f0760b-kube-api-access-6mt89\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-netns\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-etc-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52cjm\" (UniqueName: 
\"kubernetes.io/projected/551b9d4f-0c1e-440f-8580-a99be726c79b-kube-api-access-52cjm\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416876 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-log-socket\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.417242 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.416902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-config\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.417685 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.417548 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.417685 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.417575 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.417685 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.417668 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.417807 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.417792 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n2pqw\"" Apr 17 07:51:22.419611 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.419590 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 07:51:22.419691 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.419609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.419955 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.419939 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.420015 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.419998 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v8vrc\"" Apr 17 07:51:22.420903 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.420879 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.422881 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.422861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwllh\"" Apr 17 07:51:22.422969 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.422914 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:51:22.423208 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.423189 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 07:51:22.423367 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.423346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.423458 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.423411 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:22.456585 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.456559 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:21 +0000 UTC" deadline="2027-10-14 11:13:55.405536993 +0000 UTC" Apr 17 07:51:22.456733 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.456714 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13083h22m32.94882939s" Apr 17 07:51:22.506698 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.506663 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:51:22.517218 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.517330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-socket-dir-parent\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.517330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-serviceca\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 
17 07:51:22.517330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxbl\" (UniqueName: \"kubernetes.io/projected/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-kube-api-access-dgxbl\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.517330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6642e2d-0acd-4e4b-8013-72420908123d-hosts-file\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.517330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-socket-dir-parent\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-dbus\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4f3abb21-4fac-471f-a620-eac7abc32e29-agent-certs\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " 
pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4f3abb21-4fac-471f-a620-eac7abc32e29-konnectivity-ca\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517433 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-sys-fs\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-systemd\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-bin\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.517584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-node-log\") pod 
\"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovn-node-metrics-cert\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-node-log\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-bin\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-daemon-config\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jn4\" (UniqueName: \"kubernetes.io/projected/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-kube-api-access-t6jn4\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-registration-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.517912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-serviceca\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 
07:51:22.517959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-kubernetes\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517966 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.517994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-kubelet\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-systemd-units\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518072 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-os-release\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-multus\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-netns\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-etc-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-config\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-daemon-config\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-os-release\") pod 
\"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.518359 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-host\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-kubelet\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-system-cni-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-host\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-log-socket\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-cni-multus\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-etc-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-log-socket\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-system-cni-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-env-overrides\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt6d\" (UniqueName: \"kubernetes.io/projected/e22e30ae-58f8-41be-9023-53dbea7c6e98-kube-api-access-ppt6d\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518663 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-netns\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqvh\" (UniqueName: \"kubernetes.io/projected/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-kube-api-access-tcqvh\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518705 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-systemd-units\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-config\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-device-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.519069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-modprobe-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-host\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-cnibin\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.518992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-k8s-cni-cncf-io\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-env-overrides\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-k8s-cni-cncf-io\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519040 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-cnibin\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-cni-binary-copy\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc13c055-cb05-4a25-a5e5-93cef3f0760b-host-slash\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-systemd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-multus-certs\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc13c055-cb05-4a25-a5e5-93cef3f0760b-host-slash\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-bin\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cnibin\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.519795 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-systemd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-socket-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-run-ovn-kubernetes\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtdsz\" (UniqueName: \"kubernetes.io/projected/ce5742ba-e78c-4173-abe6-deceee95d64e-kube-api-access-xtdsz\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysconfig\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-multus-certs\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-hostroot\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-bin\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fc13c055-cb05-4a25-a5e5-93cef3f0760b-iptables-alerter-script\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-var-lib-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6642e2d-0acd-4e4b-8013-72420908123d-tmp-dir\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpjn\" (UniqueName: \"kubernetes.io/projected/a6642e2d-0acd-4e4b-8013-72420908123d-kube-api-access-pkpjn\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-kubelet\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-conf-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-hostroot\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.520644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-var-lib-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-ovn\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cnibin\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-script-lib\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-var-lib-kubelet\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-openvswitch\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519785 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-conf-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-conf\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519806 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-run-ovn\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-slash\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-run\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-sys\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.519982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-tmp\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hsg8\" (UniqueName: \"kubernetes.io/projected/231772c6-755f-4c20-83c4-6013d0df7223-kube-api-access-8hsg8\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-kubelet-config\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:22.521492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-system-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-netns\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mt89\" (UniqueName: \"kubernetes.io/projected/fc13c055-cb05-4a25-a5e5-93cef3f0760b-kube-api-access-6mt89\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-lib-modules\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-var-lib-kubelet\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/551b9d4f-0c1e-440f-8580-a99be726c79b-cni-binary-copy\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-slash\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-multus-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fc13c055-cb05-4a25-a5e5-93cef3f0760b-iptables-alerter-script\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-system-cni-dir\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovnkube-script-lib\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-host-run-netns\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52cjm\" (UniqueName: \"kubernetes.io/projected/551b9d4f-0c1e-440f-8580-a99be726c79b-kube-api-access-52cjm\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-etc-tuned\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-os-release\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.522274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-etc-kubernetes\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-os-release\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-netd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-etc-selinux\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e22e30ae-58f8-41be-9023-53dbea7c6e98-host-cni-netd\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.520959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/551b9d4f-0c1e-440f-8580-a99be726c79b-etc-kubernetes\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.523114 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.521552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e22e30ae-58f8-41be-9023-53dbea7c6e98-ovn-node-metrics-cert\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.536189 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.536165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jn4\" (UniqueName: \"kubernetes.io/projected/3b851da0-b5d9-4467-80b9-e5ec59af0f5b-kube-api-access-t6jn4\") pod \"multus-additional-cni-plugins-mk6gb\" (UID: \"3b851da0-b5d9-4467-80b9-e5ec59af0f5b\") " pod="openshift-multus/multus-additional-cni-plugins-mk6gb"
Apr 17 07:51:22.537919 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.537898 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:22.538020 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.537923 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:22.538020 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.537936 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:22.538100 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.538043 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:23.038010921 +0000 UTC m=+3.004047677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:22.540190 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.540169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52cjm\" (UniqueName: \"kubernetes.io/projected/551b9d4f-0c1e-440f-8580-a99be726c79b-kube-api-access-52cjm\") pod \"multus-g768l\" (UID: \"551b9d4f-0c1e-440f-8580-a99be726c79b\") " pod="openshift-multus/multus-g768l"
Apr 17 07:51:22.540292 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.540275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt6d\" (UniqueName: \"kubernetes.io/projected/e22e30ae-58f8-41be-9023-53dbea7c6e98-kube-api-access-ppt6d\") pod \"ovnkube-node-rnznv\" (UID: \"e22e30ae-58f8-41be-9023-53dbea7c6e98\") " pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:51:22.540624 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.540606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxbl\" (UniqueName: \"kubernetes.io/projected/ca63f18c-753f-468e-b6ab-a7a1608ee9ef-kube-api-access-dgxbl\") pod \"node-ca-jf4kz\" (UID: \"ca63f18c-753f-468e-b6ab-a7a1608ee9ef\") " pod="openshift-image-registry/node-ca-jf4kz"
Apr 17 07:51:22.542369 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.542346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mt89\" (UniqueName: \"kubernetes.io/projected/fc13c055-cb05-4a25-a5e5-93cef3f0760b-kube-api-access-6mt89\") pod \"iptables-alerter-l64cj\" (UID: \"fc13c055-cb05-4a25-a5e5-93cef3f0760b\") " pod="openshift-network-operator/iptables-alerter-l64cj"
Apr 17 07:51:22.622097 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-var-lib-kubelet\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622097 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-etc-tuned\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-etc-selinux\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-var-lib-kubelet\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-etc-selinux\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6642e2d-0acd-4e4b-8013-72420908123d-hosts-file\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-dbus\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:22.622329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4f3abb21-4fac-471f-a620-eac7abc32e29-agent-certs\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " pod="kube-system/konnectivity-agent-wwr4f"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4f3abb21-4fac-471f-a620-eac7abc32e29-konnectivity-ca\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " pod="kube-system/konnectivity-agent-wwr4f"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a6642e2d-0acd-4e4b-8013-72420908123d-hosts-file\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-sys-fs\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-systemd\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt"
Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-kubelet-dir\") pod
\"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-dbus\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-sys-fs\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-registration-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-systemd\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-kubernetes\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-registration-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622576 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-kubernetes\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622607 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tcqvh\" (UniqueName: \"kubernetes.io/projected/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-kube-api-access-tcqvh\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-device-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.622678 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.622648 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-modprobe-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-host\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.622721 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" 
failed. No retries permitted until 2026-04-17 07:51:23.122701335 +0000 UTC m=+3.088738109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-device-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-host\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-socket-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-modprobe-d\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.622857 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ce5742ba-e78c-4173-abe6-deceee95d64e-socket-dir\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:22.622905 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:23.122892086 +0000 UTC m=+3.088928846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4f3abb21-4fac-471f-a620-eac7abc32e29-konnectivity-ca\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtdsz\" (UniqueName: \"kubernetes.io/projected/ce5742ba-e78c-4173-abe6-deceee95d64e-kube-api-access-xtdsz\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysconfig\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.622975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6642e2d-0acd-4e4b-8013-72420908123d-tmp-dir\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.623569 ip-10-0-134-176 kubenswrapper[2578]: I0417 
07:51:22.623000 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpjn\" (UniqueName: \"kubernetes.io/projected/a6642e2d-0acd-4e4b-8013-72420908123d-kube-api-access-pkpjn\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-conf\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-run\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-sys\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-tmp\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623120 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hsg8\" (UniqueName: \"kubernetes.io/projected/231772c6-755f-4c20-83c4-6013d0df7223-kube-api-access-8hsg8\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-kubelet-config\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-lib-modules\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-run\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysconfig\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623307 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a6642e2d-0acd-4e4b-8013-72420908123d-tmp-dir\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-lib-modules\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-etc-sysctl-conf\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623403 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/34f61707-b762-47d3-b7c3-a54999ad703b-kubelet-config\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:22.624418 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.623453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/231772c6-755f-4c20-83c4-6013d0df7223-sys\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.625125 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.625103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-etc-tuned\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.625290 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.625272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/231772c6-755f-4c20-83c4-6013d0df7223-tmp\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.625385 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.625327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4f3abb21-4fac-471f-a620-eac7abc32e29-agent-certs\") pod \"konnectivity-agent-wwr4f\" (UID: \"4f3abb21-4fac-471f-a620-eac7abc32e29\") " pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.641547 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.641487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqvh\" (UniqueName: \"kubernetes.io/projected/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-kube-api-access-tcqvh\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:22.642055 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.642038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpjn\" (UniqueName: \"kubernetes.io/projected/a6642e2d-0acd-4e4b-8013-72420908123d-kube-api-access-pkpjn\") pod \"node-resolver-cfdh5\" (UID: \"a6642e2d-0acd-4e4b-8013-72420908123d\") " pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.642131 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.642064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hsg8\" 
(UniqueName: \"kubernetes.io/projected/231772c6-755f-4c20-83c4-6013d0df7223-kube-api-access-8hsg8\") pod \"tuned-h8mxt\" (UID: \"231772c6-755f-4c20-83c4-6013d0df7223\") " pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.642221 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.642205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtdsz\" (UniqueName: \"kubernetes.io/projected/ce5742ba-e78c-4173-abe6-deceee95d64e-kube-api-access-xtdsz\") pod \"aws-ebs-csi-driver-node-77jwq\" (UID: \"ce5742ba-e78c-4173-abe6-deceee95d64e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.706760 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.706729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:22.714881 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.714858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jf4kz" Apr 17 07:51:22.723424 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.723399 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" Apr 17 07:51:22.727981 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.727964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g768l" Apr 17 07:51:22.735691 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.735668 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l64cj" Apr 17 07:51:22.742689 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.742671 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:22.750251 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.750231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cfdh5" Apr 17 07:51:22.757876 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.757860 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" Apr 17 07:51:22.762341 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.762324 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" Apr 17 07:51:22.809002 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:22.808979 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:51:23.112440 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.112414 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b851da0_b5d9_4467_80b9_e5ec59af0f5b.slice/crio-3005d4f207533bcf8d0cbbacbe00217630c921e6132ea6047c5f22b0caacea8e WatchSource:0}: Error finding container 3005d4f207533bcf8d0cbbacbe00217630c921e6132ea6047c5f22b0caacea8e: Status 404 returned error can't find the container with id 3005d4f207533bcf8d0cbbacbe00217630c921e6132ea6047c5f22b0caacea8e Apr 17 07:51:23.114228 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.114200 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3abb21_4fac_471f_a620_eac7abc32e29.slice/crio-7e59028cada6a51cfcb50b416d3baf0378a53db6723ad9e65fee9f4b10223512 WatchSource:0}: Error finding container 7e59028cada6a51cfcb50b416d3baf0378a53db6723ad9e65fee9f4b10223512: Status 404 returned error can't find the container with id 
7e59028cada6a51cfcb50b416d3baf0378a53db6723ad9e65fee9f4b10223512 Apr 17 07:51:23.114815 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.114756 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode22e30ae_58f8_41be_9023_53dbea7c6e98.slice/crio-8454c7c4400f439dc2614f383a24fdf83fc7e4551fbedbb795fe9603fd767db4 WatchSource:0}: Error finding container 8454c7c4400f439dc2614f383a24fdf83fc7e4551fbedbb795fe9603fd767db4: Status 404 returned error can't find the container with id 8454c7c4400f439dc2614f383a24fdf83fc7e4551fbedbb795fe9603fd767db4 Apr 17 07:51:23.115575 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.115446 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce5742ba_e78c_4173_abe6_deceee95d64e.slice/crio-d5ba1056c2db1ef25f8c7e4e8cc8a2a2b335d05b4772b52d17ed894de099d39f WatchSource:0}: Error finding container d5ba1056c2db1ef25f8c7e4e8cc8a2a2b335d05b4772b52d17ed894de099d39f: Status 404 returned error can't find the container with id d5ba1056c2db1ef25f8c7e4e8cc8a2a2b335d05b4772b52d17ed894de099d39f Apr 17 07:51:23.117250 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.117019 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod551b9d4f_0c1e_440f_8580_a99be726c79b.slice/crio-0abdd942a82e2112b9878f8e7bd0509d48d6f7029fb3df750601993007bd3987 WatchSource:0}: Error finding container 0abdd942a82e2112b9878f8e7bd0509d48d6f7029fb3df750601993007bd3987: Status 404 returned error can't find the container with id 0abdd942a82e2112b9878f8e7bd0509d48d6f7029fb3df750601993007bd3987 Apr 17 07:51:23.119105 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.119051 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca63f18c_753f_468e_b6ab_a7a1608ee9ef.slice/crio-1e219e4e36854f9df5376fbe851a0851a18231afc8e758afb88641206331320d WatchSource:0}: Error finding container 1e219e4e36854f9df5376fbe851a0851a18231afc8e758afb88641206331320d: Status 404 returned error can't find the container with id 1e219e4e36854f9df5376fbe851a0851a18231afc8e758afb88641206331320d
Apr 17 07:51:23.120063 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.119948 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc13c055_cb05_4a25_a5e5_93cef3f0760b.slice/crio-759bbca9fc8d6bcf46cb7bb94ebc90fbe39f3bdb4811fca8c446177566b8736e WatchSource:0}: Error finding container 759bbca9fc8d6bcf46cb7bb94ebc90fbe39f3bdb4811fca8c446177566b8736e: Status 404 returned error can't find the container with id 759bbca9fc8d6bcf46cb7bb94ebc90fbe39f3bdb4811fca8c446177566b8736e
Apr 17 07:51:23.121407 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.121301 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod231772c6_755f_4c20_83c4_6013d0df7223.slice/crio-10898c1771410ecf550aab123457e87e6243e6eab4b8cc31e4547c4702ece268 WatchSource:0}: Error finding container 10898c1771410ecf550aab123457e87e6243e6eab4b8cc31e4547c4702ece268: Status 404 returned error can't find the container with id 10898c1771410ecf550aab123457e87e6243e6eab4b8cc31e4547c4702ece268
Apr 17 07:51:23.122083 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:51:23.121969 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6642e2d_0acd_4e4b_8013_72420908123d.slice/crio-2ef0526d5a2384ceb3b4b7d148bd3fed57915b111033f28dc331c3c9b6d498a3 WatchSource:0}: Error finding container 2ef0526d5a2384ceb3b4b7d148bd3fed57915b111033f28dc331c3c9b6d498a3: Status 404 returned error can't find the container with id 2ef0526d5a2384ceb3b4b7d148bd3fed57915b111033f28dc331c3c9b6d498a3
Apr 17 07:51:23.126477 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.126431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:23.126477 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.126458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:23.126593 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.126486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:23.126593 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126587 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126604 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126608 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126648 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.126624772 +0000 UTC m=+4.092661537 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126652 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126661 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.126655224 +0000 UTC m=+4.092691979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126669 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:23.126739 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:23.126710 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:24.126698259 +0000 UTC m=+4.092735014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:23.457294 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.457249 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:46:21 +0000 UTC" deadline="2027-10-17 02:17:03.467740486 +0000 UTC"
Apr 17 07:51:23.457294 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.457293 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13146h25m40.010451643s"
Apr 17 07:51:23.545454 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.545417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" event={"ID":"ce5742ba-e78c-4173-abe6-deceee95d64e","Type":"ContainerStarted","Data":"d5ba1056c2db1ef25f8c7e4e8cc8a2a2b335d05b4772b52d17ed894de099d39f"}
Apr 17 07:51:23.553815 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.553741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wwr4f" event={"ID":"4f3abb21-4fac-471f-a620-eac7abc32e29","Type":"ContainerStarted","Data":"7e59028cada6a51cfcb50b416d3baf0378a53db6723ad9e65fee9f4b10223512"}
Apr 17 07:51:23.555802 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.555764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jf4kz" event={"ID":"ca63f18c-753f-468e-b6ab-a7a1608ee9ef","Type":"ContainerStarted","Data":"1e219e4e36854f9df5376fbe851a0851a18231afc8e758afb88641206331320d"}
Apr 17 07:51:23.557361 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.557335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerStarted","Data":"3005d4f207533bcf8d0cbbacbe00217630c921e6132ea6047c5f22b0caacea8e"}
Apr 17 07:51:23.559630 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.559601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" event={"ID":"f5dc70df488813169343e62031a0c95f","Type":"ContainerStarted","Data":"50f63c4412592612bd90e80ce35d0f32722675644de5fd6b5fb4f26cd88434d3"}
Apr 17 07:51:23.563791 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.563556 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cfdh5" event={"ID":"a6642e2d-0acd-4e4b-8013-72420908123d","Type":"ContainerStarted","Data":"2ef0526d5a2384ceb3b4b7d148bd3fed57915b111033f28dc331c3c9b6d498a3"}
Apr 17 07:51:23.568189 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.568122 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" event={"ID":"231772c6-755f-4c20-83c4-6013d0df7223","Type":"ContainerStarted","Data":"10898c1771410ecf550aab123457e87e6243e6eab4b8cc31e4547c4702ece268"}
Apr 17 07:51:23.573664 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.573612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l64cj" event={"ID":"fc13c055-cb05-4a25-a5e5-93cef3f0760b","Type":"ContainerStarted","Data":"759bbca9fc8d6bcf46cb7bb94ebc90fbe39f3bdb4811fca8c446177566b8736e"}
Apr 17 07:51:23.578636 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.578610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"8454c7c4400f439dc2614f383a24fdf83fc7e4551fbedbb795fe9603fd767db4"}
Apr 17 07:51:23.582866 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:23.582843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g768l" event={"ID":"551b9d4f-0c1e-440f-8580-a99be726c79b","Type":"ContainerStarted","Data":"0abdd942a82e2112b9878f8e7bd0509d48d6f7029fb3df750601993007bd3987"}
Apr 17 07:51:24.135683 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.135645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:24.135853 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.135701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:24.135853 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.135737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:24.135971 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.135864 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:24.135971 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.135925 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:26.135906487 +0000 UTC m=+6.101943247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:24.136339 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136319 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:24.136428 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136372 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:26.136357453 +0000 UTC m=+6.102394214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:24.136486 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136467 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:24.136486 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136480 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:24.136593 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136492 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:24.136593 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.136527 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:26.136515539 +0000 UTC m=+6.102552300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:24.532281 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.532185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:24.532712 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.532317 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:24.532768 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.532747 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:24.532866 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.532843 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:24.532946 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.532924 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:24.533008 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:24.532993 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:24.612321 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.612283 2578 generic.go:358] "Generic (PLEG): container finished" podID="660b5cf6d517612ce055ec178beb9c3f" containerID="e86e377c0405f9b1a53ac7ad9e08338bf8412d3db1beb4c9ddda636d2a325cb6" exitCode=0
Apr 17 07:51:24.613306 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.613276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" event={"ID":"660b5cf6d517612ce055ec178beb9c3f","Type":"ContainerDied","Data":"e86e377c0405f9b1a53ac7ad9e08338bf8412d3db1beb4c9ddda636d2a325cb6"}
Apr 17 07:51:24.626644 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:24.626539 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-176.ec2.internal" podStartSLOduration=3.626521365 podStartE2EDuration="3.626521365s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:23.577369917 +0000 UTC m=+3.543406696" watchObservedRunningTime="2026-04-17 07:51:24.626521365 +0000 UTC m=+4.592558144"
Apr 17 07:51:25.618391 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:25.618345 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" event={"ID":"660b5cf6d517612ce055ec178beb9c3f","Type":"ContainerStarted","Data":"ed18092741d639a17d1fb4ce44e81d3a7c01cd8239635f9a1d9be7311b84d5a0"}
Apr 17 07:51:25.631006 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:25.630942 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-176.ec2.internal" podStartSLOduration=4.6309237979999995 podStartE2EDuration="4.630923798s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:51:25.630496217 +0000 UTC m=+5.596532995" watchObservedRunningTime="2026-04-17 07:51:25.630923798 +0000 UTC m=+5.596960575"
Apr 17 07:51:26.153012 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.152855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:26.153012 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.152978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:26.153274 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.153024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:26.153274 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153052 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:26.153274 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153127 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.153106903 +0000 UTC m=+10.119143668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:26.153274 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153238 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:26.153556 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153287 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.15327512 +0000 UTC m=+10.119311874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:26.153556 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153366 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:26.153556 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153380 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:26.153556 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153392 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:26.153556 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.153427 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:30.153416061 +0000 UTC m=+10.119452828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:26.529965 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.529881 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:26.530132 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.530026 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:26.530479 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.530458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:26.530579 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.530562 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:26.530720 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:26.530703 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:26.530807 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:26.530788 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:28.529948 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:28.529921 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:28.530395 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:28.530048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:28.530395 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:28.530117 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:28.530395 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:28.529918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:28.530395 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:28.530040 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:28.530395 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:28.530279 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:30.186873 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.186835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.186887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.186922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187003 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187019 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187073 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:38.187054721 +0000 UTC m=+18.153091479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187091 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:38.187081143 +0000 UTC m=+18.153117904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187128 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187159 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187170 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:30.187351 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.187212 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:38.187194671 +0000 UTC m=+18.153231439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.530958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.531068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.531125 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.531230 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:30.531580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:30.531764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:30.531666 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:32.530012 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:32.529978 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:32.530454 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:32.530090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:32.530454 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:32.530095 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:32.530454 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:32.530118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:32.530454 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:32.530204 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:32.530454 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:32.530279 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:34.529954 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:34.529923 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:34.530514 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:34.529925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:34.530514 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:34.530046 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b"
Apr 17 07:51:34.530514 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:34.530115 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582"
Apr 17 07:51:34.530514 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:34.529930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:34.530514 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:34.530272 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f"
Apr 17 07:51:36.529375 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:36.529341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:36.529844 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:36.529341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:36.529844 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:36.529484 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:36.529844 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:36.529477 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:36.529844 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:36.529632 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:36.529844 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:36.529714 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:38.248319 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.248282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:38.248319 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.248321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.248343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248438 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248456 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248472 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248485 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248502 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.248483588 +0000 UTC m=+34.214520347 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248438 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248523 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.248512299 +0000 UTC m=+34.214549070 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:38.248742 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.248583 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:51:54.248568574 +0000 UTC m=+34.214605329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:38.529494 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.529416 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:38.529639 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.529424 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:38.529639 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.529538 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:38.529639 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:38.529424 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:38.529639 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.529598 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:38.529823 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:38.529663 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:40.529768 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.529605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:40.529768 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.529756 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:40.530396 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.529729 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:40.530396 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:40.529823 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:40.530396 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:40.529922 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:40.530396 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:40.530005 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:40.642837 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.642808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" event={"ID":"231772c6-755f-4c20-83c4-6013d0df7223","Type":"ContainerStarted","Data":"47fa68b33b34ed923aa250e1d6d01050912b556cc068a9becb3d7858c3254e37"} Apr 17 07:51:40.644541 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.644517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"0ba6b351c0cc5ebf0feb871aa0ea751ac84c6ec356399f9e9fb0785206364543"} Apr 17 07:51:40.644621 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.644548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"bef3c4ec5c009ab07cc91d0bbacf3c5dcda30e8e48351f36af6c6c84ae89d334"} Apr 17 07:51:40.644621 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.644562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"01df6855fba3d06f452a70df9d54f7381e90d1b3872a97a83784e381b8d54893"} Apr 17 07:51:40.645850 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.645823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g768l" event={"ID":"551b9d4f-0c1e-440f-8580-a99be726c79b","Type":"ContainerStarted","Data":"eb2c0ec37f51373460bc61f9de35f272136e3e581fb2ea4024f1455837466cb2"} Apr 17 07:51:40.647055 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.647036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" 
event={"ID":"ce5742ba-e78c-4173-abe6-deceee95d64e","Type":"ContainerStarted","Data":"de6665d5282f33129e46a45062854885fef3fe3fcdf8f0400cd9086c35c69de4"} Apr 17 07:51:40.648185 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.648161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wwr4f" event={"ID":"4f3abb21-4fac-471f-a620-eac7abc32e29","Type":"ContainerStarted","Data":"50bdd020918109d0930549216cb41be3e21d4ceba58debbe3fa94bb97ffe78d8"} Apr 17 07:51:40.649334 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.649311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jf4kz" event={"ID":"ca63f18c-753f-468e-b6ab-a7a1608ee9ef","Type":"ContainerStarted","Data":"7230617c7395c5b987449e9f35f9997de2173b5e96b55a7879e64f4d8d5038cf"} Apr 17 07:51:40.650479 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.650456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerStarted","Data":"525da45af8751151e137dd072a4f9542545a3722658da2399be40f2d39ea8145"} Apr 17 07:51:40.651604 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.651579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cfdh5" event={"ID":"a6642e2d-0acd-4e4b-8013-72420908123d","Type":"ContainerStarted","Data":"63f94165cf65c0715242857752b8033df7dbebbf480a8459b76915f9614a53ac"} Apr 17 07:51:40.672092 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.672053 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h8mxt" podStartSLOduration=3.636032177 podStartE2EDuration="20.672041744s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.125234116 +0000 UTC m=+3.091270871" lastFinishedPulling="2026-04-17 07:51:40.161243667 +0000 UTC m=+20.127280438" 
observedRunningTime="2026-04-17 07:51:40.671820859 +0000 UTC m=+20.637857637" watchObservedRunningTime="2026-04-17 07:51:40.672041744 +0000 UTC m=+20.638078521" Apr 17 07:51:40.697915 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.697857 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g768l" podStartSLOduration=3.648591755 podStartE2EDuration="20.69783987s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.119694835 +0000 UTC m=+3.085731607" lastFinishedPulling="2026-04-17 07:51:40.168942963 +0000 UTC m=+20.134979722" observedRunningTime="2026-04-17 07:51:40.69749355 +0000 UTC m=+20.663530327" watchObservedRunningTime="2026-04-17 07:51:40.69783987 +0000 UTC m=+20.663876648" Apr 17 07:51:40.711641 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.711599 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wwr4f" podStartSLOduration=11.80226131 podStartE2EDuration="20.711588515s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.116388465 +0000 UTC m=+3.082425220" lastFinishedPulling="2026-04-17 07:51:32.025715669 +0000 UTC m=+11.991752425" observedRunningTime="2026-04-17 07:51:40.711310456 +0000 UTC m=+20.677347232" watchObservedRunningTime="2026-04-17 07:51:40.711588515 +0000 UTC m=+20.677625291" Apr 17 07:51:40.724271 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.724232 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cfdh5" podStartSLOduration=3.717301702 podStartE2EDuration="20.724220694s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.124808642 +0000 UTC m=+3.090845410" lastFinishedPulling="2026-04-17 07:51:40.131727633 +0000 UTC m=+20.097764402" observedRunningTime="2026-04-17 07:51:40.724053091 +0000 UTC m=+20.690089867" 
watchObservedRunningTime="2026-04-17 07:51:40.724220694 +0000 UTC m=+20.690257471" Apr 17 07:51:40.738470 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:40.738420 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jf4kz" podStartSLOduration=3.70052713 podStartE2EDuration="20.738407305s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.12168191 +0000 UTC m=+3.087718668" lastFinishedPulling="2026-04-17 07:51:40.159562072 +0000 UTC m=+20.125598843" observedRunningTime="2026-04-17 07:51:40.737545352 +0000 UTC m=+20.703582132" watchObservedRunningTime="2026-04-17 07:51:40.738407305 +0000 UTC m=+20.704444083" Apr 17 07:51:41.616023 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.615827 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:41.616489 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.616417 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:41.653948 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.653915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l64cj" event={"ID":"fc13c055-cb05-4a25-a5e5-93cef3f0760b","Type":"ContainerStarted","Data":"c657328e4210ac3b3092743926c889a67ab3aa4c68de1afd3b5766fba8dca5cb"} Apr 17 07:51:41.656027 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 07:51:41.656325 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656298 2578 generic.go:358] "Generic (PLEG): container finished" podID="e22e30ae-58f8-41be-9023-53dbea7c6e98" containerID="bef3c4ec5c009ab07cc91d0bbacf3c5dcda30e8e48351f36af6c6c84ae89d334" exitCode=1 Apr 17 
07:51:41.656399 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerDied","Data":"bef3c4ec5c009ab07cc91d0bbacf3c5dcda30e8e48351f36af6c6c84ae89d334"} Apr 17 07:51:41.656399 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"38b761770d3115cf0f6d1ecac1343dd9e78b1ccfa3eb4a19a296c384e685a172"} Apr 17 07:51:41.656479 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"d4ab7b7a84a0f2c352bb13fd63e28e8c52df2b3bd9b3168ae7f24207ef784eaa"} Apr 17 07:51:41.656479 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.656414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"1cfad53c8dc500e20793b19422ddc6df91af6f695ad2acbfab05e6da9e8dfe7d"} Apr 17 07:51:41.657493 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.657477 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="525da45af8751151e137dd072a4f9542545a3722658da2399be40f2d39ea8145" exitCode=0 Apr 17 07:51:41.657584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.657504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"525da45af8751151e137dd072a4f9542545a3722658da2399be40f2d39ea8145"} Apr 17 07:51:41.689545 
ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.689493 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l64cj" podStartSLOduration=4.651605775 podStartE2EDuration="21.689472855s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.122041307 +0000 UTC m=+3.088078075" lastFinishedPulling="2026-04-17 07:51:40.159908396 +0000 UTC m=+20.125945155" observedRunningTime="2026-04-17 07:51:41.688449116 +0000 UTC m=+21.654485894" watchObservedRunningTime="2026-04-17 07:51:41.689472855 +0000 UTC m=+21.655509632" Apr 17 07:51:41.815744 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:41.815684 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 07:51:42.486259 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.486122 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:51:41.815704673Z","UUID":"05772974-4807-4fb3-8371-8d9066d14803","Handler":null,"Name":"","Endpoint":""} Apr 17 07:51:42.489557 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.489529 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 07:51:42.489557 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.489561 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 07:51:42.529514 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.529482 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:42.529657 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:42.529609 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:42.529657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.529494 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:42.529769 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:42.529702 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:42.529769 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.529488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:42.529875 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:42.529785 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:42.661544 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.661506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" event={"ID":"ce5742ba-e78c-4173-abe6-deceee95d64e","Type":"ContainerStarted","Data":"621e65f1f09521baf8bd583bacc3768ae373ae6aa0ed532a6dc5c196b615aac0"} Apr 17 07:51:42.661544 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:42.661545 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:51:43.667241 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:43.666980 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 07:51:43.667744 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:43.667616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"d7abe0a8a2c368b985e222ee6b4ffb4d162774f760b6c19d73d0523b43809b01"} Apr 17 07:51:43.669959 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:43.669926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" event={"ID":"ce5742ba-e78c-4173-abe6-deceee95d64e","Type":"ContainerStarted","Data":"155ed4d77287fe2e7b45dccc0f66cb417f4a5ad139c127b0d00e3c066ede7ea7"} Apr 17 07:51:43.687692 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:43.687647 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-77jwq" podStartSLOduration=3.890712818 podStartE2EDuration="23.687632724s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.117869927 +0000 UTC m=+3.083906682" 
lastFinishedPulling="2026-04-17 07:51:42.914789826 +0000 UTC m=+22.880826588" observedRunningTime="2026-04-17 07:51:43.687588174 +0000 UTC m=+23.653624952" watchObservedRunningTime="2026-04-17 07:51:43.687632724 +0000 UTC m=+23.653669492" Apr 17 07:51:44.529900 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:44.529870 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:44.530080 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:44.529870 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:44.530080 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:44.530001 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:44.530080 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:44.529870 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:44.530080 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:44.530064 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:44.530294 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:44.530157 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:45.382547 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.382508 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:45.383192 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.382670 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 07:51:45.383192 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.383063 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wwr4f" Apr 17 07:51:45.678637 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.678509 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 07:51:45.678939 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.678908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"1c64362d7d3903aa1857f0c8785f6179dbe6bdc89af94159f39dd38eea932311"} Apr 17 07:51:45.679512 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.679427 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:45.679512 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:51:45.679456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:45.679512 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.679468 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:45.679512 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.679508 2578 scope.go:117] "RemoveContainer" containerID="bef3c4ec5c009ab07cc91d0bbacf3c5dcda30e8e48351f36af6c6c84ae89d334" Apr 17 07:51:45.695683 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.695661 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:45.696200 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:45.696179 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" Apr 17 07:51:46.529480 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.529451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:46.529480 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.529474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:46.530175 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.529451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:46.530175 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:46.529565 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:46.530175 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:46.529616 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:46.530175 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:46.529689 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:46.683520 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.683494 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 07:51:46.683858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.683836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" event={"ID":"e22e30ae-58f8-41be-9023-53dbea7c6e98","Type":"ContainerStarted","Data":"ad0656a3d14e03288c82e90dea87d6ad66b48dfc0edc62a7858f49e96e5a9f99"} Apr 17 07:51:46.685389 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.685367 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="4969d7cf72e57e461a82397d22a05ebcbfd8d917d32b82cc006c2a6857effdf3" exitCode=0 Apr 17 07:51:46.685498 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.685396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"4969d7cf72e57e461a82397d22a05ebcbfd8d917d32b82cc006c2a6857effdf3"} Apr 17 07:51:46.713331 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:46.713285 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv" podStartSLOduration=9.621392202 podStartE2EDuration="26.713272267s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.116682928 +0000 UTC m=+3.082719684" lastFinishedPulling="2026-04-17 07:51:40.208562991 +0000 UTC m=+20.174599749" observedRunningTime="2026-04-17 07:51:46.711530403 +0000 UTC m=+26.677567181" watchObservedRunningTime="2026-04-17 07:51:46.713272267 +0000 UTC m=+26.679309043" Apr 17 07:51:47.494720 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:51:47.494523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s895b"] Apr 17 07:51:47.494874 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.494852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:47.494991 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:47.494964 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:47.496916 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.496887 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k9qmk"] Apr 17 07:51:47.497004 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.496992 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:47.497108 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:47.497090 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:47.497404 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.497386 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vh7t"] Apr 17 07:51:47.497498 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.497479 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:47.497594 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:47.497563 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:47.689488 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.689457 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="0daf4b2e5397e31b9142af68db9a9b8705562651bb417008ca2b2d42380d0011" exitCode=0 Apr 17 07:51:47.689841 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:47.689534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"0daf4b2e5397e31b9142af68db9a9b8705562651bb417008ca2b2d42380d0011"} Apr 17 07:51:48.693260 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:48.693229 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="17f1ccc71122fc23e85fd4ead492f3b3346a3b76f889c394dc2c3f0569dcbe47" exitCode=0 Apr 17 07:51:48.693635 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:48.693269 2578 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"17f1ccc71122fc23e85fd4ead492f3b3346a3b76f889c394dc2c3f0569dcbe47"} Apr 17 07:51:49.529705 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:49.529671 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:49.529705 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:49.529707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:49.529924 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:49.529676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:49.529924 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:49.529805 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:49.529924 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:49.529896 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:49.530030 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:49.529986 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:51.529275 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:51.529243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:51.529949 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:51.529243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:51.529949 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:51.529361 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:51.529949 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:51.529245 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:51.529949 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:51.529428 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:51.529949 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:51.529529 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:53.529407 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:53.529372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:53.529879 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:53.529371 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:53.529879 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:53.529497 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:53.529879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:53.529503 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k9qmk" podUID="34f61707-b762-47d3-b7c3-a54999ad703b" Apr 17 07:51:53.529879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:53.529560 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s895b" podUID="c47a920d-1db5-42ad-9b8e-ae9649778582" Apr 17 07:51:53.529879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:53.529658 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vh7t" podUID="85afae1f-542f-4ccc-b1bd-45ba0e0c418f" Apr 17 07:51:54.275570 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.275538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t" Apr 17 07:51:54.275570 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.275575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b" Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.275607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk" Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275689 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275722 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275740 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret podName:34f61707-b762-47d3-b7c3-a54999ad703b nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.275725885 +0000 UTC m=+66.241762640 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret") pod "global-pull-secret-syncer-k9qmk" (UID: "34f61707-b762-47d3-b7c3-a54999ad703b") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275745 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275758 2578 projected.go:194] Error preparing data for projected volume kube-api-access-x87jc for pod openshift-network-diagnostics/network-check-target-s895b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275689 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:54.275814 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275808 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc podName:c47a920d-1db5-42ad-9b8e-ae9649778582 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.275790945 +0000 UTC m=+66.241827704 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x87jc" (UniqueName: "kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc") pod "network-check-target-s895b" (UID: "c47a920d-1db5-42ad-9b8e-ae9649778582") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:51:54.276078 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.275830 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.275818917 +0000 UTC m=+66.241855673 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:51:54.422185 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.422159 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-176.ec2.internal" event="NodeReady" Apr 17 07:51:54.422313 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.422276 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 07:51:54.464112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.464044 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xn7q4"] Apr 17 07:51:54.502427 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.502396 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k99jf"] Apr 17 07:51:54.502569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.502550 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.504759 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.504739 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\"" Apr 17 07:51:54.504928 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.504904 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 07:51:54.504981 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.504965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 07:51:54.523331 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.523310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xn7q4"] Apr 17 07:51:54.523430 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.523340 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k99jf"] Apr 17 07:51:54.523471 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.523457 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:51:54.526044 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.526009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 07:51:54.526194 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.526174 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 07:51:54.526530 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.526505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 07:51:54.526624 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.526517 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\"" Apr 17 07:51:54.678044 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg7l\" (UniqueName: \"kubernetes.io/projected/86725d0f-c4d4-499a-96f9-106af3387cc2-kube-api-access-xfg7l\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.678507 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86725d0f-c4d4-499a-96f9-106af3387cc2-config-volume\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.678507 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.678507 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:51:54.678507 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86725d0f-c4d4-499a-96f9-106af3387cc2-tmp-dir\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.678507 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.678315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npl45\" (UniqueName: \"kubernetes.io/projected/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-kube-api-access-npl45\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:51:54.706719 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.706687 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="2648a9d052d6d2161850c2a088a26b0f774be6f72cd180439cbb97bbc8e4096e" exitCode=0 Apr 17 07:51:54.706809 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.706769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" 
event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"2648a9d052d6d2161850c2a088a26b0f774be6f72cd180439cbb97bbc8e4096e"} Apr 17 07:51:54.778720 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg7l\" (UniqueName: \"kubernetes.io/projected/86725d0f-c4d4-499a-96f9-106af3387cc2-kube-api-access-xfg7l\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.778829 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86725d0f-c4d4-499a-96f9-106af3387cc2-config-volume\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.778829 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.778940 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:51:54.778940 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86725d0f-c4d4-499a-96f9-106af3387cc2-tmp-dir\") pod \"dns-default-xn7q4\" (UID: 
\"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:51:54.778940 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.778913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npl45\" (UniqueName: \"kubernetes.io/projected/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-kube-api-access-npl45\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:51:54.779100 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.778942 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:51:54.779100 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.779010 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:55.278988763 +0000 UTC m=+35.245025521 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found Apr 17 07:51:54.779100 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.779041 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:51:54.779286 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:54.779100 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:55.279081995 +0000 UTC m=+35.245118750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:51:54.779286 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.779226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86725d0f-c4d4-499a-96f9-106af3387cc2-tmp-dir\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:54.779389 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.779363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86725d0f-c4d4-499a-96f9-106af3387cc2-config-volume\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:54.789348 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.789321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg7l\" (UniqueName: \"kubernetes.io/projected/86725d0f-c4d4-499a-96f9-106af3387cc2-kube-api-access-xfg7l\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:54.789433 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:54.789418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npl45\" (UniqueName: \"kubernetes.io/projected/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-kube-api-access-npl45\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:51:55.283124 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.282938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:55.283124 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.283130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:51:55.283340 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:55.283078 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:55.283340 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:55.283236 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:55.283340 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:55.283246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.283231741 +0000 UTC m=+36.249268495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:55.283340 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:55.283276 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:56.283263704 +0000 UTC m=+36.249300458 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:51:55.529508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.529484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:51:55.529655 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.529484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:51:55.529720 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.529484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:51:55.531871 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.531848 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:51:55.531995 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.531874 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\""
Apr 17 07:51:55.531995 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.531876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\""
Apr 17 07:51:55.531995 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.531972 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 07:51:55.531995 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.531993 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:51:55.532586 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.532572 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:51:55.711311 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.711280 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b851da0-b5d9-4467-80b9-e5ec59af0f5b" containerID="7942991fad20334033adcbaf166127f70ec40dc7d4259dad100167b8ac017738" exitCode=0
Apr 17 07:51:55.711646 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:55.711313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerDied","Data":"7942991fad20334033adcbaf166127f70ec40dc7d4259dad100167b8ac017738"}
Apr 17 07:51:56.289326 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:56.289285 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:56.289491 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:56.289344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:51:56.289491 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:56.289476 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:56.289603 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:56.289494 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:56.289603 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:56.289547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.289533989 +0000 UTC m=+38.255570743 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:51:56.289603 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:56.289581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:51:58.289555066 +0000 UTC m=+38.255591820 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:56.715868 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:56.715831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" event={"ID":"3b851da0-b5d9-4467-80b9-e5ec59af0f5b","Type":"ContainerStarted","Data":"3722d9c9c17d4684ef07cbcd297bcf8b4087d81983352cd8ce1cd84f9cd22f1b"}
Apr 17 07:51:56.738127 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:56.736695 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mk6gb" podStartSLOduration=5.491360692 podStartE2EDuration="36.736650924s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:51:23.114332353 +0000 UTC m=+3.080369109" lastFinishedPulling="2026-04-17 07:51:54.359622586 +0000 UTC m=+34.325659341" observedRunningTime="2026-04-17 07:51:56.736456726 +0000 UTC m=+36.702493504" watchObservedRunningTime="2026-04-17 07:51:56.736650924 +0000 UTC m=+36.702687695"
Apr 17 07:51:58.306719 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:58.306682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:51:58.306719 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:51:58.306718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:51:58.307106 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:58.306828 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:51:58.307106 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:58.306889 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.306873952 +0000 UTC m=+42.272910707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:51:58.307106 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:58.306833 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:51:58.307106 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:51:58.306959 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:02.306944544 +0000 UTC m=+42.272981316 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:52:02.334965 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:02.334915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:52:02.335548 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:02.334965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:52:02.335633 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:02.335610 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:02.335887 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:02.335869 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.33584058 +0000 UTC m=+50.301877337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:02.338587 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:02.336131 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:02.338587 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:02.336255 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:10.336222647 +0000 UTC m=+50.302259405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:52:10.387201 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:10.387158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:52:10.387838 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:10.387212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:52:10.387838 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:10.387371 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:10.387838 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:10.387374 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:10.387838 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:10.387460 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.387439062 +0000 UTC m=+66.353475819 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:52:10.387838 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:10.387481 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.387471319 +0000 UTC m=+66.353508076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:17.699910 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:17.699881 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rnznv"
Apr 17 07:52:26.303980 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.303932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:52:26.303980 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.303985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:52:26.304564 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.304019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:52:26.306623 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.306602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:26.306715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.306604 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 07:52:26.306715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.306708 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:26.314735 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.314715 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:52:26.314827 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.314794 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs podName:85afae1f-542f-4ccc-b1bd-45ba0e0c418f nodeName:}" failed. No retries permitted until 2026-04-17 07:53:30.314773682 +0000 UTC m=+130.280810448 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs") pod "network-metrics-daemon-6vh7t" (UID: "85afae1f-542f-4ccc-b1bd-45ba0e0c418f") : secret "metrics-daemon-secret" not found
Apr 17 07:52:26.316721 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.316705 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:26.317678 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.317652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/34f61707-b762-47d3-b7c3-a54999ad703b-original-pull-secret\") pod \"global-pull-secret-syncer-k9qmk\" (UID: \"34f61707-b762-47d3-b7c3-a54999ad703b\") " pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:52:26.328459 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.328436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x87jc\" (UniqueName: \"kubernetes.io/projected/c47a920d-1db5-42ad-9b8e-ae9649778582-kube-api-access-x87jc\") pod \"network-check-target-s895b\" (UID: \"c47a920d-1db5-42ad-9b8e-ae9649778582\") " pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:52:26.404259 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.404234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:52:26.404343 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.404265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:52:26.404381 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.404360 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:26.404421 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.404411 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:58.404397748 +0000 UTC m=+98.370434503 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:26.404467 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.404365 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:26.404510 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:26.404498 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:52:58.404486721 +0000 UTC m=+98.370523476 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:52:26.446422 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.446396 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bmlbc\""
Apr 17 07:52:26.449204 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.449192 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k9qmk"
Apr 17 07:52:26.454808 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.454789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:52:26.601327 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.601292 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s895b"]
Apr 17 07:52:26.607366 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:52:26.607340 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc47a920d_1db5_42ad_9b8e_ae9649778582.slice/crio-6447fbcb2ff33f0125c0797c10a9a94f35628031c6ecbe41c056fc97f8d846fc WatchSource:0}: Error finding container 6447fbcb2ff33f0125c0797c10a9a94f35628031c6ecbe41c056fc97f8d846fc: Status 404 returned error can't find the container with id 6447fbcb2ff33f0125c0797c10a9a94f35628031c6ecbe41c056fc97f8d846fc
Apr 17 07:52:26.614235 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.614212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k9qmk"]
Apr 17 07:52:26.617778 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:52:26.617754 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f61707_b762_47d3_b7c3_a54999ad703b.slice/crio-ded2a839fa57b8f0cc61173fe9032107f1fac41a7521e0da74628a0cb16e504f WatchSource:0}: Error finding container ded2a839fa57b8f0cc61173fe9032107f1fac41a7521e0da74628a0cb16e504f: Status 404 returned error can't find the container with id ded2a839fa57b8f0cc61173fe9032107f1fac41a7521e0da74628a0cb16e504f
Apr 17 07:52:26.772397 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.772363 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s895b" event={"ID":"c47a920d-1db5-42ad-9b8e-ae9649778582","Type":"ContainerStarted","Data":"6447fbcb2ff33f0125c0797c10a9a94f35628031c6ecbe41c056fc97f8d846fc"}
Apr 17 07:52:26.773239 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:26.773213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k9qmk" event={"ID":"34f61707-b762-47d3-b7c3-a54999ad703b","Type":"ContainerStarted","Data":"ded2a839fa57b8f0cc61173fe9032107f1fac41a7521e0da74628a0cb16e504f"}
Apr 17 07:52:31.784437 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:31.784401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s895b" event={"ID":"c47a920d-1db5-42ad-9b8e-ae9649778582","Type":"ContainerStarted","Data":"9b6250466abe0a3d1fdd7f9bbe9750eaf7e89b187c9b2018c194eb65ec582032"}
Apr 17 07:52:31.784943 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:31.784521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:52:31.785745 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:31.785721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k9qmk" event={"ID":"34f61707-b762-47d3-b7c3-a54999ad703b","Type":"ContainerStarted","Data":"9d18cfa6f382f72ecfc7cd9421bbe26618bcf625fedba396ce9217d6c9bb51a3"}
Apr 17 07:52:31.799599 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:31.799561 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s895b" podStartSLOduration=67.475862199 podStartE2EDuration="1m11.799551354s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:26.609163926 +0000 UTC m=+66.575200681" lastFinishedPulling="2026-04-17 07:52:30.932853079 +0000 UTC m=+70.898889836" observedRunningTime="2026-04-17 07:52:31.79897429 +0000 UTC m=+71.765011089" watchObservedRunningTime="2026-04-17 07:52:31.799551354 +0000 UTC m=+71.765588131"
Apr 17 07:52:31.813452 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:31.813408 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-k9qmk" podStartSLOduration=66.494955636 podStartE2EDuration="1m10.813396458s" podCreationTimestamp="2026-04-17 07:51:21 +0000 UTC" firstStartedPulling="2026-04-17 07:52:26.619361699 +0000 UTC m=+66.585398458" lastFinishedPulling="2026-04-17 07:52:30.937802525 +0000 UTC m=+70.903839280" observedRunningTime="2026-04-17 07:52:31.813025095 +0000 UTC m=+71.779061872" watchObservedRunningTime="2026-04-17 07:52:31.813396458 +0000 UTC m=+71.779433236"
Apr 17 07:52:58.416815 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:58.416757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:52:58.416815 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:52:58.416819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:52:58.417337 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:58.416902 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:58.417337 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:58.416979 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls podName:86725d0f-c4d4-499a-96f9-106af3387cc2 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.416962051 +0000 UTC m=+162.382998805 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls") pod "dns-default-xn7q4" (UID: "86725d0f-c4d4-499a-96f9-106af3387cc2") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:58.417337 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:58.416902 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:58.417337 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:52:58.417052 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert podName:ecab5c24-0616-4d6e-93a1-4c29b1548a0d nodeName:}" failed. No retries permitted until 2026-04-17 07:54:02.417038694 +0000 UTC m=+162.383075449 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert") pod "ingress-canary-k99jf" (UID: "ecab5c24-0616-4d6e-93a1-4c29b1548a0d") : secret "canary-serving-cert" not found
Apr 17 07:53:02.789928 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:02.789895 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-s895b"
Apr 17 07:53:10.325121 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.325085 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9wbj9"]
Apr 17 07:53:10.329326 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.329308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.331462 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.331435 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 07:53:10.331575 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.331470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-7g692\""
Apr 17 07:53:10.331575 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.331449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.332324 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.332306 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 07:53:10.332437 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.332366 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.335425 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.335403 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9wbj9"]
Apr 17 07:53:10.336366 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.336340 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 07:53:10.397395 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-snapshots\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.397548 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-tmp\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.397548 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.397548 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-service-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.397548 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12552903-12e4-45a7-8194-2c6cc6a37b21-serving-cert\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.397691 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.397560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxpj\" (UniqueName: \"kubernetes.io/projected/12552903-12e4-45a7-8194-2c6cc6a37b21-kube-api-access-5lxpj\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.439213 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.439182 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5b66495d-5v9dj"]
Apr 17 07:53:10.442212 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.442197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:10.445009 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.444983 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 07:53:10.445120 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.445020 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-64bdg\""
Apr 17 07:53:10.445120 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.444988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 07:53:10.445257 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.444989 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 07:53:10.445313 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.445260 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 07:53:10.445408 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.445391 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 07:53:10.445994 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.445976 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 07:53:10.453499 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.453478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b66495d-5v9dj"]
Apr 17 07:53:10.498395 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-snapshots\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.498499 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-tmp\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.498499 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9"
Apr 17 07:53:10.498499 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-service-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " 
pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.498665 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12552903-12e4-45a7-8194-2c6cc6a37b21-serving-cert\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.498665 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.498550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxpj\" (UniqueName: \"kubernetes.io/projected/12552903-12e4-45a7-8194-2c6cc6a37b21-kube-api-access-5lxpj\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.499046 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.499027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-tmp\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.499129 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.499085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-service-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.499129 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.499097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/12552903-12e4-45a7-8194-2c6cc6a37b21-snapshots\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.499425 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.499406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12552903-12e4-45a7-8194-2c6cc6a37b21-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.500830 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.500812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12552903-12e4-45a7-8194-2c6cc6a37b21-serving-cert\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.506107 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.506089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxpj\" (UniqueName: \"kubernetes.io/projected/12552903-12e4-45a7-8194-2c6cc6a37b21-kube-api-access-5lxpj\") pod \"insights-operator-585dfdc468-9wbj9\" (UID: \"12552903-12e4-45a7-8194-2c6cc6a37b21\") " pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.539371 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.539347 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r"] Apr 17 07:53:10.542377 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.542364 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.545539 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.545517 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 07:53:10.545539 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.545520 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 07:53:10.545692 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.545583 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 07:53:10.546069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.546052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:53:10.546069 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.546065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-852w7\"" Apr 17 07:53:10.554324 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.554303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r"] Apr 17 07:53:10.599749 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bfs\" (UniqueName: \"kubernetes.io/projected/c26b8a3e-8882-4232-9053-f698f9bb8392-kube-api-access-z2bfs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.599749 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599717 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3b3f1b-d142-4dcd-b779-b2342879913b-config\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.599749 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbr98\" (UniqueName: \"kubernetes.io/projected/fa3b3f1b-d142-4dcd-b779-b2342879913b-kube-api-access-jbr98\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.599913 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.599913 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3b3f1b-d142-4dcd-b779-b2342879913b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.599913 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599880 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-default-certificate\") pod 
\"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.600001 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.600001 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.599950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-stats-auth\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.640500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.640475 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"] Apr 17 07:53:10.642614 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.642588 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" Apr 17 07:53:10.643560 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.643359 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.645502 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.645483 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 07:53:10.645704 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.645690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 07:53:10.645804 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.645741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 07:53:10.645863 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.645803 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6xw8q\"" Apr 17 07:53:10.650071 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.650050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 07:53:10.655474 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.655453 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"] Apr 17 07:53:10.700370 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-default-certificate\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.700500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.700500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-stats-auth\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.700500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bfs\" (UniqueName: \"kubernetes.io/projected/c26b8a3e-8882-4232-9053-f698f9bb8392-kube-api-access-z2bfs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.700500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3b3f1b-d142-4dcd-b779-b2342879913b-config\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.700764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbr98\" (UniqueName: \"kubernetes.io/projected/fa3b3f1b-d142-4dcd-b779-b2342879913b-kube-api-access-jbr98\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.700764 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:53:10.700575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.700764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.700617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:11.200592827 +0000 UTC m=+111.166629587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt Apr 17 07:53:10.700764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.700648 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 07:53:10.700764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.700653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3b3f1b-d142-4dcd-b779-b2342879913b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.700764 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.700696 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs podName:c26b8a3e-8882-4232-9053-f698f9bb8392 
nodeName:}" failed. No retries permitted until 2026-04-17 07:53:11.200680878 +0000 UTC m=+111.166717652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : secret "router-metrics-certs-default" not found Apr 17 07:53:10.701084 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.701063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3b3f1b-d142-4dcd-b779-b2342879913b-config\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.703260 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.703233 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3b3f1b-d142-4dcd-b779-b2342879913b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.703966 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.703937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-default-certificate\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.704068 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.703999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-stats-auth\") pod \"router-default-5b66495d-5v9dj\" (UID: 
\"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.713093 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.713043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbr98\" (UniqueName: \"kubernetes.io/projected/fa3b3f1b-d142-4dcd-b779-b2342879913b-kube-api-access-jbr98\") pod \"service-ca-operator-d6fc45fc5-5js9r\" (UID: \"fa3b3f1b-d142-4dcd-b779-b2342879913b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.714126 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.714103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bfs\" (UniqueName: \"kubernetes.io/projected/c26b8a3e-8882-4232-9053-f698f9bb8392-kube-api-access-z2bfs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:10.762182 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.762134 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9wbj9"] Apr 17 07:53:10.765112 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:10.765082 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12552903_12e4_45a7_8194_2c6cc6a37b21.slice/crio-f3d0a5b2a69a775330e204e9bd161f6478123d27d9465aa68a4d8303c34e3d40 WatchSource:0}: Error finding container f3d0a5b2a69a775330e204e9bd161f6478123d27d9465aa68a4d8303c34e3d40: Status 404 returned error can't find the container with id f3d0a5b2a69a775330e204e9bd161f6478123d27d9465aa68a4d8303c34e3d40 Apr 17 07:53:10.801877 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.801856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.801977 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.801886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.801977 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.801906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djvrk\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.801977 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.801963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.802092 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.802011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " 
pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.802092 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.802063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.802198 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.802106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.802235 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.802199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.851225 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.851171 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" Apr 17 07:53:10.859253 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.859215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" event={"ID":"12552903-12e4-45a7-8194-2c6cc6a37b21","Type":"ContainerStarted","Data":"f3d0a5b2a69a775330e204e9bd161f6478123d27d9465aa68a4d8303c34e3d40"} Apr 17 07:53:10.902994 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.902965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.903112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903000 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.903112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djvrk\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:53:10.903112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903346 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903346 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903346 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903346 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.903324 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:10.903346 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.903337 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-597b89995d-65v5w: secret "image-registry-tls" not found
Apr 17 07:53:10.903594 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:10.903390 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls podName:95a9e913-e1cb-49b7-ae5f-77299dc16875 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:11.403368226 +0000 UTC m=+111.369404984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls") pod "image-registry-597b89995d-65v5w" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875") : secret "image-registry-tls" not found
Apr 17 07:53:10.903594 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.903768 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.903750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.904238 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.904216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.905746 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.905721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.905837 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.905733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.911117 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.911094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.911227 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.911210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djvrk\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:10.960085 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:10.960059 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r"]
Apr 17 07:53:10.963410 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:10.963383 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3b3f1b_d142_4dcd_b779_b2342879913b.slice/crio-b296ce3578d8181f3a422d01ff649845194572eddf62b89e9acce5f40cbd1992 WatchSource:0}: Error finding container b296ce3578d8181f3a422d01ff649845194572eddf62b89e9acce5f40cbd1992: Status 404 returned error can't find the container with id b296ce3578d8181f3a422d01ff649845194572eddf62b89e9acce5f40cbd1992
Apr 17 07:53:11.204711 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:11.204682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:11.204875 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:11.204730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:11.204875 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.204845 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:53:11.204875 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.204856 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:12.204838628 +0000 UTC m=+112.170875383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt
Apr 17 07:53:11.204987 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.204886 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:12.20487405 +0000 UTC m=+112.170910805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : secret "router-metrics-certs-default" not found
Apr 17 07:53:11.406852 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:11.406805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:11.407320 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.406980 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:11.407320 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.406999 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-597b89995d-65v5w: secret "image-registry-tls" not found
Apr 17 07:53:11.407320 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:11.407056 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls podName:95a9e913-e1cb-49b7-ae5f-77299dc16875 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:12.407038941 +0000 UTC m=+112.373075696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls") pod "image-registry-597b89995d-65v5w" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875") : secret "image-registry-tls" not found
Apr 17 07:53:11.862619 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:11.862580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" event={"ID":"fa3b3f1b-d142-4dcd-b779-b2342879913b","Type":"ContainerStarted","Data":"b296ce3578d8181f3a422d01ff649845194572eddf62b89e9acce5f40cbd1992"}
Apr 17 07:53:12.215779 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:12.215747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:12.215947 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:12.215861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:12.215947 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.215919 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:53:12.216043 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.215994 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:14.215971877 +0000 UTC m=+114.182008637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : secret "router-metrics-certs-default" not found
Apr 17 07:53:12.216043 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.216013 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:14.21600452 +0000 UTC m=+114.182041275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt
Apr 17 07:53:12.417701 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:12.417660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:12.418123 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.417826 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:12.418123 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.417845 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-597b89995d-65v5w: secret "image-registry-tls" not found
Apr 17 07:53:12.418123 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:12.417898 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls podName:95a9e913-e1cb-49b7-ae5f-77299dc16875 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:14.417882978 +0000 UTC m=+114.383919738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls") pod "image-registry-597b89995d-65v5w" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875") : secret "image-registry-tls" not found
Apr 17 07:53:13.867923 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:13.867822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" event={"ID":"fa3b3f1b-d142-4dcd-b779-b2342879913b","Type":"ContainerStarted","Data":"daabdefee5f9c6279957c048601477f5686078918d14744f5a682450f9403137"}
Apr 17 07:53:13.869158 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:13.869121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" event={"ID":"12552903-12e4-45a7-8194-2c6cc6a37b21","Type":"ContainerStarted","Data":"ccc527180fa76957634aa9bcbc6b0ed789d47b0d52c07991e36ba830f55cbece"}
Apr 17 07:53:13.881327 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:13.881268 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" podStartSLOduration=1.279368627 podStartE2EDuration="3.881250784s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:10.965135598 +0000 UTC m=+110.931172353" lastFinishedPulling="2026-04-17 07:53:13.567017741 +0000 UTC m=+113.533054510" observedRunningTime="2026-04-17 07:53:13.881080045 +0000 UTC m=+113.847116835" watchObservedRunningTime="2026-04-17 07:53:13.881250784 +0000 UTC m=+113.847287564"
Apr 17 07:53:13.896983 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:13.896935 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" podStartSLOduration=1.101556417 podStartE2EDuration="3.896924349s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="2026-04-17 07:53:10.76672651 +0000 UTC m=+110.732763264" lastFinishedPulling="2026-04-17 07:53:13.562094436 +0000 UTC m=+113.528131196" observedRunningTime="2026-04-17 07:53:13.895693118 +0000 UTC m=+113.861729892" watchObservedRunningTime="2026-04-17 07:53:13.896924349 +0000 UTC m=+113.862961128"
Apr 17 07:53:14.232506 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:14.232470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:14.232692 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:14.232576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:14.232692 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.232628 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:53:14.232820 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.232696 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.232680189 +0000 UTC m=+118.198716943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : secret "router-metrics-certs-default" not found
Apr 17 07:53:14.232820 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.232709 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.23270352 +0000 UTC m=+118.198740275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt
Apr 17 07:53:14.435005 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:14.434974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:14.435203 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.435159 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:14.435203 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.435181 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-597b89995d-65v5w: secret "image-registry-tls" not found
Apr 17 07:53:14.435324 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:14.435250 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls podName:95a9e913-e1cb-49b7-ae5f-77299dc16875 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:18.435228159 +0000 UTC m=+118.401264931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls") pod "image-registry-597b89995d-65v5w" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875") : secret "image-registry-tls" not found
Apr 17 07:53:16.020331 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.020294 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"]
Apr 17 07:53:16.023632 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.023616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"
Apr 17 07:53:16.025930 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.025909 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7sxkb\""
Apr 17 07:53:16.026039 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.025932 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 07:53:16.026039 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.025909 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 07:53:16.034280 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.034258 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"]
Apr 17 07:53:16.148674 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.148647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klvg9\" (UniqueName: \"kubernetes.io/projected/93c5801c-fbc8-4496-b232-1869fa1f2267-kube-api-access-klvg9\") pod \"migrator-74bb7799d9-ccz4q\" (UID: \"93c5801c-fbc8-4496-b232-1869fa1f2267\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"
Apr 17 07:53:16.249767 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.249737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klvg9\" (UniqueName: \"kubernetes.io/projected/93c5801c-fbc8-4496-b232-1869fa1f2267-kube-api-access-klvg9\") pod \"migrator-74bb7799d9-ccz4q\" (UID: \"93c5801c-fbc8-4496-b232-1869fa1f2267\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"
Apr 17 07:53:16.256951 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.256930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klvg9\" (UniqueName: \"kubernetes.io/projected/93c5801c-fbc8-4496-b232-1869fa1f2267-kube-api-access-klvg9\") pod \"migrator-74bb7799d9-ccz4q\" (UID: \"93c5801c-fbc8-4496-b232-1869fa1f2267\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"
Apr 17 07:53:16.332046 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.331957 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"
Apr 17 07:53:16.443691 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.443661 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q"]
Apr 17 07:53:16.446924 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:16.446888 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c5801c_fbc8_4496_b232_1869fa1f2267.slice/crio-88d9a9e3df600e918fe5d78f29fc5829d3f9ca936cb975bae412f06557506b26 WatchSource:0}: Error finding container 88d9a9e3df600e918fe5d78f29fc5829d3f9ca936cb975bae412f06557506b26: Status 404 returned error can't find the container with id 88d9a9e3df600e918fe5d78f29fc5829d3f9ca936cb975bae412f06557506b26
Apr 17 07:53:16.460802 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.460784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cfdh5_a6642e2d-0acd-4e4b-8013-72420908123d/dns-node-resolver/0.log"
Apr 17 07:53:16.875671 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:16.875640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q" event={"ID":"93c5801c-fbc8-4496-b232-1869fa1f2267","Type":"ContainerStarted","Data":"88d9a9e3df600e918fe5d78f29fc5829d3f9ca936cb975bae412f06557506b26"}
Apr 17 07:53:17.261180 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:17.261135 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jf4kz_ca63f18c-753f-468e-b6ab-a7a1608ee9ef/node-ca/0.log"
Apr 17 07:53:17.879558 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:17.879486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q" event={"ID":"93c5801c-fbc8-4496-b232-1869fa1f2267","Type":"ContainerStarted","Data":"5b41ed09cb6e841081f82b9ed5d366c26712ad4afdbc67c4a2c6d601eb370de7"}
Apr 17 07:53:17.879558 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:17.879530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q" event={"ID":"93c5801c-fbc8-4496-b232-1869fa1f2267","Type":"ContainerStarted","Data":"a3c6a4c3da02e510b5925e1929069703a7664b9616319b5ae637ca736cf3c9a8"}
Apr 17 07:53:17.893655 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:17.893615 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ccz4q" podStartSLOduration=0.715601716 podStartE2EDuration="1.893596957s" podCreationTimestamp="2026-04-17 07:53:16 +0000 UTC" firstStartedPulling="2026-04-17 07:53:16.449382019 +0000 UTC m=+116.415418773" lastFinishedPulling="2026-04-17 07:53:17.627377256 +0000 UTC m=+117.593414014" observedRunningTime="2026-04-17 07:53:17.893124817 +0000 UTC m=+117.859161594" watchObservedRunningTime="2026-04-17 07:53:17.893596957 +0000 UTC m=+117.859633734"
Apr 17 07:53:18.266653 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:18.266621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:18.267011 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:18.266671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:18.267011 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.266771 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 07:53:18.267011 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.266782 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.266764754 +0000 UTC m=+126.232801516 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt
Apr 17 07:53:18.267011 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.266812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.266800969 +0000 UTC m=+126.232837724 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : secret "router-metrics-certs-default" not found
Apr 17 07:53:18.467764 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:18.467729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:18.467911 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.467871 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 07:53:18.467911 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.467890 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-597b89995d-65v5w: secret "image-registry-tls" not found
Apr 17 07:53:18.467977 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:18.467950 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls podName:95a9e913-e1cb-49b7-ae5f-77299dc16875 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:26.467936429 +0000 UTC m=+126.433973185 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls") pod "image-registry-597b89995d-65v5w" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875") : secret "image-registry-tls" not found
Apr 17 07:53:26.328320 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.328279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:26.328695 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.328347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:26.328695 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:26.328405 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle podName:c26b8a3e-8882-4232-9053-f698f9bb8392 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:42.328387476 +0000 UTC m=+142.294424235 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle") pod "router-default-5b66495d-5v9dj" (UID: "c26b8a3e-8882-4232-9053-f698f9bb8392") : configmap references non-existent config key: service-ca.crt
Apr 17 07:53:26.330585 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.330569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c26b8a3e-8882-4232-9053-f698f9bb8392-metrics-certs\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:26.529567 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.529536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:26.532037 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.532011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"image-registry-597b89995d-65v5w\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:26.566081 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.566050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6xw8q\""
Apr 17 07:53:26.573808 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.573788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:26.690694 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.690655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"]
Apr 17 07:53:26.694243 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:26.694209 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a9e913_e1cb_49b7_ae5f_77299dc16875.slice/crio-8e05d73744a108ab9cac17e7d815267116d1fc3eaa5b0ba872d7f4eaf54060e8 WatchSource:0}: Error finding container 8e05d73744a108ab9cac17e7d815267116d1fc3eaa5b0ba872d7f4eaf54060e8: Status 404 returned error can't find the container with id 8e05d73744a108ab9cac17e7d815267116d1fc3eaa5b0ba872d7f4eaf54060e8
Apr 17 07:53:26.902364 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.902331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597b89995d-65v5w" event={"ID":"95a9e913-e1cb-49b7-ae5f-77299dc16875","Type":"ContainerStarted","Data":"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"}
Apr 17 07:53:26.902364 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.902367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597b89995d-65v5w" event={"ID":"95a9e913-e1cb-49b7-ae5f-77299dc16875","Type":"ContainerStarted","Data":"8e05d73744a108ab9cac17e7d815267116d1fc3eaa5b0ba872d7f4eaf54060e8"}
Apr 17 07:53:26.902573 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.902457 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:53:26.921870 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:26.921825 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-597b89995d-65v5w" podStartSLOduration=16.921811931 podStartE2EDuration="16.921811931s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:26.920926806 +0000 UTC m=+126.886963583" watchObservedRunningTime="2026-04-17 07:53:26.921811931 +0000 UTC m=+126.887848708"
Apr 17 07:53:30.360530 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.360494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:53:30.362800 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.362772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85afae1f-542f-4ccc-b1bd-45ba0e0c418f-metrics-certs\") pod \"network-metrics-daemon-6vh7t\" (UID: \"85afae1f-542f-4ccc-b1bd-45ba0e0c418f\") " pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:53:30.641213 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.641126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjzzs\""
Apr 17 07:53:30.649828 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.649811 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vh7t"
Apr 17 07:53:30.760655 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.760628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vh7t"]
Apr 17 07:53:30.763740 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:30.763711 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85afae1f_542f_4ccc_b1bd_45ba0e0c418f.slice/crio-15fe816995135ed34fde12edb3dcb7710cafa5ad825fc186387e4c50d7f993a3 WatchSource:0}: Error finding container 15fe816995135ed34fde12edb3dcb7710cafa5ad825fc186387e4c50d7f993a3: Status 404 returned error can't find the container with id 15fe816995135ed34fde12edb3dcb7710cafa5ad825fc186387e4c50d7f993a3
Apr 17 07:53:30.912348 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:30.912316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vh7t" event={"ID":"85afae1f-542f-4ccc-b1bd-45ba0e0c418f","Type":"ContainerStarted","Data":"15fe816995135ed34fde12edb3dcb7710cafa5ad825fc186387e4c50d7f993a3"}
Apr 17 07:53:32.918904 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:32.918844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vh7t" event={"ID":"85afae1f-542f-4ccc-b1bd-45ba0e0c418f","Type":"ContainerStarted","Data":"4d1ebc2a165527753792fde9032825e79717424651d211de408f04b8d3e2cfa7"}
Apr 17 07:53:32.918904 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:32.918904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vh7t" event={"ID":"85afae1f-542f-4ccc-b1bd-45ba0e0c418f","Type":"ContainerStarted","Data":"da078bfea3163995deeab9d705c5877f9c4a1bf1a243f8df8ea57c1013ee04e3"}
Apr 17 07:53:41.117188 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.117114 2578 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-multus/network-metrics-daemon-6vh7t" podStartSLOduration=139.556094275 podStartE2EDuration="2m21.117098523s" podCreationTimestamp="2026-04-17 07:51:20 +0000 UTC" firstStartedPulling="2026-04-17 07:53:30.765601648 +0000 UTC m=+130.731638407" lastFinishedPulling="2026-04-17 07:53:32.3266059 +0000 UTC m=+132.292642655" observedRunningTime="2026-04-17 07:53:32.933559012 +0000 UTC m=+132.899595790" watchObservedRunningTime="2026-04-17 07:53:41.117098523 +0000 UTC m=+141.083135301" Apr 17 07:53:41.117994 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.117974 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"] Apr 17 07:53:41.122014 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.121983 2578 patch_prober.go:28] interesting pod/image-registry-597b89995d-65v5w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:53:41.122177 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.122048 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-597b89995d-65v5w" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:53:41.212036 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.212006 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54c7b778bc-f89zb"] Apr 17 07:53:41.215040 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.215022 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236553 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ae8faad5-6108-4619-82ef-7f707ede4f61-ca-trust-extracted\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-certificates\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-installation-pull-secrets\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbr9\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-kube-api-access-dhbr9\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236765 ip-10-0-134-176 kubenswrapper[2578]: 
I0417 07:53:41.236694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-image-registry-private-configuration\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236765 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-bound-sa-token\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236765 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-tls\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.236862 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.236800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-trusted-ca\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.245321 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.245298 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lqcdr"] Apr 17 07:53:41.248823 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:53:41.248806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.251829 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.251810 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6r99z\"" Apr 17 07:53:41.253268 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.253250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 07:53:41.253344 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.253305 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 07:53:41.255938 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.255899 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c7b778bc-f89zb"] Apr 17 07:53:41.271089 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.271067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lqcdr"] Apr 17 07:53:41.337964 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.337935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ae8faad5-6108-4619-82ef-7f707ede4f61-ca-trust-extracted\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.337964 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.337969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.337986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfwlq\" (UniqueName: \"kubernetes.io/projected/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-api-access-xfwlq\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-certificates\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-installation-pull-secrets\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbr9\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-kube-api-access-dhbr9\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: 
I0417 07:53:41.338103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-image-registry-private-configuration\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338165 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-crio-socket\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.338508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-bound-sa-token\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-data-volume\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.338508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-tls\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ae8faad5-6108-4619-82ef-7f707ede4f61-ca-trust-extracted\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338508 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-trusted-ca\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.338912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.338886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-certificates\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.339367 ip-10-0-134-176 kubenswrapper[2578]: 
I0417 07:53:41.339342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ae8faad5-6108-4619-82ef-7f707ede4f61-trusted-ca\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.340885 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.340858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-image-registry-private-configuration\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.340970 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.340899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-registry-tls\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.340970 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.340911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ae8faad5-6108-4619-82ef-7f707ede4f61-installation-pull-secrets\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.348657 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.348629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-bound-sa-token\") pod \"image-registry-54c7b778bc-f89zb\" (UID: 
\"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.349179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.349162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbr9\" (UniqueName: \"kubernetes.io/projected/ae8faad5-6108-4619-82ef-7f707ede4f61-kube-api-access-dhbr9\") pod \"image-registry-54c7b778bc-f89zb\" (UID: \"ae8faad5-6108-4619-82ef-7f707ede4f61\") " pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.439438 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439651 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-crio-socket\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439651 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-data-volume\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439651 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439651 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-crio-socket\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439651 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfwlq\" (UniqueName: \"kubernetes.io/projected/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-api-access-xfwlq\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.439877 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.439859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-data-volume\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.440042 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.440024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.441946 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:53:41.441926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.449407 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.449385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfwlq\" (UniqueName: \"kubernetes.io/projected/b3c626ad-c69c-40d3-87c2-df8c2f8dc567-kube-api-access-xfwlq\") pod \"insights-runtime-extractor-lqcdr\" (UID: \"b3c626ad-c69c-40d3-87c2-df8c2f8dc567\") " pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.523957 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.523919 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:41.558228 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.558196 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lqcdr" Apr 17 07:53:41.662232 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.662210 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54c7b778bc-f89zb"] Apr 17 07:53:41.664948 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:41.664924 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8faad5_6108_4619_82ef_7f707ede4f61.slice/crio-c861f31b39e3cca823113aece897e75bbf4da018c1c95964abc485dcc8abedd2 WatchSource:0}: Error finding container c861f31b39e3cca823113aece897e75bbf4da018c1c95964abc485dcc8abedd2: Status 404 returned error can't find the container with id c861f31b39e3cca823113aece897e75bbf4da018c1c95964abc485dcc8abedd2 Apr 17 07:53:41.686666 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.686642 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lqcdr"] Apr 17 07:53:41.700094 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:41.700058 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c626ad_c69c_40d3_87c2_df8c2f8dc567.slice/crio-a3f03690bc4f45b466adfed9d86561d1d67d3a954e541535428be934ff8915a4 WatchSource:0}: Error finding container a3f03690bc4f45b466adfed9d86561d1d67d3a954e541535428be934ff8915a4: Status 404 returned error can't find the container with id a3f03690bc4f45b466adfed9d86561d1d67d3a954e541535428be934ff8915a4 Apr 17 07:53:41.943460 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.943424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lqcdr" event={"ID":"b3c626ad-c69c-40d3-87c2-df8c2f8dc567","Type":"ContainerStarted","Data":"29a9b1e76ce2e8d696f7dd5b4091e478a7a30c8a0060352456484d407472cb23"} Apr 17 07:53:41.943460 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:53:41.943466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lqcdr" event={"ID":"b3c626ad-c69c-40d3-87c2-df8c2f8dc567","Type":"ContainerStarted","Data":"a3f03690bc4f45b466adfed9d86561d1d67d3a954e541535428be934ff8915a4"} Apr 17 07:53:41.944688 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.944660 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" event={"ID":"ae8faad5-6108-4619-82ef-7f707ede4f61","Type":"ContainerStarted","Data":"4d35337ce108ba589d4e0347bb6d6c30c34acc8b04adcb6d27325aff78d8589f"} Apr 17 07:53:41.944688 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.944687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" event={"ID":"ae8faad5-6108-4619-82ef-7f707ede4f61","Type":"ContainerStarted","Data":"c861f31b39e3cca823113aece897e75bbf4da018c1c95964abc485dcc8abedd2"} Apr 17 07:53:41.944843 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:41.944802 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" Apr 17 07:53:42.346180 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.346126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: \"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj" Apr 17 07:53:42.346717 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.346695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26b8a3e-8882-4232-9053-f698f9bb8392-service-ca-bundle\") pod \"router-default-5b66495d-5v9dj\" (UID: 
\"c26b8a3e-8882-4232-9053-f698f9bb8392\") " pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:42.553593 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.553565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-64bdg\""
Apr 17 07:53:42.561713 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.561694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:42.689430 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.689371 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podStartSLOduration=1.689354114 podStartE2EDuration="1.689354114s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:41.963335519 +0000 UTC m=+141.929372287" watchObservedRunningTime="2026-04-17 07:53:42.689354114 +0000 UTC m=+142.655390890"
Apr 17 07:53:42.690100 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.690080 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5b66495d-5v9dj"]
Apr 17 07:53:42.693153 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:42.693113 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26b8a3e_8882_4232_9053_f698f9bb8392.slice/crio-22c139e61b232a4a56ec797c9db22bf227608c4392be653a29e4f3a6cb8fd147 WatchSource:0}: Error finding container 22c139e61b232a4a56ec797c9db22bf227608c4392be653a29e4f3a6cb8fd147: Status 404 returned error can't find the container with id 22c139e61b232a4a56ec797c9db22bf227608c4392be653a29e4f3a6cb8fd147
Apr 17 07:53:42.949253 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.949217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b66495d-5v9dj" event={"ID":"c26b8a3e-8882-4232-9053-f698f9bb8392","Type":"ContainerStarted","Data":"52d9daaa90e54f5bfe561a9713748904ef27ae3f4578a185c9625c3175f40a2d"}
Apr 17 07:53:42.949253 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.949259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5b66495d-5v9dj" event={"ID":"c26b8a3e-8882-4232-9053-f698f9bb8392","Type":"ContainerStarted","Data":"22c139e61b232a4a56ec797c9db22bf227608c4392be653a29e4f3a6cb8fd147"}
Apr 17 07:53:42.951183 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.951137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lqcdr" event={"ID":"b3c626ad-c69c-40d3-87c2-df8c2f8dc567","Type":"ContainerStarted","Data":"58070ba5db65b421690821bd033eb3fdaf18e61f5f1813149824b071171696b8"}
Apr 17 07:53:42.969545 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:42.969502 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5b66495d-5v9dj" podStartSLOduration=32.969489877 podStartE2EDuration="32.969489877s" podCreationTimestamp="2026-04-17 07:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:53:42.968521082 +0000 UTC m=+142.934557861" watchObservedRunningTime="2026-04-17 07:53:42.969489877 +0000 UTC m=+142.935526653"
Apr 17 07:53:43.562724 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:43.562687 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:43.565779 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:43.565753 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:43.953434 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:43.953410 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:43.954635 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:43.954616 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5b66495d-5v9dj"
Apr 17 07:53:44.957581 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:44.957542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lqcdr" event={"ID":"b3c626ad-c69c-40d3-87c2-df8c2f8dc567","Type":"ContainerStarted","Data":"f74de487bd65f79b9de4b3ad3bde554e919e8586dc7097ccf0b7d68974d723d3"}
Apr 17 07:53:44.974772 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:44.974725 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lqcdr" podStartSLOduration=1.7764471290000001 podStartE2EDuration="3.974711808s" podCreationTimestamp="2026-04-17 07:53:41 +0000 UTC" firstStartedPulling="2026-04-17 07:53:41.749473661 +0000 UTC m=+141.715510417" lastFinishedPulling="2026-04-17 07:53:43.94773834 +0000 UTC m=+143.913775096" observedRunningTime="2026-04-17 07:53:44.973790736 +0000 UTC m=+144.939827513" watchObservedRunningTime="2026-04-17 07:53:44.974711808 +0000 UTC m=+144.940748585"
Apr 17 07:53:51.121920 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:51.121891 2578 patch_prober.go:28] interesting pod/image-registry-597b89995d-65v5w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:53:51.122290 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:51.121940 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-597b89995d-65v5w" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:53:54.752535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.752503 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"]
Apr 17 07:53:54.757183 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.757166 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.759366 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.759343 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 07:53:54.759366 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.759360 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zxkj7\""
Apr 17 07:53:54.759520 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.759382 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 07:53:54.759520 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.759402 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 07:53:54.760153 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.760125 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 07:53:54.760221 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.760182 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 07:53:54.766174 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.766135 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"]
Apr 17 07:53:54.781593 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.781575 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-57md7"]
Apr 17 07:53:54.784804 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.784786 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.787125 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.787106 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 07:53:54.787486 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.787470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 07:53:54.787558 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.787470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s4jk5\""
Apr 17 07:53:54.787711 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.787694 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 07:53:54.837557 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-root\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-metrics-client-ca\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.837654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjnzw\" (UniqueName: \"kubernetes.io/projected/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-kube-api-access-tjnzw\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837782 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-wtmp\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837782 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-sys\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837782 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837766 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klctg\" (UniqueName: \"kubernetes.io/projected/87604ba9-674a-4721-a9dd-ffc9217f96aa-kube-api-access-klctg\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.837872 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-accelerators-collector-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837872 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.837872 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87604ba9-674a-4721-a9dd-ffc9217f96aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.837958 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837876 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-textfile\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.837958 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.837894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939159 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-wtmp\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939252 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-sys\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939252 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klctg\" (UniqueName: \"kubernetes.io/projected/87604ba9-674a-4721-a9dd-ffc9217f96aa-kube-api-access-klctg\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.939329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-accelerators-collector-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-wtmp\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-sys\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939329 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.939497 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87604ba9-674a-4721-a9dd-ffc9217f96aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.939497 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-textfile\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-root\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939684 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-metrics-client-ca\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939684 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-root\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939684 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:54.939637 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 07:53:54.939684 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:54.939702 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls podName:4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:55.439681303 +0000 UTC m=+155.405718071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls") pod "node-exporter-57md7" (UID: "4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0") : secret "node-exporter-tls" not found
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:54.939706 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:54.939764 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls podName:87604ba9-674a-4721-a9dd-ffc9217f96aa nodeName:}" failed. No retries permitted until 2026-04-17 07:53:55.439747359 +0000 UTC m=+155.405784127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-6s55s" (UID: "87604ba9-674a-4721-a9dd-ffc9217f96aa") : secret "openshift-state-metrics-tls" not found
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjnzw\" (UniqueName: \"kubernetes.io/projected/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-kube-api-access-tjnzw\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.939879 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-textfile\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.940119 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.939923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-accelerators-collector-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.940189 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.940171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87604ba9-674a-4721-a9dd-ffc9217f96aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.940256 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.940236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-metrics-client-ca\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.941837 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.941816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:54.941924 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.941838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.946273 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.946250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klctg\" (UniqueName: \"kubernetes.io/projected/87604ba9-674a-4721-a9dd-ffc9217f96aa-kube-api-access-klctg\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:54.947688 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:54.947670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjnzw\" (UniqueName: \"kubernetes.io/projected/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-kube-api-access-tjnzw\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:55.443708 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.443678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:55.443874 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.443723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:55.446001 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.445972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/87604ba9-674a-4721-a9dd-ffc9217f96aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6s55s\" (UID: \"87604ba9-674a-4721-a9dd-ffc9217f96aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:55.446155 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.446013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0-node-exporter-tls\") pod \"node-exporter-57md7\" (UID: \"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0\") " pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:55.666481 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.666451 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"
Apr 17 07:53:55.693633 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.693596 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-57md7"
Apr 17 07:53:55.705609 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:55.705575 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2eb30e_df2b_4532_9ee5_3cd6e4fb18f0.slice/crio-0e944670ebf21d3941dc17022f8cbb7bd0f7405696c30365cbbf8778dbd9cde7 WatchSource:0}: Error finding container 0e944670ebf21d3941dc17022f8cbb7bd0f7405696c30365cbbf8778dbd9cde7: Status 404 returned error can't find the container with id 0e944670ebf21d3941dc17022f8cbb7bd0f7405696c30365cbbf8778dbd9cde7
Apr 17 07:53:55.787720 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.787691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s"]
Apr 17 07:53:55.790634 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:55.790604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87604ba9_674a_4721_a9dd_ffc9217f96aa.slice/crio-3237efba7fc01db49022ab9cd1c7cf69dad3a62d68147bc5b7e243357b144e99 WatchSource:0}: Error finding container 3237efba7fc01db49022ab9cd1c7cf69dad3a62d68147bc5b7e243357b144e99: Status 404 returned error can't find the container with id 3237efba7fc01db49022ab9cd1c7cf69dad3a62d68147bc5b7e243357b144e99
Apr 17 07:53:55.983623 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.983539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-57md7" event={"ID":"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0","Type":"ContainerStarted","Data":"0e944670ebf21d3941dc17022f8cbb7bd0f7405696c30365cbbf8778dbd9cde7"}
Apr 17 07:53:55.985032 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.985010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s" event={"ID":"87604ba9-674a-4721-a9dd-ffc9217f96aa","Type":"ContainerStarted","Data":"90f4028bda2ef84242d2dc0a901c97dfd02521550d3dfb1d0f96e4dd84f90a56"}
Apr 17 07:53:55.985123 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.985039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s" event={"ID":"87604ba9-674a-4721-a9dd-ffc9217f96aa","Type":"ContainerStarted","Data":"69e282439f6101564d748b20849c4c39396b91db73fa0df3f1d943d79b0e0a41"}
Apr 17 07:53:55.985123 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:55.985049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s" event={"ID":"87604ba9-674a-4721-a9dd-ffc9217f96aa","Type":"ContainerStarted","Data":"3237efba7fc01db49022ab9cd1c7cf69dad3a62d68147bc5b7e243357b144e99"}
Apr 17 07:53:56.989463 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:56.989376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-57md7" event={"ID":"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0","Type":"ContainerStarted","Data":"513d128df7477f8234e89e3c5baf32783595529788294876ddd0fbcf6bec20de"}
Apr 17 07:53:57.511035 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:57.510989 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xn7q4" podUID="86725d0f-c4d4-499a-96f9-106af3387cc2"
Apr 17 07:53:57.535101 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:53:57.535071 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-k99jf" podUID="ecab5c24-0616-4d6e-93a1-4c29b1548a0d"
Apr 17 07:53:57.993330 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:57.993297 2578 generic.go:358] "Generic (PLEG): container finished" podID="4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0" containerID="513d128df7477f8234e89e3c5baf32783595529788294876ddd0fbcf6bec20de" exitCode=0
Apr 17 07:53:57.993751 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:57.993391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-57md7" event={"ID":"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0","Type":"ContainerDied","Data":"513d128df7477f8234e89e3c5baf32783595529788294876ddd0fbcf6bec20de"}
Apr 17 07:53:57.995205 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:57.995179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s" event={"ID":"87604ba9-674a-4721-a9dd-ffc9217f96aa","Type":"ContainerStarted","Data":"b2ce89514b362492dcb39b69f10be2a9601a02b64de1136830f335f5a26acf99"}
Apr 17 07:53:57.995327 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:57.995214 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:53:58.038994 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:58.038949 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6s55s" podStartSLOduration=2.8670966890000003 podStartE2EDuration="4.038934557s" podCreationTimestamp="2026-04-17 07:53:54 +0000 UTC" firstStartedPulling="2026-04-17 07:53:55.901810789 +0000 UTC m=+155.867847553" lastFinishedPulling="2026-04-17 07:53:57.073648667 +0000 UTC m=+157.039685421" observedRunningTime="2026-04-17 07:53:58.037523608 +0000 UTC m=+158.003560386" watchObservedRunningTime="2026-04-17 07:53:58.038934557 +0000 UTC m=+158.004971336"
Apr 17 07:53:58.999322 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:58.999288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-57md7" event={"ID":"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0","Type":"ContainerStarted","Data":"df8e50ba8443f523b3d542c63aab51649fd14aae4ff1c52822ee64bcfdb56a94"}
Apr 17 07:53:58.999322 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:58.999327 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-57md7" event={"ID":"4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0","Type":"ContainerStarted","Data":"7f594b41d4544b10fd5329ccfffb3635f41f5c2441d53656fa7dd77eeb92b74a"}
Apr 17 07:53:59.017798 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.017754 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-57md7" podStartSLOduration=3.984019892 podStartE2EDuration="5.017741147s" podCreationTimestamp="2026-04-17 07:53:54 +0000 UTC" firstStartedPulling="2026-04-17 07:53:55.70709774 +0000 UTC m=+155.673134494" lastFinishedPulling="2026-04-17 07:53:56.740818983 +0000 UTC m=+156.706855749" observedRunningTime="2026-04-17 07:53:59.017196003 +0000 UTC m=+158.983232780" watchObservedRunningTime="2026-04-17 07:53:59.017741147 +0000 UTC m=+158.983777923"
Apr 17 07:53:59.554982 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.554952 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"]
Apr 17 07:53:59.558208 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.558193 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"
Apr 17 07:53:59.560597 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.560574 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 07:53:59.560702 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.560596 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gmgxx\""
Apr 17 07:53:59.565909 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.565885 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"]
Apr 17 07:53:59.678883 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.678845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zt672\" (UID: \"1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"
Apr 17 07:53:59.780100 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.780066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zt672\" (UID: \"1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"
Apr 17 07:53:59.782641 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.782620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zt672\" (UID: \"1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"
Apr 17 07:53:59.867971 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.867901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"
Apr 17 07:53:59.981857 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.981711 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zt672"]
Apr 17 07:53:59.984362 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:53:59.984321 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1153f1d4_8d0e_472e_a2d6_5c7105b8aa2d.slice/crio-902651a1a8ed0ff221189acb69aa526097005809e349cc345350c32b7fe3088e WatchSource:0}: Error finding container 902651a1a8ed0ff221189acb69aa526097005809e349cc345350c32b7fe3088e: Status 404 returned error can't find the container with id 902651a1a8ed0ff221189acb69aa526097005809e349cc345350c32b7fe3088e
Apr 17 07:53:59.994336 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.994314 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54f6d8586d-x4jzp"]
Apr 17 07:53:59.998953 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:53:59.998935 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.002206 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.002181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 07:54:00.002206 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.002202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 07:54:00.002663 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.002300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 07:54:00.002663 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.002300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 07:54:00.003411 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.003059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-zrtcd\"" Apr 17 07:54:00.003411 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.003188 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 07:54:00.003411 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.003235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672" event={"ID":"1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d","Type":"ContainerStarted","Data":"902651a1a8ed0ff221189acb69aa526097005809e349cc345350c32b7fe3088e"} Apr 17 07:54:00.010912 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.010893 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54f6d8586d-x4jzp"] Apr 17 07:54:00.012834 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:54:00.012816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 07:54:00.082465 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-federate-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-serving-certs-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082591 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: 
\"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082728 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082826 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082888 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp7vg\" (UniqueName: \"kubernetes.io/projected/3b4ef6a3-477e-4314-8383-f7152dad52b5-kube-api-access-qp7vg\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.082888 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.082867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-metrics-client-ca\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183387 ip-10-0-134-176 
kubenswrapper[2578]: I0417 07:54:00.183352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-federate-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-serving-certs-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183697 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp7vg\" (UniqueName: \"kubernetes.io/projected/3b4ef6a3-477e-4314-8383-f7152dad52b5-kube-api-access-qp7vg\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183732 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-metrics-client-ca\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.183790 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.183768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.184868 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.184831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-serving-certs-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " 
pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.184997 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.184885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-metrics-client-ca\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.185061 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.184993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.186109 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.186077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.186249 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.186231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.186328 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.186232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-federate-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.186328 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.186289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b4ef6a3-477e-4314-8383-f7152dad52b5-telemeter-client-tls\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.191351 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.191333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp7vg\" (UniqueName: \"kubernetes.io/projected/3b4ef6a3-477e-4314-8383-f7152dad52b5-kube-api-access-qp7vg\") pod \"telemeter-client-54f6d8586d-x4jzp\" (UID: \"3b4ef6a3-477e-4314-8383-f7152dad52b5\") " pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.312234 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.312200 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" Apr 17 07:54:00.432739 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:00.432703 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54f6d8586d-x4jzp"] Apr 17 07:54:00.435780 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:54:00.435706 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4ef6a3_477e_4314_8383_f7152dad52b5.slice/crio-8a18ff7e18d3bda9fc10e20d3ec558888457e29923bb3c2c393a6c3da85c6b19 WatchSource:0}: Error finding container 8a18ff7e18d3bda9fc10e20d3ec558888457e29923bb3c2c393a6c3da85c6b19: Status 404 returned error can't find the container with id 8a18ff7e18d3bda9fc10e20d3ec558888457e29923bb3c2c393a6c3da85c6b19 Apr 17 07:54:01.008005 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.007963 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" event={"ID":"3b4ef6a3-477e-4314-8383-f7152dad52b5","Type":"ContainerStarted","Data":"8a18ff7e18d3bda9fc10e20d3ec558888457e29923bb3c2c393a6c3da85c6b19"} Apr 17 07:54:01.049680 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.049647 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:01.053846 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.053818 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.056271 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056240 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 07:54:01.056407 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056293 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056683 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056699 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056710 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056721 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056760 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-66tugsbqr2jc7\"" Apr 17 07:54:01.056941 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.056900 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 07:54:01.057421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.057063 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jxmxj\"" Apr 17 07:54:01.057421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.057104 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 07:54:01.057421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.057315 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 07:54:01.057421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.057323 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 07:54:01.057421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.057315 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 07:54:01.062109 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.062056 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 07:54:01.064326 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.064114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 07:54:01.076544 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.076523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:01.092557 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhd6k\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-kube-api-access-lhd6k\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
07:54:01.092679 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092574 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092679 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092679 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.092858 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093064 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093064 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093064 
ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.092944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093064 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093064 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-web-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093349 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093349 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093349 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-config-out\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093349 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.093559 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.093360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.123192 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.123160 2578 patch_prober.go:28] interesting pod/image-registry-597b89995d-65v5w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:54:01.123306 ip-10-0-134-176 kubenswrapper[2578]: I0417 
07:54:01.123217 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-597b89995d-65v5w" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:54:01.194616 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-config-out\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194616 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhd6k\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-kube-api-access-lhd6k\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194679 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.194789 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 
ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.194986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-web-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195179 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195654 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195706 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.195757 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.196471 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.195997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.196584 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.196519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199180 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.198388 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e362f670-e98c-4474-a28c-786ee6ce1475-config-out\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199180 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.198720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199180 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.199097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199575 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.199551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199696 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.199678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.199920 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.199873 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.200021 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.199940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.200178 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.200133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e362f670-e98c-4474-a28c-786ee6ce1475-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.200256 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.200199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.200500 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.200482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-web-config\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.200569 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.200522 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.201044 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.201022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e362f670-e98c-4474-a28c-786ee6ce1475-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.202957 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.202936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhd6k\" (UniqueName: \"kubernetes.io/projected/e362f670-e98c-4474-a28c-786ee6ce1475-kube-api-access-lhd6k\") pod \"prometheus-k8s-0\" (UID: \"e362f670-e98c-4474-a28c-786ee6ce1475\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.368018 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.367982 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 07:54:01.497343 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.497316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 07:54:01.499188 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:54:01.499160 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode362f670_e98c_4474_a28c_786ee6ce1475.slice/crio-84964a4f6915d401d152442797e0311e6f5f00a98d7f8f67b28efc3881d9271f WatchSource:0}: Error finding container 84964a4f6915d401d152442797e0311e6f5f00a98d7f8f67b28efc3881d9271f: Status 404 returned error can't find the container with id 84964a4f6915d401d152442797e0311e6f5f00a98d7f8f67b28efc3881d9271f Apr 17 07:54:01.528715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.528640 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:54:01.528715 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:01.528695 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:54:02.012602 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.012555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"84964a4f6915d401d152442797e0311e6f5f00a98d7f8f67b28efc3881d9271f"} Apr 17 07:54:02.014243 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.014210 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672" event={"ID":"1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d","Type":"ContainerStarted","Data":"03b4ed5391cc2806cd50e4bd91f2925d65c23bd00e7bf844e7e11921188d8d3c"} Apr 17 07:54:02.014510 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.014489 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672" Apr 17 07:54:02.020755 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.020720 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672" Apr 17 07:54:02.028479 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.028430 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zt672" podStartSLOduration=1.7718797 podStartE2EDuration="3.028419587s" podCreationTimestamp="2026-04-17 07:53:59 +0000 UTC" firstStartedPulling="2026-04-17 07:53:59.986284548 +0000 UTC m=+159.952321303" lastFinishedPulling="2026-04-17 07:54:01.242824435 +0000 UTC m=+161.208861190" observedRunningTime="2026-04-17 07:54:02.028014734 +0000 UTC m=+161.994051512" watchObservedRunningTime="2026-04-17 07:54:02.028419587 +0000 UTC m=+161.994456398" Apr 17 07:54:02.507883 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.507852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:54:02.507963 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.507902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod 
\"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:54:02.510689 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.510667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecab5c24-0616-4d6e-93a1-4c29b1548a0d-cert\") pod \"ingress-canary-k99jf\" (UID: \"ecab5c24-0616-4d6e-93a1-4c29b1548a0d\") " pod="openshift-ingress-canary/ingress-canary-k99jf" Apr 17 07:54:02.510854 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.510833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86725d0f-c4d4-499a-96f9-106af3387cc2-metrics-tls\") pod \"dns-default-xn7q4\" (UID: \"86725d0f-c4d4-499a-96f9-106af3387cc2\") " pod="openshift-dns/dns-default-xn7q4" Apr 17 07:54:02.798994 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.798964 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mw88v\"" Apr 17 07:54:02.807270 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.807243 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xn7q4" Apr 17 07:54:02.955768 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.955729 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 07:54:02.955934 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:02.955780 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 07:54:03.019212 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:03.019174 2578 generic.go:358] "Generic (PLEG): container finished" podID="e362f670-e98c-4474-a28c-786ee6ce1475" containerID="de1dc2c87713bfbb2b40aeffbd995814071d09099f405a1e31be80b8845c3fa4" exitCode=0 Apr 17 07:54:03.019687 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:03.019245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerDied","Data":"de1dc2c87713bfbb2b40aeffbd995814071d09099f405a1e31be80b8845c3fa4"} Apr 17 07:54:03.152403 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:03.152378 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xn7q4"] Apr 17 07:54:03.157276 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:54:03.157252 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86725d0f_c4d4_499a_96f9_106af3387cc2.slice/crio-61625b5b1f8321228993a50a5a84a9cee444306eb1e4f2d0c47f9f4b073845de WatchSource:0}: Error finding container 
61625b5b1f8321228993a50a5a84a9cee444306eb1e4f2d0c47f9f4b073845de: Status 404 returned error can't find the container with id 61625b5b1f8321228993a50a5a84a9cee444306eb1e4f2d0c47f9f4b073845de Apr 17 07:54:04.023588 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:04.023551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xn7q4" event={"ID":"86725d0f-c4d4-499a-96f9-106af3387cc2","Type":"ContainerStarted","Data":"61625b5b1f8321228993a50a5a84a9cee444306eb1e4f2d0c47f9f4b073845de"} Apr 17 07:54:04.025863 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:04.025828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" event={"ID":"3b4ef6a3-477e-4314-8383-f7152dad52b5","Type":"ContainerStarted","Data":"262b5e2cd1d4de85c88e6dc20e11d3df9b19f41384c3405f3efd793c19933f25"} Apr 17 07:54:04.025984 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:04.025872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" event={"ID":"3b4ef6a3-477e-4314-8383-f7152dad52b5","Type":"ContainerStarted","Data":"42aac603a8edb9f64d38c4e1d87bf4f2bee50a9180853fe8224d0b128f24d472"} Apr 17 07:54:04.025984 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:04.025883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" event={"ID":"3b4ef6a3-477e-4314-8383-f7152dad52b5","Type":"ContainerStarted","Data":"58830cd73ca4555e08d3cfb2961cccff0e8c5cd11540407fd9f10adbf91a0162"} Apr 17 07:54:04.047158 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:04.046931 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54f6d8586d-x4jzp" podStartSLOduration=2.402826634 podStartE2EDuration="5.046911946s" podCreationTimestamp="2026-04-17 07:53:59 +0000 UTC" firstStartedPulling="2026-04-17 07:54:00.438081998 +0000 UTC m=+160.404118754" 
lastFinishedPulling="2026-04-17 07:54:03.082167287 +0000 UTC m=+163.048204066" observedRunningTime="2026-04-17 07:54:04.046135153 +0000 UTC m=+164.012171931" watchObservedRunningTime="2026-04-17 07:54:04.046911946 +0000 UTC m=+164.012948725" Apr 17 07:54:06.033724 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.033695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"22c3739e7041011fadba39aaf5a4b39a4a65e7089292673f98927ef7c434b8e4"} Apr 17 07:54:06.137501 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.137455 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-597b89995d-65v5w" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry" containerID="cri-o://98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070" gracePeriod=30 Apr 17 07:54:06.366090 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.366068 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-597b89995d-65v5w" Apr 17 07:54:06.444225 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444197 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444377 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444249 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djvrk\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444377 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444282 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444467 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444397 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444542 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444525 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: 
\"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444578 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444560 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444624 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444586 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444624 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") pod \"95a9e913-e1cb-49b7-ae5f-77299dc16875\" (UID: \"95a9e913-e1cb-49b7-ae5f-77299dc16875\") " Apr 17 07:54:06.444722 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444661 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:06.444884 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444864 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-certificates\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\"" Apr 17 07:54:06.444971 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.444924 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 07:54:06.446785 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.446758 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 07:54:06.446933 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.446899 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:06.447034 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.446971 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk" (OuterVolumeSpecName: "kube-api-access-djvrk") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "kube-api-access-djvrk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:06.447034 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.446976 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:54:06.447213 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.447190 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:54:06.452762 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.452740 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "95a9e913-e1cb-49b7-ae5f-77299dc16875" (UID: "95a9e913-e1cb-49b7-ae5f-77299dc16875"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 07:54:06.545322 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545291 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-djvrk\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-kube-api-access-djvrk\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545328 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95a9e913-e1cb-49b7-ae5f-77299dc16875-trusted-ca\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545349 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-image-registry-private-configuration\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545365 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95a9e913-e1cb-49b7-ae5f-77299dc16875-ca-trust-extracted\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545380 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-bound-sa-token\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545395 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95a9e913-e1cb-49b7-ae5f-77299dc16875-installation-pull-secrets\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:06.545432 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:06.545409 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95a9e913-e1cb-49b7-ae5f-77299dc16875-registry-tls\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:54:07.039694 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.039651 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"acbfd2218a4831cc83b4daa6ce5380c6ac1c3d9d22da58ec92838a3810316a32"}
Apr 17 07:54:07.040796 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.040765 2578 generic.go:358] "Generic (PLEG): container finished" podID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerID="98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070" exitCode=0
Apr 17 07:54:07.040925 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.040832 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-597b89995d-65v5w"
Apr 17 07:54:07.040925 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.040847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597b89995d-65v5w" event={"ID":"95a9e913-e1cb-49b7-ae5f-77299dc16875","Type":"ContainerDied","Data":"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"}
Apr 17 07:54:07.040925 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.040893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-597b89995d-65v5w" event={"ID":"95a9e913-e1cb-49b7-ae5f-77299dc16875","Type":"ContainerDied","Data":"8e05d73744a108ab9cac17e7d815267116d1fc3eaa5b0ba872d7f4eaf54060e8"}
Apr 17 07:54:07.040925 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.040916 2578 scope.go:117] "RemoveContainer" containerID="98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"
Apr 17 07:54:07.042791 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.042765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xn7q4" event={"ID":"86725d0f-c4d4-499a-96f9-106af3387cc2","Type":"ContainerStarted","Data":"3b866497924775fee5ceba42979c5f710740acab15b829b12d60f6463b6db209"}
Apr 17 07:54:07.042890 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.042801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xn7q4" event={"ID":"86725d0f-c4d4-499a-96f9-106af3387cc2","Type":"ContainerStarted","Data":"d74fc43d2156533a195e5574ab5884505563c1bdb7def9a199e7a6ebaefbcb78"}
Apr 17 07:54:07.043035 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.043001 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:54:07.050395 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.050372 2578 scope.go:117] "RemoveContainer" containerID="98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"
Apr 17 07:54:07.050703 ip-10-0-134-176 kubenswrapper[2578]: E0417 07:54:07.050678 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070\": container with ID starting with 98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070 not found: ID does not exist" containerID="98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"
Apr 17 07:54:07.050784 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.050717 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070"} err="failed to get container status \"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070\": rpc error: code = NotFound desc = could not find container \"98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070\": container with ID starting with 98094e0cdbc915b67b1139814d86f1596779d324fee03ff9a63cb80161f95070 not found: ID does not exist"
Apr 17 07:54:07.059709 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.059654 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"]
Apr 17 07:54:07.062927 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.062903 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-597b89995d-65v5w"]
Apr 17 07:54:07.085292 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:07.085238 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xn7q4" podStartSLOduration=130.302774251 podStartE2EDuration="2m13.085221097s" podCreationTimestamp="2026-04-17 07:51:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:03.159612613 +0000 UTC m=+163.125649368" lastFinishedPulling="2026-04-17 07:54:05.942059452 +0000 UTC m=+165.908096214" observedRunningTime="2026-04-17 07:54:07.085160743 +0000 UTC m=+167.051197517" watchObservedRunningTime="2026-04-17 07:54:07.085221097 +0000 UTC m=+167.051257875"
Apr 17 07:54:08.049535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.049496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"e34be885e3851fb72e3f8232cefcce281d03e03f21203eecfb57fc8af91b69fb"}
Apr 17 07:54:08.049535 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.049536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"b370356d1451f25d9b54bc227e376ad37280eb488b147f8c98404c5253855e21"}
Apr 17 07:54:08.049932 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.049549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"f33c3893f6bb4790c027a5476b98760445fd04eb223f039e886b99fa106a776a"}
Apr 17 07:54:08.049932 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.049560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e362f670-e98c-4474-a28c-786ee6ce1475","Type":"ContainerStarted","Data":"e1eb4d498e40bf0a9aee587515ec4d43a03f47d9d2b2e1eeed1d1958cacc4adc"}
Apr 17 07:54:08.533356 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.533323 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" path="/var/lib/kubelet/pods/95a9e913-e1cb-49b7-ae5f-77299dc16875/volumes"
Apr 17 07:54:08.533670 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.533657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:54:08.536433 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.536419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9vrx5\""
Apr 17 07:54:08.544407 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.544393 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k99jf"
Apr 17 07:54:08.683873 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.683812 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.447588691 podStartE2EDuration="7.683790966s" podCreationTimestamp="2026-04-17 07:54:01 +0000 UTC" firstStartedPulling="2026-04-17 07:54:01.501367543 +0000 UTC m=+161.467404298" lastFinishedPulling="2026-04-17 07:54:07.737569816 +0000 UTC m=+167.703606573" observedRunningTime="2026-04-17 07:54:08.097483767 +0000 UTC m=+168.063520545" watchObservedRunningTime="2026-04-17 07:54:08.683790966 +0000 UTC m=+168.649827747"
Apr 17 07:54:08.684529 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:08.684508 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k99jf"]
Apr 17 07:54:08.686897 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:54:08.686876 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecab5c24_0616_4d6e_93a1_4c29b1548a0d.slice/crio-0aac68f578b563f42022c962e9ac97cfa567331206a2005654ce2032c1fb91ae WatchSource:0}: Error finding container 0aac68f578b563f42022c962e9ac97cfa567331206a2005654ce2032c1fb91ae: Status 404 returned error can't find the container with id 0aac68f578b563f42022c962e9ac97cfa567331206a2005654ce2032c1fb91ae
Apr 17 07:54:09.054334 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:09.054296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k99jf" event={"ID":"ecab5c24-0616-4d6e-93a1-4c29b1548a0d","Type":"ContainerStarted","Data":"0aac68f578b563f42022c962e9ac97cfa567331206a2005654ce2032c1fb91ae"}
Apr 17 07:54:11.062749 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:11.062710 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k99jf" event={"ID":"ecab5c24-0616-4d6e-93a1-4c29b1548a0d","Type":"ContainerStarted","Data":"6c72ed72e6177d65c950d45c7553ccaec0b1c069d0d096b2fbd389291f85fce8"}
Apr 17 07:54:11.080166 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:11.080101 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k99jf" podStartSLOduration=135.608834169 podStartE2EDuration="2m17.08008485s" podCreationTimestamp="2026-04-17 07:51:54 +0000 UTC" firstStartedPulling="2026-04-17 07:54:08.689331583 +0000 UTC m=+168.655368339" lastFinishedPulling="2026-04-17 07:54:10.160582265 +0000 UTC m=+170.126619020" observedRunningTime="2026-04-17 07:54:11.079158276 +0000 UTC m=+171.045195048" watchObservedRunningTime="2026-04-17 07:54:11.08008485 +0000 UTC m=+171.046121666"
Apr 17 07:54:11.368757 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:11.368659 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:54:11.527970 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:11.527942 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:54:11.528112 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:11.527988 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:54:12.955365 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:12.955327 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:54:12.955797 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:12.955391 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:54:17.051408 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:17.051374 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xn7q4"
Apr 17 07:54:21.528576 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.528540 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:54:21.528949 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.528600 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:54:21.528949 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.528637 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb"
Apr 17 07:54:21.529213 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.529182 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"4d35337ce108ba589d4e0347bb6d6c30c34acc8b04adcb6d27325aff78d8589f"} pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 17 07:54:21.532454 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.532427 2578 patch_prober.go:28] interesting pod/image-registry-54c7b778bc-f89zb container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 07:54:21.532577 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:21.532471 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 07:54:25.101619 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:25.101525 2578 generic.go:358] "Generic (PLEG): container finished" podID="fa3b3f1b-d142-4dcd-b779-b2342879913b" containerID="daabdefee5f9c6279957c048601477f5686078918d14744f5a682450f9403137" exitCode=0
Apr 17 07:54:25.101619 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:25.101603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" event={"ID":"fa3b3f1b-d142-4dcd-b779-b2342879913b","Type":"ContainerDied","Data":"daabdefee5f9c6279957c048601477f5686078918d14744f5a682450f9403137"}
Apr 17 07:54:25.102054 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:25.101940 2578 scope.go:117] "RemoveContainer" containerID="daabdefee5f9c6279957c048601477f5686078918d14744f5a682450f9403137"
Apr 17 07:54:26.106279 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:26.106246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5js9r" event={"ID":"fa3b3f1b-d142-4dcd-b779-b2342879913b","Type":"ContainerStarted","Data":"7428508e3d2bb7991fd51ea0e13eb6e30b570ba50f7da12d65eeb320df09770b"}
Apr 17 07:54:31.533383 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:31.533356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb"
Apr 17 07:54:40.145982 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:40.145948 2578 generic.go:358] "Generic (PLEG): container finished" podID="12552903-12e4-45a7-8194-2c6cc6a37b21" containerID="ccc527180fa76957634aa9bcbc6b0ed789d47b0d52c07991e36ba830f55cbece" exitCode=0
Apr 17 07:54:40.146374 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:40.146025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" event={"ID":"12552903-12e4-45a7-8194-2c6cc6a37b21","Type":"ContainerDied","Data":"ccc527180fa76957634aa9bcbc6b0ed789d47b0d52c07991e36ba830f55cbece"}
Apr 17 07:54:40.146421 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:40.146401 2578 scope.go:117] "RemoveContainer" containerID="ccc527180fa76957634aa9bcbc6b0ed789d47b0d52c07991e36ba830f55cbece"
Apr 17 07:54:41.150106 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:41.150069 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9wbj9" event={"ID":"12552903-12e4-45a7-8194-2c6cc6a37b21","Type":"ContainerStarted","Data":"2ae65ab48108f156f7ad323bde8dfde8031b5f758289dbc493371375148d5ded"}
Apr 17 07:54:46.547251 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:46.547212 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" podUID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerName="registry" containerID="cri-o://4d35337ce108ba589d4e0347bb6d6c30c34acc8b04adcb6d27325aff78d8589f" gracePeriod=30
Apr 17 07:54:48.171118 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:48.171083 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae8faad5-6108-4619-82ef-7f707ede4f61" containerID="4d35337ce108ba589d4e0347bb6d6c30c34acc8b04adcb6d27325aff78d8589f" exitCode=0
Apr 17 07:54:48.171492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:48.171176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" event={"ID":"ae8faad5-6108-4619-82ef-7f707ede4f61","Type":"ContainerDied","Data":"4d35337ce108ba589d4e0347bb6d6c30c34acc8b04adcb6d27325aff78d8589f"}
Apr 17 07:54:48.171492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:48.171213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb" event={"ID":"ae8faad5-6108-4619-82ef-7f707ede4f61","Type":"ContainerStarted","Data":"4dda0b8fa518facbca22c278ab937f92abc2daf50c4f349bbc60d274dd03ca8d"}
Apr 17 07:54:48.171492 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:54:48.171241 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb"
Apr 17 07:55:01.369187 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:55:01.369131 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:55:01.384273 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:55:01.384238 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:55:02.225829 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:55:02.225799 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 07:55:09.178951 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:55:09.178916 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54c7b778bc-f89zb"
Apr 17 07:56:20.427731 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:56:20.427697 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log"
Apr 17 07:56:20.428282 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:56:20.427798 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log"
Apr 17 07:56:20.433460 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:56:20.433442 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 07:59:07.381939 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.381904 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-gx48k"]
Apr 17 07:59:07.382363 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.382207 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry"
Apr 17 07:59:07.382363 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.382219 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry"
Apr 17 07:59:07.382363 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.382269 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="95a9e913-e1cb-49b7-ae5f-77299dc16875" containerName="registry"
Apr 17 07:59:07.384996 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.384976 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gx48k"
Apr 17 07:59:07.387518 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.387500 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 07:59:07.387615 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.387541 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 07:59:07.388529 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.388504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ml8vh\""
Apr 17 07:59:07.388622 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.388510 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 07:59:07.393975 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.393957 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-gx48k"]
Apr 17 07:59:07.479214 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.479181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4t5q\" (UniqueName: \"kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q\") pod \"s3-init-gx48k\" (UID: \"030cb1ab-da8c-4851-92c9-412877311ed6\") " pod="kserve/s3-init-gx48k"
Apr 17 07:59:07.580247 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.580218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4t5q\" (UniqueName: \"kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q\") pod \"s3-init-gx48k\" (UID: \"030cb1ab-da8c-4851-92c9-412877311ed6\") " pod="kserve/s3-init-gx48k"
Apr 17 07:59:07.588567 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.588539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4t5q\" (UniqueName: \"kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q\") pod \"s3-init-gx48k\" (UID: \"030cb1ab-da8c-4851-92c9-412877311ed6\") " pod="kserve/s3-init-gx48k"
Apr 17 07:59:07.702427 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.702398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gx48k"
Apr 17 07:59:07.814785 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.814755 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-gx48k"]
Apr 17 07:59:07.817790 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:59:07.817759 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030cb1ab_da8c_4851_92c9_412877311ed6.slice/crio-946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be WatchSource:0}: Error finding container 946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be: Status 404 returned error can't find the container with id 946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be
Apr 17 07:59:07.819626 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.819607 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 07:59:07.866493 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:07.866462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gx48k" event={"ID":"030cb1ab-da8c-4851-92c9-412877311ed6","Type":"ContainerStarted","Data":"946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be"}
Apr 17 07:59:12.883108 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:12.883062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gx48k" event={"ID":"030cb1ab-da8c-4851-92c9-412877311ed6","Type":"ContainerStarted","Data":"75b9c1bcdc3c46f84f7503b171d485bc9c9320bf0618dcd4cf75847588a610e5"}
Apr 17 07:59:12.899322 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:12.899275 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-gx48k" podStartSLOduration=1.483654445 podStartE2EDuration="5.899259532s" podCreationTimestamp="2026-04-17 07:59:07 +0000 UTC" firstStartedPulling="2026-04-17 07:59:07.819738509 +0000 UTC m=+467.785775264" lastFinishedPulling="2026-04-17 07:59:12.23534359 +0000 UTC m=+472.201380351" observedRunningTime="2026-04-17 07:59:12.897801644 +0000 UTC m=+472.863838422" watchObservedRunningTime="2026-04-17 07:59:12.899259532 +0000 UTC m=+472.865296309"
Apr 17 07:59:15.892925 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:15.892889 2578 generic.go:358] "Generic (PLEG): container finished" podID="030cb1ab-da8c-4851-92c9-412877311ed6" containerID="75b9c1bcdc3c46f84f7503b171d485bc9c9320bf0618dcd4cf75847588a610e5" exitCode=0
Apr 17 07:59:15.893292 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:15.892964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gx48k" event={"ID":"030cb1ab-da8c-4851-92c9-412877311ed6","Type":"ContainerDied","Data":"75b9c1bcdc3c46f84f7503b171d485bc9c9320bf0618dcd4cf75847588a610e5"}
Apr 17 07:59:17.026116 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.026094 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gx48k"
Apr 17 07:59:17.058992 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.058970 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4t5q\" (UniqueName: \"kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q\") pod \"030cb1ab-da8c-4851-92c9-412877311ed6\" (UID: \"030cb1ab-da8c-4851-92c9-412877311ed6\") "
Apr 17 07:59:17.061256 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.061216 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q" (OuterVolumeSpecName: "kube-api-access-p4t5q") pod "030cb1ab-da8c-4851-92c9-412877311ed6" (UID: "030cb1ab-da8c-4851-92c9-412877311ed6"). InnerVolumeSpecName "kube-api-access-p4t5q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:59:17.159633 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.159610 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4t5q\" (UniqueName: \"kubernetes.io/projected/030cb1ab-da8c-4851-92c9-412877311ed6-kube-api-access-p4t5q\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:59:17.900295 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.900270 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gx48k"
Apr 17 07:59:17.900463 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.900270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gx48k" event={"ID":"030cb1ab-da8c-4851-92c9-412877311ed6","Type":"ContainerDied","Data":"946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be"}
Apr 17 07:59:17.900463 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:17.900372 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="946df8a76f1609d365a79f7d615d02e26cd179f684d333b0ed5757ebe0e944be"
Apr 17 07:59:52.151343 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.151308 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-227hf"]
Apr 17 07:59:52.151817 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.151591 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="030cb1ab-da8c-4851-92c9-412877311ed6" containerName="s3-init"
Apr 17 07:59:52.151817 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.151602 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="030cb1ab-da8c-4851-92c9-412877311ed6" containerName="s3-init"
Apr 17 07:59:52.151817 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.151659 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="030cb1ab-da8c-4851-92c9-412877311ed6" containerName="s3-init"
Apr 17 07:59:52.170281 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.170252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-227hf"]
Apr 17 07:59:52.170413 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.170346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:52.173842 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.173821 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ml8vh\""
Apr 17 07:59:52.175019 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.175000 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 07:59:52.175127 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.175071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 17 07:59:52.175127 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.175109 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 07:59:52.241779 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.241752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ngx\" (UniqueName: \"kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx\") pod \"s3-tls-init-custom-227hf\" (UID: \"2b51a082-ce0f-4545-9bbd-5112124373b2\") " pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:52.342665 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.342640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42ngx\" (UniqueName: \"kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx\") pod \"s3-tls-init-custom-227hf\" (UID: \"2b51a082-ce0f-4545-9bbd-5112124373b2\") " pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:52.351978 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.351948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ngx\" (UniqueName: \"kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx\") pod \"s3-tls-init-custom-227hf\" (UID: \"2b51a082-ce0f-4545-9bbd-5112124373b2\") " pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:52.488943 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.488875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:52.604486 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:52.604458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-227hf"]
Apr 17 07:59:52.607708 ip-10-0-134-176 kubenswrapper[2578]: W0417 07:59:52.607677 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b51a082_ce0f_4545_9bbd_5112124373b2.slice/crio-8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811 WatchSource:0}: Error finding container 8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811: Status 404 returned error can't find the container with id 8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811
Apr 17 07:59:53.010922 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:53.010879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-227hf" event={"ID":"2b51a082-ce0f-4545-9bbd-5112124373b2","Type":"ContainerStarted","Data":"c6abbfafac549872cbcad02b62eb7ee9eeef066e534cf3b5f5eabe0142fd0b28"}
Apr 17 07:59:53.011083 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:53.010926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-227hf" event={"ID":"2b51a082-ce0f-4545-9bbd-5112124373b2","Type":"ContainerStarted","Data":"8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811"}
Apr 17 07:59:53.028156 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:53.028089 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-227hf" podStartSLOduration=1.028072233 podStartE2EDuration="1.028072233s" podCreationTimestamp="2026-04-17 07:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:59:53.025907773 +0000 UTC m=+512.991944551" watchObservedRunningTime="2026-04-17 07:59:53.028072233 +0000 UTC m=+512.994109011"
Apr 17 07:59:57.022426 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:57.022399 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b51a082-ce0f-4545-9bbd-5112124373b2" containerID="c6abbfafac549872cbcad02b62eb7ee9eeef066e534cf3b5f5eabe0142fd0b28" exitCode=0
Apr 17 07:59:57.022720 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:57.022454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-227hf" event={"ID":"2b51a082-ce0f-4545-9bbd-5112124373b2","Type":"ContainerDied","Data":"c6abbfafac549872cbcad02b62eb7ee9eeef066e534cf3b5f5eabe0142fd0b28"}
Apr 17 07:59:58.144969 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:58.144948 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:58.290482 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:58.290407 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42ngx\" (UniqueName: \"kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx\") pod \"2b51a082-ce0f-4545-9bbd-5112124373b2\" (UID: \"2b51a082-ce0f-4545-9bbd-5112124373b2\") "
Apr 17 07:59:58.292468 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:58.292444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx" (OuterVolumeSpecName: "kube-api-access-42ngx") pod "2b51a082-ce0f-4545-9bbd-5112124373b2" (UID: "2b51a082-ce0f-4545-9bbd-5112124373b2"). InnerVolumeSpecName "kube-api-access-42ngx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:59:58.391231 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:58.391203 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42ngx\" (UniqueName: \"kubernetes.io/projected/2b51a082-ce0f-4545-9bbd-5112124373b2-kube-api-access-42ngx\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\""
Apr 17 07:59:59.029338 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:59.029313 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-227hf"
Apr 17 07:59:59.029505 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:59.029340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-227hf" event={"ID":"2b51a082-ce0f-4545-9bbd-5112124373b2","Type":"ContainerDied","Data":"8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811"}
Apr 17 07:59:59.029505 ip-10-0-134-176 kubenswrapper[2578]: I0417 07:59:59.029365 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d18f9c86ea18e63594cff3037bcd7a3e38dee5c875a4464d0d4d997661cd811"
Apr 17 08:00:02.344194 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.344159 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-67jk5"]
Apr 17 08:00:02.344575 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.344443 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b51a082-ce0f-4545-9bbd-5112124373b2" containerName="s3-tls-init-custom"
Apr 17 08:00:02.344575 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.344455 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b51a082-ce0f-4545-9bbd-5112124373b2" containerName="s3-tls-init-custom"
Apr 17 08:00:02.344575 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.344516 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b51a082-ce0f-4545-9bbd-5112124373b2" containerName="s3-tls-init-custom"
Apr 17 08:00:02.347635 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.347618 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:02.349936 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.349912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ml8vh\"" Apr 17 08:00:02.349936 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.349923 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 17 08:00:02.350110 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.349922 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:00:02.350793 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.350777 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:00:02.352865 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.352842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-67jk5"] Apr 17 08:00:02.528153 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.528118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hbg5\" (UniqueName: \"kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5\") pod \"s3-tls-init-serving-67jk5\" (UID: \"a1bbecba-5270-45bc-a857-dd0f493dd058\") " pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:02.628599 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.628528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hbg5\" (UniqueName: \"kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5\") pod \"s3-tls-init-serving-67jk5\" (UID: \"a1bbecba-5270-45bc-a857-dd0f493dd058\") " 
pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:02.636926 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.636898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hbg5\" (UniqueName: \"kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5\") pod \"s3-tls-init-serving-67jk5\" (UID: \"a1bbecba-5270-45bc-a857-dd0f493dd058\") " pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:02.665698 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.665676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:02.787138 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:02.787098 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-67jk5"] Apr 17 08:00:02.790333 ip-10-0-134-176 kubenswrapper[2578]: W0417 08:00:02.790306 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbecba_5270_45bc_a857_dd0f493dd058.slice/crio-9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f WatchSource:0}: Error finding container 9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f: Status 404 returned error can't find the container with id 9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f Apr 17 08:00:03.041392 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:03.041357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-67jk5" event={"ID":"a1bbecba-5270-45bc-a857-dd0f493dd058","Type":"ContainerStarted","Data":"d8df57e57db0df226bd899b009e17da69d53b3faed5b9132c2a8d0ebd885afb6"} Apr 17 08:00:03.041392 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:03.041392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-67jk5" 
event={"ID":"a1bbecba-5270-45bc-a857-dd0f493dd058","Type":"ContainerStarted","Data":"9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f"} Apr 17 08:00:03.055231 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:03.055183 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-67jk5" podStartSLOduration=1.055166912 podStartE2EDuration="1.055166912s" podCreationTimestamp="2026-04-17 08:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:00:03.054493093 +0000 UTC m=+523.020529870" watchObservedRunningTime="2026-04-17 08:00:03.055166912 +0000 UTC m=+523.021203691" Apr 17 08:00:07.053493 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:07.053460 2578 generic.go:358] "Generic (PLEG): container finished" podID="a1bbecba-5270-45bc-a857-dd0f493dd058" containerID="d8df57e57db0df226bd899b009e17da69d53b3faed5b9132c2a8d0ebd885afb6" exitCode=0 Apr 17 08:00:07.053869 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:07.053524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-67jk5" event={"ID":"a1bbecba-5270-45bc-a857-dd0f493dd058","Type":"ContainerDied","Data":"d8df57e57db0df226bd899b009e17da69d53b3faed5b9132c2a8d0ebd885afb6"} Apr 17 08:00:08.184069 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:08.184047 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:08.272446 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:08.272418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hbg5\" (UniqueName: \"kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5\") pod \"a1bbecba-5270-45bc-a857-dd0f493dd058\" (UID: \"a1bbecba-5270-45bc-a857-dd0f493dd058\") " Apr 17 08:00:08.274396 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:08.274374 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5" (OuterVolumeSpecName: "kube-api-access-8hbg5") pod "a1bbecba-5270-45bc-a857-dd0f493dd058" (UID: "a1bbecba-5270-45bc-a857-dd0f493dd058"). InnerVolumeSpecName "kube-api-access-8hbg5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:00:08.373162 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:08.373071 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8hbg5\" (UniqueName: \"kubernetes.io/projected/a1bbecba-5270-45bc-a857-dd0f493dd058-kube-api-access-8hbg5\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\"" Apr 17 08:00:09.060966 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:09.060936 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-67jk5" Apr 17 08:00:09.060966 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:09.060950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-67jk5" event={"ID":"a1bbecba-5270-45bc-a857-dd0f493dd058","Type":"ContainerDied","Data":"9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f"} Apr 17 08:00:09.061188 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:00:09.060977 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fbe5aec31abdbcbb993b3e61e253ecfc46030fb3ec1fd712b26d86f6a467d3f" Apr 17 08:01:20.453908 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:01:20.453874 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:01:20.455203 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:01:20.455179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:06:20.474188 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:06:20.474157 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:06:20.475936 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:06:20.475914 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:11:20.495613 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:11:20.495581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:11:20.498834 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:11:20.498805 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:16:20.518987 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:16:20.518956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:16:20.521427 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:16:20.521393 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:21:20.539684 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:21:20.539656 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:21:20.542582 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:21:20.542559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:26:20.559501 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:26:20.559469 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:26:20.563540 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:26:20.563518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:31:20.578641 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:31:20.578602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:31:20.584046 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:31:20.584026 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:36:20.598796 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:36:20.598764 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:36:20.604967 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:36:20.604945 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:41:20.619187 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:41:20.619077 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:41:20.625574 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:41:20.625551 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:46:20.639502 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:46:20.639476 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:46:20.645484 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:46:20.645464 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log" Apr 17 08:50:40.241266 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.241226 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8b6bs/must-gather-96h72"] Apr 17 08:50:40.243859 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.241641 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a1bbecba-5270-45bc-a857-dd0f493dd058" containerName="s3-tls-init-serving" Apr 17 08:50:40.243859 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.241659 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bbecba-5270-45bc-a857-dd0f493dd058" containerName="s3-tls-init-serving" Apr 17 08:50:40.243859 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.241734 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1bbecba-5270-45bc-a857-dd0f493dd058" containerName="s3-tls-init-serving" Apr 17 08:50:40.244815 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.244796 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.247125 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.247097 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8b6bs\"/\"openshift-service-ca.crt\"" Apr 17 08:50:40.247242 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.247224 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8b6bs\"/\"kube-root-ca.crt\"" Apr 17 08:50:40.251039 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.251015 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b6bs/must-gather-96h72"] Apr 17 08:50:40.342653 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.342614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w997j\" (UniqueName: \"kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.342819 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.342662 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.443060 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.443028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w997j\" (UniqueName: \"kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.443234 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.443085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.443427 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.443407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.450691 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.450670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w997j\" (UniqueName: \"kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j\") pod \"must-gather-96h72\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.562714 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.562633 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:50:40.688278 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.688186 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8b6bs/must-gather-96h72"] Apr 17 08:50:40.690833 ip-10-0-134-176 kubenswrapper[2578]: W0417 08:50:40.690807 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b1e3749_ed5c_4915_813c_8cd67641b262.slice/crio-c8c137b88484c06a459500dc732fa6d07d1103d42da71bb142c1dc3e125dd196 WatchSource:0}: Error finding container c8c137b88484c06a459500dc732fa6d07d1103d42da71bb142c1dc3e125dd196: Status 404 returned error can't find the container with id c8c137b88484c06a459500dc732fa6d07d1103d42da71bb142c1dc3e125dd196 Apr 17 08:50:40.692415 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:40.692396 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:50:41.635427 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:41.635386 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b6bs/must-gather-96h72" event={"ID":"8b1e3749-ed5c-4915-813c-8cd67641b262","Type":"ContainerStarted","Data":"c8c137b88484c06a459500dc732fa6d07d1103d42da71bb142c1dc3e125dd196"} Apr 17 08:50:45.650378 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:45.650353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b6bs/must-gather-96h72" event={"ID":"8b1e3749-ed5c-4915-813c-8cd67641b262","Type":"ContainerStarted","Data":"cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a"} Apr 17 08:50:45.650806 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:45.650389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b6bs/must-gather-96h72" 
event={"ID":"8b1e3749-ed5c-4915-813c-8cd67641b262","Type":"ContainerStarted","Data":"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904"} Apr 17 08:50:45.665816 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:50:45.665772 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8b6bs/must-gather-96h72" podStartSLOduration=1.169323849 podStartE2EDuration="5.665758739s" podCreationTimestamp="2026-04-17 08:50:40 +0000 UTC" firstStartedPulling="2026-04-17 08:50:40.692533005 +0000 UTC m=+3560.658569760" lastFinishedPulling="2026-04-17 08:50:45.188967892 +0000 UTC m=+3565.155004650" observedRunningTime="2026-04-17 08:50:45.664055459 +0000 UTC m=+3565.630092274" watchObservedRunningTime="2026-04-17 08:50:45.665758739 +0000 UTC m=+3565.631795516" Apr 17 08:51:04.712979 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:04.712944 2578 generic.go:358] "Generic (PLEG): container finished" podID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerID="448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904" exitCode=0 Apr 17 08:51:04.713411 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:04.713018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8b6bs/must-gather-96h72" event={"ID":"8b1e3749-ed5c-4915-813c-8cd67641b262","Type":"ContainerDied","Data":"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904"} Apr 17 08:51:04.713411 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:04.713350 2578 scope.go:117] "RemoveContainer" containerID="448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904" Apr 17 08:51:05.576551 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:05.576519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b6bs_must-gather-96h72_8b1e3749-ed5c-4915-813c-8cd67641b262/gather/0.log" Apr 17 08:51:08.879588 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:08.879554 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-k9qmk_34f61707-b762-47d3-b7c3-a54999ad703b/global-pull-secret-syncer/0.log" Apr 17 08:51:09.162913 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:09.162879 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wwr4f_4f3abb21-4fac-471f-a620-eac7abc32e29/konnectivity-agent/0.log" Apr 17 08:51:09.208482 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:09.208453 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-176.ec2.internal_f5dc70df488813169343e62031a0c95f/haproxy/0.log" Apr 17 08:51:11.030216 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.030180 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8b6bs/must-gather-96h72"] Apr 17 08:51:11.030713 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.030393 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-8b6bs/must-gather-96h72" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="copy" containerID="cri-o://cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a" gracePeriod=2 Apr 17 08:51:11.036112 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.036082 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8b6bs/must-gather-96h72"] Apr 17 08:51:11.254871 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.254847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b6bs_must-gather-96h72_8b1e3749-ed5c-4915-813c-8cd67641b262/copy/0.log" Apr 17 08:51:11.255217 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.255202 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:51:11.257952 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.257928 2578 status_manager.go:895] "Failed to get status for pod" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" pod="openshift-must-gather-8b6bs/must-gather-96h72" err="pods \"must-gather-96h72\" is forbidden: User \"system:node:ip-10-0-134-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b6bs\": no relationship found between node 'ip-10-0-134-176.ec2.internal' and this object" Apr 17 08:51:11.311383 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.311326 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output\") pod \"8b1e3749-ed5c-4915-813c-8cd67641b262\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " Apr 17 08:51:11.311482 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.311388 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w997j\" (UniqueName: \"kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j\") pod \"8b1e3749-ed5c-4915-813c-8cd67641b262\" (UID: \"8b1e3749-ed5c-4915-813c-8cd67641b262\") " Apr 17 08:51:11.312739 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.312709 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8b1e3749-ed5c-4915-813c-8cd67641b262" (UID: "8b1e3749-ed5c-4915-813c-8cd67641b262"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:51:11.313457 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.313430 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j" (OuterVolumeSpecName: "kube-api-access-w997j") pod "8b1e3749-ed5c-4915-813c-8cd67641b262" (UID: "8b1e3749-ed5c-4915-813c-8cd67641b262"). InnerVolumeSpecName "kube-api-access-w997j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:51:11.412741 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.412708 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w997j\" (UniqueName: \"kubernetes.io/projected/8b1e3749-ed5c-4915-813c-8cd67641b262-kube-api-access-w997j\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\"" Apr 17 08:51:11.412741 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.412732 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8b1e3749-ed5c-4915-813c-8cd67641b262-must-gather-output\") on node \"ip-10-0-134-176.ec2.internal\" DevicePath \"\"" Apr 17 08:51:11.738492 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.738462 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8b6bs_must-gather-96h72_8b1e3749-ed5c-4915-813c-8cd67641b262/copy/0.log" Apr 17 08:51:11.738783 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.738760 2578 generic.go:358] "Generic (PLEG): container finished" podID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerID="cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a" exitCode=143 Apr 17 08:51:11.738880 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.738816 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8b6bs/must-gather-96h72" Apr 17 08:51:11.738880 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.738857 2578 scope.go:117] "RemoveContainer" containerID="cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a" Apr 17 08:51:11.741810 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.741782 2578 status_manager.go:895] "Failed to get status for pod" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" pod="openshift-must-gather-8b6bs/must-gather-96h72" err="pods \"must-gather-96h72\" is forbidden: User \"system:node:ip-10-0-134-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b6bs\": no relationship found between node 'ip-10-0-134-176.ec2.internal' and this object" Apr 17 08:51:11.747087 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.747070 2578 scope.go:117] "RemoveContainer" containerID="448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904" Apr 17 08:51:11.748668 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.748642 2578 status_manager.go:895] "Failed to get status for pod" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" pod="openshift-must-gather-8b6bs/must-gather-96h72" err="pods \"must-gather-96h72\" is forbidden: User \"system:node:ip-10-0-134-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8b6bs\": no relationship found between node 'ip-10-0-134-176.ec2.internal' and this object" Apr 17 08:51:11.758624 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.758608 2578 scope.go:117] "RemoveContainer" containerID="cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a" Apr 17 08:51:11.758854 ip-10-0-134-176 kubenswrapper[2578]: E0417 08:51:11.758838 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a\": container with ID 
starting with cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a not found: ID does not exist" containerID="cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a"
Apr 17 08:51:11.758892 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.758861 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a"} err="failed to get container status \"cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a\": rpc error: code = NotFound desc = could not find container \"cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a\": container with ID starting with cf5c2e0e290850ec47d632ddf872914a6146a38e84ba363aab007249c4dffd2a not found: ID does not exist"
Apr 17 08:51:11.758892 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.758878 2578 scope.go:117] "RemoveContainer" containerID="448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904"
Apr 17 08:51:11.759100 ip-10-0-134-176 kubenswrapper[2578]: E0417 08:51:11.759083 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904\": container with ID starting with 448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904 not found: ID does not exist" containerID="448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904"
Apr 17 08:51:11.759216 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:11.759103 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904"} err="failed to get container status \"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904\": rpc error: code = NotFound desc = could not find container \"448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904\": container with ID starting with 448a343d8a3b52fba3701dbde585ce2a5f72aafaee259ed4385ca4839e7ca904 not found: ID does not exist"
Apr 17 08:51:12.533117 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.533042 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" path="/var/lib/kubelet/pods/8b1e3749-ed5c-4915-813c-8cd67641b262/volumes"
Apr 17 08:51:12.744943 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.744913 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zt672_1153f1d4-8d0e-472e-a2d6-5c7105b8aa2d/monitoring-plugin/0.log"
Apr 17 08:51:12.773337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.773316 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-57md7_4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0/node-exporter/0.log"
Apr 17 08:51:12.793882 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.793815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-57md7_4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0/kube-rbac-proxy/0.log"
Apr 17 08:51:12.816121 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.816098 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-57md7_4b2eb30e-df2b-4532-9ee5-3cd6e4fb18f0/init-textfile/0.log"
Apr 17 08:51:12.998616 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:12.998587 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6s55s_87604ba9-674a-4721-a9dd-ffc9217f96aa/kube-rbac-proxy-main/0.log"
Apr 17 08:51:13.018483 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.018446 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6s55s_87604ba9-674a-4721-a9dd-ffc9217f96aa/kube-rbac-proxy-self/0.log"
Apr 17 08:51:13.043718 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.043693 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6s55s_87604ba9-674a-4721-a9dd-ffc9217f96aa/openshift-state-metrics/0.log"
Apr 17 08:51:13.082030 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.081964 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/prometheus/0.log"
Apr 17 08:51:13.098554 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.098529 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/config-reloader/0.log"
Apr 17 08:51:13.119425 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.119400 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/thanos-sidecar/0.log"
Apr 17 08:51:13.142240 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.142221 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/kube-rbac-proxy-web/0.log"
Apr 17 08:51:13.163630 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.163608 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/kube-rbac-proxy/0.log"
Apr 17 08:51:13.189921 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.189902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/kube-rbac-proxy-thanos/0.log"
Apr 17 08:51:13.211908 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.211888 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e362f670-e98c-4474-a28c-786ee6ce1475/init-config-reloader/0.log"
Apr 17 08:51:13.319128 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.319100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f6d8586d-x4jzp_3b4ef6a3-477e-4314-8383-f7152dad52b5/telemeter-client/0.log"
Apr 17 08:51:13.338987 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.338921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f6d8586d-x4jzp_3b4ef6a3-477e-4314-8383-f7152dad52b5/reload/0.log"
Apr 17 08:51:13.358090 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:13.358066 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54f6d8586d-x4jzp_3b4ef6a3-477e-4314-8383-f7152dad52b5/kube-rbac-proxy/0.log"
Apr 17 08:51:16.065925 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.065890 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"]
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066174 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="gather"
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066186 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="gather"
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066212 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="copy"
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066217 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="copy"
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066260 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="copy"
Apr 17 08:51:16.066337 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.066268 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b1e3749-ed5c-4915-813c-8cd67641b262" containerName="gather"
Apr 17 08:51:16.072613 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.072574 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.074881 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.074857 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"kube-root-ca.crt\""
Apr 17 08:51:16.075006 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.074909 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jbj8t\"/\"openshift-service-ca.crt\""
Apr 17 08:51:16.075811 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.075797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jbj8t\"/\"default-dockercfg-rq654\""
Apr 17 08:51:16.080192 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.080172 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"]
Apr 17 08:51:16.150811 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.150784 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-podres\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.150938 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.150822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-lib-modules\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.150938 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.150857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-proc\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.150938 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.150914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-sys\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.150938 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.150936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgql9\" (UniqueName: \"kubernetes.io/projected/30e15234-0223-4366-97dd-c05914148181-kube-api-access-cgql9\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252113 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgql9\" (UniqueName: \"kubernetes.io/projected/30e15234-0223-4366-97dd-c05914148181-kube-api-access-cgql9\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252293 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-podres\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252293 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-lib-modules\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252293 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-proc\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252412 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-sys\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252412 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-podres\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252412 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-proc\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252412 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-lib-modules\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.252551 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.252377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30e15234-0223-4366-97dd-c05914148181-sys\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.259745 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.259720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgql9\" (UniqueName: \"kubernetes.io/projected/30e15234-0223-4366-97dd-c05914148181-kube-api-access-cgql9\") pod \"perf-node-gather-daemonset-fdst4\" (UID: \"30e15234-0223-4366-97dd-c05914148181\") " pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.382918 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.382842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.498010 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.497975 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"]
Apr 17 08:51:16.501137 ip-10-0-134-176 kubenswrapper[2578]: W0417 08:51:16.501102 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30e15234_0223_4366_97dd_c05914148181.slice/crio-1166ac0d77b4879eb6378b30368a2d7fc4e7631b6869a9a0a2ca7ab868c07338 WatchSource:0}: Error finding container 1166ac0d77b4879eb6378b30368a2d7fc4e7631b6869a9a0a2ca7ab868c07338: Status 404 returned error can't find the container with id 1166ac0d77b4879eb6378b30368a2d7fc4e7631b6869a9a0a2ca7ab868c07338
Apr 17 08:51:16.523596 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.523560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xn7q4_86725d0f-c4d4-499a-96f9-106af3387cc2/dns/0.log"
Apr 17 08:51:16.541988 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.541966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xn7q4_86725d0f-c4d4-499a-96f9-106af3387cc2/kube-rbac-proxy/0.log"
Apr 17 08:51:16.583818 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.583789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cfdh5_a6642e2d-0acd-4e4b-8013-72420908123d/dns-node-resolver/0.log"
Apr 17 08:51:16.755448 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.755411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4" event={"ID":"30e15234-0223-4366-97dd-c05914148181","Type":"ContainerStarted","Data":"ebc5a80aa6a654c0430c53e902eb2b03eaeffd0edc5516a75d3fe8f36f074492"}
Apr 17 08:51:16.755448 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.755446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4" event={"ID":"30e15234-0223-4366-97dd-c05914148181","Type":"ContainerStarted","Data":"1166ac0d77b4879eb6378b30368a2d7fc4e7631b6869a9a0a2ca7ab868c07338"}
Apr 17 08:51:16.755651 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.755555 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:16.770201 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.770155 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4" podStartSLOduration=0.770128453 podStartE2EDuration="770.128453ms" podCreationTimestamp="2026-04-17 08:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:51:16.769086144 +0000 UTC m=+3596.735122943" watchObservedRunningTime="2026-04-17 08:51:16.770128453 +0000 UTC m=+3596.736165231"
Apr 17 08:51:16.961510 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.961485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54c7b778bc-f89zb_ae8faad5-6108-4619-82ef-7f707ede4f61/registry/0.log"
Apr 17 08:51:16.973864 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.973840 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54c7b778bc-f89zb_ae8faad5-6108-4619-82ef-7f707ede4f61/registry/1.log"
Apr 17 08:51:16.992717 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:16.992692 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jf4kz_ca63f18c-753f-468e-b6ab-a7a1608ee9ef/node-ca/0.log"
Apr 17 08:51:17.726125 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:17.726099 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5b66495d-5v9dj_c26b8a3e-8882-4232-9053-f698f9bb8392/router/0.log"
Apr 17 08:51:18.030102 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.030028 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k99jf_ecab5c24-0616-4d6e-93a1-4c29b1548a0d/serve-healthcheck-canary/0.log"
Apr 17 08:51:18.415116 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.415085 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9wbj9_12552903-12e4-45a7-8194-2c6cc6a37b21/insights-operator/0.log"
Apr 17 08:51:18.415874 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.415855 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9wbj9_12552903-12e4-45a7-8194-2c6cc6a37b21/insights-operator/1.log"
Apr 17 08:51:18.494976 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.494950 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lqcdr_b3c626ad-c69c-40d3-87c2-df8c2f8dc567/kube-rbac-proxy/0.log"
Apr 17 08:51:18.513605 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.513578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lqcdr_b3c626ad-c69c-40d3-87c2-df8c2f8dc567/exporter/0.log"
Apr 17 08:51:18.532584 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:18.532553 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lqcdr_b3c626ad-c69c-40d3-87c2-df8c2f8dc567/extractor/0.log"
Apr 17 08:51:20.664331 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:20.664301 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log"
Apr 17 08:51:20.667644 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:20.667616 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log"
Apr 17 08:51:20.901086 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:20.901056 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-gx48k_030cb1ab-da8c-4851-92c9-412877311ed6/s3-init/0.log"
Apr 17 08:51:20.922731 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:20.922660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-227hf_2b51a082-ce0f-4545-9bbd-5112124373b2/s3-tls-init-custom/0.log"
Apr 17 08:51:20.943910 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:20.943886 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-67jk5_a1bbecba-5270-45bc-a857-dd0f493dd058/s3-tls-init-serving/0.log"
Apr 17 08:51:22.767979 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:22.767951 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jbj8t/perf-node-gather-daemonset-fdst4"
Apr 17 08:51:24.786248 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:24.786159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ccz4q_93c5801c-fbc8-4496-b232-1869fa1f2267/migrator/0.log"
Apr 17 08:51:24.810673 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:24.810648 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ccz4q_93c5801c-fbc8-4496-b232-1869fa1f2267/graceful-termination/0.log"
Apr 17 08:51:26.103400 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.103371 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:51:26.124545 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.124521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/egress-router-binary-copy/0.log"
Apr 17 08:51:26.144264 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.144242 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/cni-plugins/0.log"
Apr 17 08:51:26.164044 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.164024 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/bond-cni-plugin/0.log"
Apr 17 08:51:26.184564 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.184542 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/routeoverride-cni/0.log"
Apr 17 08:51:26.204910 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.204889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/whereabouts-cni-bincopy/0.log"
Apr 17 08:51:26.225063 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.225008 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mk6gb_3b851da0-b5d9-4467-80b9-e5ec59af0f5b/whereabouts-cni/0.log"
Apr 17 08:51:26.544130 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.544060 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g768l_551b9d4f-0c1e-440f-8580-a99be726c79b/kube-multus/0.log"
Apr 17 08:51:26.639698 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.639670 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6vh7t_85afae1f-542f-4ccc-b1bd-45ba0e0c418f/network-metrics-daemon/0.log"
Apr 17 08:51:26.659299 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:26.659277 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6vh7t_85afae1f-542f-4ccc-b1bd-45ba0e0c418f/kube-rbac-proxy/0.log"
Apr 17 08:51:28.171918 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.171888 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-controller/0.log"
Apr 17 08:51:28.190712 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.190684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/0.log"
Apr 17 08:51:28.205677 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.205651 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovn-acl-logging/1.log"
Apr 17 08:51:28.225298 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.225267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/kube-rbac-proxy-node/0.log"
Apr 17 08:51:28.248867 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.248847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:51:28.265854 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.265834 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/northd/0.log"
Apr 17 08:51:28.285316 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.285297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/nbdb/0.log"
Apr 17 08:51:28.305329 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.305309 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/sbdb/0.log"
Apr 17 08:51:28.414644 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:28.414615 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rnznv_e22e30ae-58f8-41be-9023-53dbea7c6e98/ovnkube-controller/0.log"
Apr 17 08:51:29.425127 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:29.425096 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-s895b_c47a920d-1db5-42ad-9b8e-ae9649778582/network-check-target-container/0.log"
Apr 17 08:51:30.279395 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:30.279365 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l64cj_fc13c055-cb05-4a25-a5e5-93cef3f0760b/iptables-alerter/0.log"
Apr 17 08:51:30.929242 ip-10-0-134-176 kubenswrapper[2578]: I0417 08:51:30.929211 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-h8mxt_231772c6-755f-4c20-83c4-6013d0df7223/tuned/0.log"