Apr 24 23:50:56.840068 ip-10-0-135-201 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:50:56.840078 ip-10-0-135-201 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:50:56.840087 ip-10-0-135-201 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:50:56.840379 ip-10-0-135-201 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:08.173539 ip-10-0-135-201 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:08.173554 ip-10-0-135-201 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 661c0dd83e904f24930ef58341a1b19f --
Apr 24 23:53:26.364685 ip-10-0-135-201 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:26.805159 ip-10-0-135-201 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:26.805159 ip-10-0-135-201 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:26.805159 ip-10-0-135-201 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:26.805159 ip-10-0-135-201 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
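The entries above follow the usual journald short-precise layout: timestamp, hostname, unit name with PID in brackets, then the message. A minimal sketch of splitting such a line into its fields (the regex below is an assumption inferred from the lines in this log, not an official journald grammar):

```python
import re

# Field layout as seen in the log above, e.g.
#   "Apr 24 23:50:56.840068 ip-10-0-135-201 systemd[1]: kubelet.service: Failed ..."
LINE_RE = re.compile(
    r"(?P<month>\w{3}) (?P<day>\d{1,2}) (?P<time>[\d:.]+) "
    r"(?P<host>\S+) (?P<unit>[\w-]+)\[(?P<pid>\d+)\]: (?P<msg>.*)"
)

def parse(line: str) -> dict:
    """Return the named fields of one journald-style line, or {} if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else {}

rec = parse(
    "Apr 24 23:50:56.840068 ip-10-0-135-201 systemd[1]: "
    "kubelet.service: Failed to load environment files: No such file or directory"
)
```

Lines that don't carry a `unit[pid]:` prefix, such as the `-- Boot ... --` separator, simply fail the match and come back empty.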
Apr 24 23:53:26.805159 ip-10-0-135-201 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:26.805977 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.805857 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:26.808924 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808907 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:26.808924 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808924 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808928 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808931 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808934 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808937 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808940 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808944 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808948 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808952 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808955 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808958 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808961 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808964 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808966 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808969 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808971 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808974 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808978 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808981 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:26.808997 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808983 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808986 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808988 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808991 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808993 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808996 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.808999 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809002 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809005 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809008 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809011 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809014 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809017 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809019 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809022 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809025 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809027 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809031 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809033 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:26.809468 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809036 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809038 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809041 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809043 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809046 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809048 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809051 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809054 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809057 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809061 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809064 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809068 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809073 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809077 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809080 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809084 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809087 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809090 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809093 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809095 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:26.809995 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809099 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809101 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809104 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809106 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809109 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809111 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809115 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809118 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809120 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809123 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809126 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809128 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809130 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809133 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809135 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809138 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809141 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809143 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809145 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809148 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:26.810504 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809150 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809153 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809156 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809158 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809161 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809165 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809167 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809604 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809610 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809612 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809615 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809618 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809621 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809624 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809626 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809629 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809632 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809634 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809637 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809640 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:26.811006 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809642 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809645 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809647 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809650 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809653 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809656 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809658 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809661 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809663 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809666 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809668 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809671 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809673 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809676 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809678 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809681 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809684 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809687 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809690 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809693 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:26.811531 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809696 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809699 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809702 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809706 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809709 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809711 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809714 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809717 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809720 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809722 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809725 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809730 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809734 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809737 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809740 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809743 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809745 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809748 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809751 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:26.812026 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809754 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809757 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809760 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809762 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809765 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809768 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809770 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809773 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809777 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809779 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809782 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809785 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809787 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809790 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809792 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809795 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809797 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809800 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809802 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809805 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:26.812547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809807 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809810 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809812 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809815 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809817 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809820 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809822 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809825 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809827 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809830 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809832 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809834 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809837 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.809840 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809920 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809928 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809944 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809949 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809953 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809957 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809961 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:26.813036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809966 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809969 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809972 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809976 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809979 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809983 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809986 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809988 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809991 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809994 2567 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.809997 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810000 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810005 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810008 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810011 2567 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810014 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810018 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810022 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810025 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810028 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810032 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810035 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810038 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810042 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810045 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:26.813577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810048 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810052 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810058 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810061 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810064 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810067 2567 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810070 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810075 2567 flags.go:64] FLAG: --event-burst="100" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810079 2567 flags.go:64] FLAG: --event-qps="50" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810082 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810086 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810089 2567 flags.go:64] FLAG: --eviction-hard="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810094 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810097 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810100 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810103 2567 flags.go:64] FLAG: --eviction-soft="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810107 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810109 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810112 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810115 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810118 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810121 
2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810124 2567 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810128 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810132 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:26.814237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810135 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810138 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810141 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810144 2567 flags.go:64] FLAG: --help="false" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810147 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-135-201.ec2.internal" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810151 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810154 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810157 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810160 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810165 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:26.814868 ip-10-0-135-201 
kubenswrapper[2567]: I0424 23:53:26.810168 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810171 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810174 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810177 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810180 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810183 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810186 2567 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810189 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810192 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810196 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810198 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810201 2567 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810204 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810207 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:26.814868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810210 2567 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810216 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810219 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810222 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810225 2567 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810227 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810231 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810234 2567 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810237 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810241 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810244 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810249 2567 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810252 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810258 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810261 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:26.815459 ip-10-0-135-201 
kubenswrapper[2567]: I0424 23:53:26.810264 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810267 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810275 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810279 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810287 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810290 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810293 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810296 2567 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:26.815459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810299 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810305 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810308 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810311 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810314 2567 flags.go:64] FLAG: --port="10250" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810318 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 
23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810321 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-093690c91ec7dad60" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810324 2567 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810327 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810330 2567 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810333 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810336 2567 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810340 2567 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810342 2567 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810345 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810348 2567 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810352 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810355 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810358 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810374 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810377 2567 flags.go:64] FLAG: --runonce="false" Apr 24 
23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810380 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810384 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810388 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810391 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810394 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:26.816020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810398 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810402 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810405 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810408 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810411 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810414 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810417 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810421 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810424 2567 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:53:26.810427 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810432 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810435 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810438 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810442 2567 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810445 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810448 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810451 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810454 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810457 2567 flags.go:64] FLAG: --v="2" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810462 2567 flags.go:64] FLAG: --version="false" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810466 2567 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810470 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810473 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810563 2567 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 24 23:53:26.816659 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810568 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810572 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810575 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810578 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810581 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810586 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810589 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810591 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810595 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810598 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810600 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810603 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:26.817547 
ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810606 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810609 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810611 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810614 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810616 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810619 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810622 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:26.817547 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810625 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810627 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810630 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810633 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810636 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 
23:53:26.810638 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810641 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810643 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810646 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810649 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810652 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810655 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810659 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810662 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810664 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810667 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810669 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810672 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810676 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810679 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:26.818163 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810682 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810686 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810688 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810691 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810693 2567 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810696 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810699 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810701 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810704 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810707 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810709 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810712 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810714 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810717 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810719 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810722 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810726 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:26.818675 ip-10-0-135-201 
kubenswrapper[2567]: W0424 23:53:26.810728 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810731 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:26.818675 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810734 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810736 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810739 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810741 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810744 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810746 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810749 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810751 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810754 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810756 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810758 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810763 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810765 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810768 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810771 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810774 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810777 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810779 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810782 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810784 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:26.819145 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810787 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810790 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810792 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810795 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810797 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810800 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.810802 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.810814 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.818556 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.818573 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818639 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818644 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818648 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818651 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818654 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818658 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:26.819663 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818660 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818664 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818667 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818671 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818673 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818676 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818680 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818683 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818686 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818689 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818693 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818696 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818698 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818701 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818704 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818708 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818713 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818716 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818720 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:26.820061 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818725 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818728 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818730 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818734 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818736 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818740 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818743 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818745 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818748 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818751 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818754 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818757 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818759 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818762 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818764 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818767 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818770 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818772 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818774 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818777 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:26.820642 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818780 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818783 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818786 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818789 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818792 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818794 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818797 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818800 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818802 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818805 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818808 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818810 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818813 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818815 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818818 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818821 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818823 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818826 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818828 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:26.821129 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818831 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818833 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818836 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818838 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818841 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818843 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818846 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818848 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818851 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818853 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818856 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818858 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818861 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818864 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818867 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818869 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818872 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818875 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818877 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818880 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:26.821609 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818882 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.818885 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.818889 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819015 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819021 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819024 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819027 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819030 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819033 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819036 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819038 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819041 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819044 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819046 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819049 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819051 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:26.822141 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819054 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819057 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819059 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819062 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819064 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819066 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819069 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819071 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819074 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819078 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819080 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819083 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819087 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819091 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819094 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819096 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819099 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819101 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819104 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:26.822563 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819107 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819109 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819112 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819114 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819117 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819120 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819122 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819125 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819127 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819130 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819132 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819135 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819138 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819140 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819142 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819145 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819148 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819151 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819154 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819158 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:26.823041 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819161 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819164 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819168 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819172 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819175 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819178 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819181 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819183 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819186 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819188 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819191 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819194 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819196 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819199 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819201 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819204 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819206 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819209 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819211 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:26.823552 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819214 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819217 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819219 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819222 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819224 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819226 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819229 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819232 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819234 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819237 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819239 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819241 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819244 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819246 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:26.819249 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.819254 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:26.824013 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.819943 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:53:26.824420 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.822591 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:53:26.824420 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.823543 2567 server.go:1019] "Starting client certificate rotation"
Apr 24 23:53:26.824420 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.823641 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:26.824420 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.823685 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:26.846642 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.846619 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:26.848925 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.848899 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:26.863616 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.863590 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:53:26.869355 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.869337 2567 log.go:25] "Validated CRI v1 image API"
Apr 24 23:53:26.870589 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.870577 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:26.878524 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.878500 2567 fs.go:135] Filesystem UUIDs: map[70e061a5-3988-48d8-802b-2d4fcb094ecc:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a8a05efd-376e-485c-893d-ae5f4edfc386:/dev/nvme0n1p3]
Apr 24 23:53:26.878607 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.878521 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:26.879562 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.879541 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:26.884179 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884072 2567 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:26.882286422 +0000 UTC m=+0.405570637 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094318 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21daffc6f316ff6af6d4dbc1796ae0 SystemUUID:ec21daff-c6f3-16ff-6af6-d4dbc1796ae0 BootID:661c0dd8-3e90-4f24-930e-f58341a1b19f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:36:90:11:57:d7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:36:90:11:57:d7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:29:d2:a8:95:a4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:26.884179 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884173 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:26.884284 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884257 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:26.884620 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884601 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:26.884755 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884621 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Pe
rcentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:26.884800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884765 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:26.884800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884774 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:26.884800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.884787 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:26.885662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.885651 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:26.886492 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.886483 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:26.886752 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.886742 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:26.888964 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.888954 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:26.889001 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.888970 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:26.889001 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.888983 2567 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 23:53:26.889001 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.888992 2567 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:26.889001 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.889000 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:26.890122 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.890109 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:26.890211 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.890132 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:26.893111 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.893095 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:26.894736 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.894719 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:26.896128 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896113 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896132 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896141 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896149 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896169 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 
23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896179 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896187 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896196 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:26.896213 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896212 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:26.896496 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896221 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:26.896496 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896242 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:26.896496 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.896257 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:26.897114 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.897103 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:26.897163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.897118 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:26.900843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.900824 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-201.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:26.900929 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.900847 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-201.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:26.900929 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.900872 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:26.901010 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.900967 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:26.901010 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.900996 2567 server.go:1295] "Started kubelet" Apr 24 23:53:26.901104 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.901062 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:26.901230 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.901190 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:26.901263 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.901250 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:26.901758 ip-10-0-135-201 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:26.902977 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.902961 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:26.904033 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.904018 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:26.907699 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.907649 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:26.908438 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.908188 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:26.908991 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.908976 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:26.909088 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.909079 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:26.912762 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.912737 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:26.912762 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.908609 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-201.ec2.internal.18a9701f1ca0e2aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-201.ec2.internal,UID:ip-10-0-135-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-201.ec2.internal,},FirstTimestamp:2026-04-24 23:53:26.90097425 +0000 UTC m=+0.424258465,LastTimestamp:2026-04-24 23:53:26.90097425 +0000 
UTC m=+0.424258465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-201.ec2.internal,}" Apr 24 23:53:26.913082 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.913063 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:26.914021 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.913988 2567 factory.go:55] Registering systemd factory Apr 24 23:53:26.914109 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914025 2567 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:26.914443 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914406 2567 factory.go:153] Registering CRI-O factory Apr 24 23:53:26.914563 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914552 2567 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:26.914726 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914707 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:26.914835 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914825 2567 factory.go:103] Registering Raw factory Apr 24 23:53:26.914984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914958 2567 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:26.915493 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.915477 2567 manager.go:319] Starting recovery of all containers Apr 24 23:53:26.915638 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.914566 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:26.915638 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.915641 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:26.915900 ip-10-0-135-201 
kubenswrapper[2567]: E0424 23:53:26.914758 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:26.916424 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.916396 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 23:53:26.916544 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.916522 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 23:53:26.923392 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.923191 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-48khz" Apr 24 23:53:26.925586 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.925568 2567 manager.go:324] Recovery completed Apr 24 23:53:26.930103 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.930087 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:26.931438 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.931421 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-48khz" Apr 24 23:53:26.932772 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.932757 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:26.932813 
ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.932783 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:26.932813 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.932794 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:26.933356 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.933340 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:26.933356 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.933354 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:26.933459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.933383 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:26.934895 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.934831 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-201.ec2.internal.18a9701f1e861276 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-201.ec2.internal,UID:ip-10-0-135-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-201.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-201.ec2.internal,},FirstTimestamp:2026-04-24 23:53:26.932771446 +0000 UTC m=+0.456055661,LastTimestamp:2026-04-24 23:53:26.932771446 +0000 UTC m=+0.456055661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-201.ec2.internal,}" Apr 24 23:53:26.936635 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.936622 2567 
policy_none.go:49] "None policy: Start" Apr 24 23:53:26.936635 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.936637 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:26.936715 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.936647 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:26.975018 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975001 2567 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.975032 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975041 2567 server.go:85] "Starting device plugin registration server" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975279 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975293 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975394 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975465 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:26.975473 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.976004 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 23:53:26.998427 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:26.976044 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:27.012830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.012793 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:27.013978 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.013949 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:27.013978 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.013982 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:27.014134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.014001 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:53:27.014134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.014007 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:27.014134 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.014040 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:27.020462 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.020443 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:27.076345 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.076269 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:27.077437 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.077419 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:27.077521 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.077448 2567 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:27.077521 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.077459 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:27.077521 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.077487 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.083719 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.083705 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.083763 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.083727 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-201.ec2.internal\": node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:27.097847 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.097830 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:27.115115 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.115090 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal"] Apr 24 23:53:27.115183 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.115173 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:27.116625 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.116605 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:27.116715 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.116633 2567 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:27.116715 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.116649 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:27.118835 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.118823 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:27.118988 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.118970 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.119048 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119008 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:27.119615 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119598 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:27.119689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119628 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:27.119689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119639 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:27.119689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119605 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:27.119775 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119704 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 24 23:53:27.119775 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.119716 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:27.121681 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.121665 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.121753 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.121693 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:27.122332 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.122318 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:27.122424 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.122344 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:27.122424 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.122356 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:27.145010 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.144986 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-201.ec2.internal\" not found" node="ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.150783 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.150494 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-201.ec2.internal\" not found" node="ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.198437 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.198404 2567 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:27.216837 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.216804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.216923 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.216843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/32797e584e4864d985182231bd63814e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-201.ec2.internal\" (UID: \"32797e584e4864d985182231bd63814e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.216923 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.216861 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" Apr 24 23:53:27.298961 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.298921 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found" Apr 24 23:53:27.317405 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317355 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.317522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.317522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317445 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.317522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317480 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/180c36d91721dcc56c12d3f2d3227bee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal\" (UID: \"180c36d91721dcc56c12d3f2d3227bee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.317522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317491 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/32797e584e4864d985182231bd63814e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-201.ec2.internal\" (UID: \"32797e584e4864d985182231bd63814e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.317522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.317516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/32797e584e4864d985182231bd63814e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-201.ec2.internal\" (UID: \"32797e584e4864d985182231bd63814e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.399903 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.399837 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.447416 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.447383 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.452918 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.452895 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.500594 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.500557 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.601093 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.601059 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.701706 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.701603 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.802202 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.802172 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.823627 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.823607 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:27.824144 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.823734 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:27.903433 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:27.903401 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-201.ec2.internal\" not found"
Apr 24 23:53:27.906613 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.906593 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:27.908241 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.908225 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:27.913541 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.913522 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.920750 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.920719 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:27.927170 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.927154 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:27.928824 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.928808 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal"
Apr 24 23:53:27.933827 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.933799 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:26 +0000 UTC" deadline="2027-12-22 15:56:13.261796801 +0000 UTC"
Apr 24 23:53:27.933827 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.933823 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14560h2m45.327976369s"
Apr 24 23:53:27.937618 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.937604 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:27.942658 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.942635 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vm6dn"
Apr 24 23:53:27.951669 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:27.951655 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vm6dn"
Apr 24 23:53:28.008397 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:28.008353 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod180c36d91721dcc56c12d3f2d3227bee.slice/crio-90533d9b7198cd638b83b7d1bab717aebf1c766754435d837cf0abe0fea62772 WatchSource:0}: Error finding container 90533d9b7198cd638b83b7d1bab717aebf1c766754435d837cf0abe0fea62772: Status 404 returned error can't find the container with id 90533d9b7198cd638b83b7d1bab717aebf1c766754435d837cf0abe0fea62772
Apr 24 23:53:28.008895 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:28.008872 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32797e584e4864d985182231bd63814e.slice/crio-6b44fda44e9e567d6a571cf4450e5a98bd227ab134fa60881783302537f7c990 WatchSource:0}: Error finding container 6b44fda44e9e567d6a571cf4450e5a98bd227ab134fa60881783302537f7c990: Status 404 returned error can't find the container with id 6b44fda44e9e567d6a571cf4450e5a98bd227ab134fa60881783302537f7c990
Apr 24 23:53:28.013353 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.013339 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:53:28.013679 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.013662 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:28.016496 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.016459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" event={"ID":"180c36d91721dcc56c12d3f2d3227bee","Type":"ContainerStarted","Data":"90533d9b7198cd638b83b7d1bab717aebf1c766754435d837cf0abe0fea62772"}
Apr 24 23:53:28.017272 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.017253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal" event={"ID":"32797e584e4864d985182231bd63814e","Type":"ContainerStarted","Data":"6b44fda44e9e567d6a571cf4450e5a98bd227ab134fa60881783302537f7c990"}
Apr 24 23:53:28.334969 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.334890 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:28.890551 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.890509 2567 apiserver.go:52] "Watching apiserver"
Apr 24 23:53:28.898541 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.898507 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:53:28.899444 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.899420 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-nb82k","openshift-dns/node-resolver-75bcc","openshift-image-registry/node-ca-cjk8s","openshift-multus/multus-additional-cni-plugins-mmdsm","openshift-multus/multus-x9qdz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal","openshift-multus/network-metrics-daemon-7wg4q","openshift-network-diagnostics/network-check-target-f9vfv","openshift-network-operator/iptables-alerter-2m9kc","openshift-ovn-kubernetes/ovnkube-node-7hz6p","kube-system/konnectivity-agent-ckn8x","kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"]
Apr 24 23:53:28.903956 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.903937 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.906041 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.906020 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:28.906822 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.906803 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.906822 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.906809 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.906952 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.906839 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tzfnr\""
Apr 24 23:53:28.908132 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.908112 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cjk8s"
Apr 24 23:53:28.908221 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.908200 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.908435 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.908420 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d9czf\""
Apr 24 23:53:28.908493 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.908479 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.911843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.910307 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:53:28.911843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.910318 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n62l4\""
Apr 24 23:53:28.911843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.910608 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.911843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.910699 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.911843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.910799 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.913076 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913054 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.913505 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.913505 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913325 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 23:53:28.913505 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913449 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 23:53:28.913505 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913488 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.913747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913674 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 23:53:28.913747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.913730 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-r5stf\""
Apr 24 23:53:28.915597 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.915578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:28.915689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.915664 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:28.915750 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:28.915681 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:28.915750 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:28.915721 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:28.915750 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.915743 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 23:53:28.915987 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.915968 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fgsl7\""
Apr 24 23:53:28.917885 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.917868 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2m9kc"
Apr 24 23:53:28.920100 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.920082 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n52cn\""
Apr 24 23:53:28.920219 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.920127 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.920219 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.920210 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.920417 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.920402 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 23:53:28.920533 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.920514 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:28.922701 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.922682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ckn8x"
Apr 24 23:53:28.923046 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.923028 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:53:28.924526 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924507 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:53:28.924630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924541 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q4glt\""
Apr 24 23:53:28.924689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924651 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:53:28.924743 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924706 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:53:28.924743 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924680 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:53:28.924840 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924774 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:53:28.924906 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.924891 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:28.925331 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925312 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:53:28.925432 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49ea4681-f36b-4e20-a2b5-d76f46611b7a-host\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s"
Apr 24 23:53:28.925432 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbcl\" (UniqueName: \"kubernetes.io/projected/49ea4681-f36b-4e20-a2b5-d76f46611b7a-kube-api-access-wfbcl\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s"
Apr 24 23:53:28.925432 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925420 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:53:28.925590 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925456 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l24q2\""
Apr 24 23:53:28.925590 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-system-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925590 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-kubernetes\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.925590 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-var-lib-kubelet\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925611 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-tuned\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-tmp\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925679 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925704 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-netns\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-multus\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-kubelet\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-etc-kubernetes\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/323b58fb-a204-4400-a836-973ccf33cd8e-tmp-dir\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925859 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49ea4681-f36b-4e20-a2b5-d76f46611b7a-serviceca\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-os-release\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cni-binary-copy\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925953 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfgt\" (UniqueName: \"kubernetes.io/projected/55dc1a60-4f7d-4366-aca1-303ce72f4d84-kube-api-access-ggfgt\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.925979 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.925979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxs4\" (UniqueName: \"kubernetes.io/projected/323b58fb-a204-4400-a836-973ccf33cd8e-kube-api-access-cdxs4\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926029 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-system-cni-dir\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926112 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cnibin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926135 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-socket-dir-parent\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926199 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-systemd\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-run\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-multus-certs\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926302 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926291 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysconfig\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926315 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-conf\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926335 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-sys\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926372 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-cnibin\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9ft\" (UniqueName: \"kubernetes.io/projected/1134624f-34d1-4f6e-8821-435df2b54c9b-kube-api-access-2d9ft\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-hostroot\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/323b58fb-a204-4400-a836-973ccf33cd8e-hosts-file\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-k8s-cni-cncf-io\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-host\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmfn2\" (UniqueName: \"kubernetes.io/projected/50d11a09-c912-418a-ab1b-e1f5272b1d2f-kube-api-access-bmfn2\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-os-release\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-bin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-conf-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-daemon-config\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:28.926799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9cq\" (UniqueName: \"kubernetes.io/projected/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-kube-api-access-2q9cq\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:28.927446 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926788 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName:
\"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-modprobe-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:28.927446 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.926815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-lib-modules\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:28.927536 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.927524 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 23:53:28.927718 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.927697 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:28.927718 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.927705 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5b882\"" Apr 24 23:53:28.927839 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.927703 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:28.952662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.952631 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:27 +0000 UTC" deadline="2027-11-25 13:15:23.572822036 +0000 UTC" Apr 24 23:53:28.952662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:28.952663 2567 certificate_manager.go:431] "Waiting for next certificate 
rotation" logger="kubernetes.io/kubelet-serving" sleep="13909h21m54.620163017s" Apr 24 23:53:29.015348 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.015316 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:29.027538 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49ea4681-f36b-4e20-a2b5-d76f46611b7a-host\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.027707 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027553 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-registration-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.027707 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49ea4681-f36b-4e20-a2b5-d76f46611b7a-host\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.027707 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.027707 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027625 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-multus\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.027707 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-etc-kubernetes\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.027922 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-multus\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.027922 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/323b58fb-a204-4400-a836-973ccf33cd8e-tmp-dir\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc" Apr 24 23:53:29.027922 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-etc-kubernetes\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.027922 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027872 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/119f77e1-fd03-4239-a927-07a04816a852-iptables-alerter-script\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.028086 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.027976 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:29.028086 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49ea4681-f36b-4e20-a2b5-d76f46611b7a-serviceca\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.028086 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028060 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfgt\" (UniqueName: \"kubernetes.io/projected/55dc1a60-4f7d-4366-aca1-303ce72f4d84-kube-api-access-ggfgt\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.028215 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028089 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 
23:53:29.028215 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-system-cni-dir\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.028215 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.028215 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-system-cni-dir\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.028414 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028240 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-netns\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.028414 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.028260 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:29.028414 ip-10-0-135-201 kubenswrapper[2567]: I0424 
23:53:29.028270 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/323b58fb-a204-4400-a836-973ccf33cd8e-tmp-dir\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc" Apr 24 23:53:29.028414 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-device-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.028414 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.028347 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:29.528324755 +0000 UTC m=+3.051608980 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-multus-certs\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-slash\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/49ea4681-f36b-4e20-a2b5-d76f46611b7a-serviceca\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-var-lib-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028510 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-multus-certs\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-log-socket\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028592 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/119f77e1-fd03-4239-a927-07a04816a852-host-slash\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.028630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028609 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-agent-certs\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:29.029023 
ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9ft\" (UniqueName: \"kubernetes.io/projected/1134624f-34d1-4f6e-8821-435df2b54c9b-kube-api-access-2d9ft\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-hostroot\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028867 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-etc-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028906 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-ovn\") pod \"ovnkube-node-7hz6p\" (UID: 
\"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-hostroot\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028935 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-etc-selinux\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-bin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.028995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-daemon-config\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-cni-bin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9cq\" (UniqueName: \"kubernetes.io/projected/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-kube-api-access-2q9cq\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-modprobe-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-lib-modules\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-modprobe-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029258 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-lib-modules\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-kubelet\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029395 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-bin\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-netd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029473 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029468 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wfbcl\" (UniqueName: \"kubernetes.io/projected/49ea4681-f36b-4e20-a2b5-d76f46611b7a-kube-api-access-wfbcl\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-system-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029526 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-kubernetes\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovn-node-metrics-cert\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029585 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-sys-fs\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029593 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-system-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-netns\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-daemon-config\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-kubelet\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-netns\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-node-log\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-kubernetes\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvch\" (UniqueName: \"kubernetes.io/projected/281d90a3-293d-45ac-8b4c-87bdc25f3882-kube-api-access-rsvch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029707 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-var-lib-kubelet\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zs8v\" (UniqueName: \"kubernetes.io/projected/119f77e1-fd03-4239-a927-07a04816a852-kube-api-access-8zs8v\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-konnectivity-ca\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-os-release\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.029934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cni-binary-copy\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxs4\" (UniqueName: \"kubernetes.io/projected/323b58fb-a204-4400-a836-973ccf33cd8e-kube-api-access-cdxs4\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-os-release\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-systemd-units\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.029934 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-script-lib\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cnibin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-socket-dir-parent\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-cni-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030189 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-systemd\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-run\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030244 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cnibin\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-sys\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-socket-dir-parent\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysconfig\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030325 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-d\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.030639 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55dc1a60-4f7d-4366-aca1-303ce72f4d84-cni-binary-copy\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-sys\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-conf\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-run\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030399 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysconfig\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-systemd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030479 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-sysctl-conf\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-config\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030515 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-systemd\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030573 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-socket-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcdk\" (UniqueName: \"kubernetes.io/projected/e4504335-0abc-4d74-83b7-89d51ce42839-kube-api-access-qkcdk\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-cnibin\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/323b58fb-a204-4400-a836-973ccf33cd8e-hosts-file\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-k8s-cni-cncf-io\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030706 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-cnibin\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031328 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-host\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-host-run-k8s-cni-cncf-io\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/323b58fb-a204-4400-a836-973ccf33cd8e-hosts-file\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmfn2\" (UniqueName: \"kubernetes.io/projected/50d11a09-c912-418a-ab1b-e1f5272b1d2f-kube-api-access-bmfn2\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030785 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-host\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-env-overrides\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-os-release\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-conf-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-var-lib-kubelet\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.030995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-tuned\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1134624f-34d1-4f6e-8821-435df2b54c9b-os-release\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55dc1a60-4f7d-4366-aca1-303ce72f4d84-multus-conf-dir\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50d11a09-c912-418a-ab1b-e1f5272b1d2f-var-lib-kubelet\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-tmp\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031315 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 23:53:29.031997 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.031437 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1134624f-34d1-4f6e-8821-435df2b54c9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.034519 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.034498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-tmp\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.034971 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.034956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50d11a09-c912-418a-ab1b-e1f5272b1d2f-etc-tuned\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.039801 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.039774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9ft\" (UniqueName: \"kubernetes.io/projected/1134624f-34d1-4f6e-8821-435df2b54c9b-kube-api-access-2d9ft\") pod \"multus-additional-cni-plugins-mmdsm\" (UID: \"1134624f-34d1-4f6e-8821-435df2b54c9b\") " pod="openshift-multus/multus-additional-cni-plugins-mmdsm"
Apr 24 23:53:29.039917 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.039784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfgt\" (UniqueName: \"kubernetes.io/projected/55dc1a60-4f7d-4366-aca1-303ce72f4d84-kube-api-access-ggfgt\") pod \"multus-x9qdz\" (UID: \"55dc1a60-4f7d-4366-aca1-303ce72f4d84\") " pod="openshift-multus/multus-x9qdz"
Apr 24 23:53:29.040597 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.040570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxs4\" (UniqueName: \"kubernetes.io/projected/323b58fb-a204-4400-a836-973ccf33cd8e-kube-api-access-cdxs4\") pod \"node-resolver-75bcc\" (UID: \"323b58fb-a204-4400-a836-973ccf33cd8e\") " pod="openshift-dns/node-resolver-75bcc"
Apr 24 23:53:29.040743 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.040724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmfn2\" (UniqueName: \"kubernetes.io/projected/50d11a09-c912-418a-ab1b-e1f5272b1d2f-kube-api-access-bmfn2\") pod \"tuned-nb82k\" (UID: \"50d11a09-c912-418a-ab1b-e1f5272b1d2f\") " pod="openshift-cluster-node-tuning-operator/tuned-nb82k"
Apr 24 23:53:29.041187 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.041161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9cq\" (UniqueName: \"kubernetes.io/projected/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-kube-api-access-2q9cq\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:29.042004 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.041983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbcl\" (UniqueName: \"kubernetes.io/projected/49ea4681-f36b-4e20-a2b5-d76f46611b7a-kube-api-access-wfbcl\") pod \"node-ca-cjk8s\" (UID: \"49ea4681-f36b-4e20-a2b5-d76f46611b7a\") " pod="openshift-image-registry/node-ca-cjk8s"
Apr 24 23:53:29.131957 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.131921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-systemd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.131985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-config\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132007 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-systemd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-socket-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcdk\" (UniqueName: \"kubernetes.io/projected/e4504335-0abc-4d74-83b7-89d51ce42839-kube-api-access-qkcdk\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132101 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132126 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-env-overrides\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132134 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-socket-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132179 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-ovn-kubernetes\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132258 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-registration-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/119f77e1-fd03-4239-a927-07a04816a852-iptables-alerter-script\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132322 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-netns\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-registration-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132404 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-device-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-slash\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-var-lib-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132487 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-log-socket\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132539 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/119f77e1-fd03-4239-a927-07a04816a852-host-slash\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132565 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-agent-certs\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132593 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-var-lib-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.132621 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-etc-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132638 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-run-netns\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-env-overrides\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424
23:53:29.132665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-etc-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-ovn\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-etc-selinux\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-openvswitch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-kubelet\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:53:29.132760 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-bin\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132766 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/119f77e1-fd03-4239-a927-07a04816a852-host-slash\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-netd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132805 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-log-socket\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovn-node-metrics-cert\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:53:29.132839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-sys-fs\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132851 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-device-dir\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-node-log\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132878 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-kubelet\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.133419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-slash\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:53:29.132892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvch\" (UniqueName: \"kubernetes.io/projected/281d90a3-293d-45ac-8b4c-87bdc25f3882-kube-api-access-rsvch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/119f77e1-fd03-4239-a927-07a04816a852-iptables-alerter-script\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-run-ovn\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.132979 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-netd\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.133423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-sys-fs\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.134129 ip-10-0-135-201 
kubenswrapper[2567]: I0424 23:53:29.133512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-host-cni-bin\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.133574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4504335-0abc-4d74-83b7-89d51ce42839-etc-selinux\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.133579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-config\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.133740 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zs8v\" (UniqueName: \"kubernetes.io/projected/119f77e1-fd03-4239-a927-07a04816a852-kube-api-access-8zs8v\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.134129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.133787 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-konnectivity-ca\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x" Apr 
24 23:53:29.134593 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.134315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-systemd-units\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134593 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.134318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-systemd-units\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134593 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.134420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-script-lib\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.134593 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.134490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/281d90a3-293d-45ac-8b4c-87bdc25f3882-node-log\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.136893 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.135078 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovnkube-script-lib\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 
23:53:29.136893 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.135483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-konnectivity-ca\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:29.136893 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.136462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/281d90a3-293d-45ac-8b4c-87bdc25f3882-ovn-node-metrics-cert\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.137625 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.137603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/217a9b6d-2b5e-4ed8-87d7-5820bac053c5-agent-certs\") pod \"konnectivity-agent-ckn8x\" (UID: \"217a9b6d-2b5e-4ed8-87d7-5820bac053c5\") " pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:29.137712 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.137676 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:29.138397 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.138354 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:29.138397 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.138396 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:29.138551 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.138410 2567 
projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:29.138551 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.138485 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:29.638467199 +0000 UTC m=+3.161751405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:29.140701 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.140633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcdk\" (UniqueName: \"kubernetes.io/projected/e4504335-0abc-4d74-83b7-89d51ce42839-kube-api-access-qkcdk\") pod \"aws-ebs-csi-driver-node-qq9fz\" (UID: \"e4504335-0abc-4d74-83b7-89d51ce42839\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.140974 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.140956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvch\" (UniqueName: \"kubernetes.io/projected/281d90a3-293d-45ac-8b4c-87bdc25f3882-kube-api-access-rsvch\") pod \"ovnkube-node-7hz6p\" (UID: \"281d90a3-293d-45ac-8b4c-87bdc25f3882\") " pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.142717 ip-10-0-135-201 
kubenswrapper[2567]: I0424 23:53:29.142674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zs8v\" (UniqueName: \"kubernetes.io/projected/119f77e1-fd03-4239-a927-07a04816a852-kube-api-access-8zs8v\") pod \"iptables-alerter-2m9kc\" (UID: \"119f77e1-fd03-4239-a927-07a04816a852\") " pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.216191 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.216156 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nb82k" Apr 24 23:53:29.224955 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.224930 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cjk8s" Apr 24 23:53:29.232519 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.232497 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-75bcc" Apr 24 23:53:29.239156 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.239135 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" Apr 24 23:53:29.244715 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.244692 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x9qdz" Apr 24 23:53:29.252189 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.252173 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2m9kc" Apr 24 23:53:29.257796 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.257778 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:29.264270 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.264250 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:29.268845 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.268827 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" Apr 24 23:53:29.537226 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.537155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:29.537386 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.537282 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:29.537386 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.537338 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:30.537324144 +0000 UTC m=+4.060608346 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:29.677040 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.677015 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55dc1a60_4f7d_4366_aca1_303ce72f4d84.slice/crio-1e0886d05085b1ebbd00dddb55270cb6f9295436962ae2cd25e3081f0a5fb819 WatchSource:0}: Error finding container 1e0886d05085b1ebbd00dddb55270cb6f9295436962ae2cd25e3081f0a5fb819: Status 404 returned error can't find the container with id 1e0886d05085b1ebbd00dddb55270cb6f9295436962ae2cd25e3081f0a5fb819 Apr 24 23:53:29.678661 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.678581 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d11a09_c912_418a_ab1b_e1f5272b1d2f.slice/crio-4b9a5a5949696f35b1bc6f0f188ac31050c535b2c06d474a8d61b5065828e39e WatchSource:0}: Error finding container 4b9a5a5949696f35b1bc6f0f188ac31050c535b2c06d474a8d61b5065828e39e: Status 404 returned error can't find the container with id 4b9a5a5949696f35b1bc6f0f188ac31050c535b2c06d474a8d61b5065828e39e Apr 24 23:53:29.682343 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.682317 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ea4681_f36b_4e20_a2b5_d76f46611b7a.slice/crio-7288057c3111e9e191239a7d3ba268aa1b73fa2d73bedb61e52212397b7c3042 WatchSource:0}: Error finding container 7288057c3111e9e191239a7d3ba268aa1b73fa2d73bedb61e52212397b7c3042: Status 404 returned error can't find the container with id 7288057c3111e9e191239a7d3ba268aa1b73fa2d73bedb61e52212397b7c3042 Apr 24 23:53:29.682827 
ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.682805 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1134624f_34d1_4f6e_8821_435df2b54c9b.slice/crio-f9eb315acb54228224dc805cfa79f9f18393b995372e33c2925e2e17ca490c08 WatchSource:0}: Error finding container f9eb315acb54228224dc805cfa79f9f18393b995372e33c2925e2e17ca490c08: Status 404 returned error can't find the container with id f9eb315acb54228224dc805cfa79f9f18393b995372e33c2925e2e17ca490c08 Apr 24 23:53:29.683787 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.683732 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4504335_0abc_4d74_83b7_89d51ce42839.slice/crio-bcd996c26b407a938cc125d9970513079592f8dfea34a80c50ed251c527860d3 WatchSource:0}: Error finding container bcd996c26b407a938cc125d9970513079592f8dfea34a80c50ed251c527860d3: Status 404 returned error can't find the container with id bcd996c26b407a938cc125d9970513079592f8dfea34a80c50ed251c527860d3 Apr 24 23:53:29.684845 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.684743 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119f77e1_fd03_4239_a927_07a04816a852.slice/crio-43d984a70e5e57cbd16a35ac390696260f37ae1b1d57fd26172649864971379d WatchSource:0}: Error finding container 43d984a70e5e57cbd16a35ac390696260f37ae1b1d57fd26172649864971379d: Status 404 returned error can't find the container with id 43d984a70e5e57cbd16a35ac390696260f37ae1b1d57fd26172649864971379d Apr 24 23:53:29.685578 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.685501 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281d90a3_293d_45ac_8b4c_87bdc25f3882.slice/crio-73096e4b43ab9dea96e8ea690a61e0b7cdc35ffefe2c16d56bf5ff706d6652fe WatchSource:0}: 
Error finding container 73096e4b43ab9dea96e8ea690a61e0b7cdc35ffefe2c16d56bf5ff706d6652fe: Status 404 returned error can't find the container with id 73096e4b43ab9dea96e8ea690a61e0b7cdc35ffefe2c16d56bf5ff706d6652fe Apr 24 23:53:29.687704 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.687511 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod323b58fb_a204_4400_a836_973ccf33cd8e.slice/crio-3adf8fe05795f06506f76db5d246b6afe56ea4ecbd035e7c5c27c7e4b780a78c WatchSource:0}: Error finding container 3adf8fe05795f06506f76db5d246b6afe56ea4ecbd035e7c5c27c7e4b780a78c: Status 404 returned error can't find the container with id 3adf8fe05795f06506f76db5d246b6afe56ea4ecbd035e7c5c27c7e4b780a78c Apr 24 23:53:29.688018 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:53:29.687955 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217a9b6d_2b5e_4ed8_87d7_5820bac053c5.slice/crio-8c839a53ff3c586395782baf22363e9c2bee9fe3c0849f107e8b0a8935d446b6 WatchSource:0}: Error finding container 8c839a53ff3c586395782baf22363e9c2bee9fe3c0849f107e8b0a8935d446b6: Status 404 returned error can't find the container with id 8c839a53ff3c586395782baf22363e9c2bee9fe3c0849f107e8b0a8935d446b6 Apr 24 23:53:29.738714 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.738542 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:29.738714 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.738703 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:29.738714 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.738722 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:29.738968 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.738732 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:29.738968 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:29.738790 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:30.738765594 +0000 UTC m=+4.262049796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:29.953959 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.953823 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:27 +0000 UTC" deadline="2028-01-14 22:21:43.502415716 +0000 UTC"
Apr 24 23:53:29.953959 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:29.953864 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15118h28m13.548555504s"
Apr 24 23:53:30.014354 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.014321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:30.014549 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.014477 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:30.031188 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.031148 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2m9kc" event={"ID":"119f77e1-fd03-4239-a927-07a04816a852","Type":"ContainerStarted","Data":"43d984a70e5e57cbd16a35ac390696260f37ae1b1d57fd26172649864971379d"}
Apr 24 23:53:30.034860 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.034825 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" event={"ID":"e4504335-0abc-4d74-83b7-89d51ce42839","Type":"ContainerStarted","Data":"bcd996c26b407a938cc125d9970513079592f8dfea34a80c50ed251c527860d3"}
Apr 24 23:53:30.040993 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.039204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cjk8s" event={"ID":"49ea4681-f36b-4e20-a2b5-d76f46611b7a","Type":"ContainerStarted","Data":"7288057c3111e9e191239a7d3ba268aa1b73fa2d73bedb61e52212397b7c3042"}
Apr 24 23:53:30.043285 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.042846 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nb82k" event={"ID":"50d11a09-c912-418a-ab1b-e1f5272b1d2f","Type":"ContainerStarted","Data":"4b9a5a5949696f35b1bc6f0f188ac31050c535b2c06d474a8d61b5065828e39e"}
Apr 24 23:53:30.045834 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.045777 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x9qdz" event={"ID":"55dc1a60-4f7d-4366-aca1-303ce72f4d84","Type":"ContainerStarted","Data":"1e0886d05085b1ebbd00dddb55270cb6f9295436962ae2cd25e3081f0a5fb819"}
Apr 24 23:53:30.053129 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.053076 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ckn8x" event={"ID":"217a9b6d-2b5e-4ed8-87d7-5820bac053c5","Type":"ContainerStarted","Data":"8c839a53ff3c586395782baf22363e9c2bee9fe3c0849f107e8b0a8935d446b6"}
Apr 24 23:53:30.063167 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.062067 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75bcc" event={"ID":"323b58fb-a204-4400-a836-973ccf33cd8e","Type":"ContainerStarted","Data":"3adf8fe05795f06506f76db5d246b6afe56ea4ecbd035e7c5c27c7e4b780a78c"}
Apr 24 23:53:30.063891 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.063857 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerStarted","Data":"f9eb315acb54228224dc805cfa79f9f18393b995372e33c2925e2e17ca490c08"}
Apr 24 23:53:30.069301 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.069264 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal" event={"ID":"32797e584e4864d985182231bd63814e","Type":"ContainerStarted","Data":"a000f44d8a4c3941e760a769abf4f184376ce17143fa81344eac9b33b0bffcf5"}
Apr 24 23:53:30.072689 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.072645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"73096e4b43ab9dea96e8ea690a61e0b7cdc35ffefe2c16d56bf5ff706d6652fe"}
Apr 24 23:53:30.082716 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.082641 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-201.ec2.internal" podStartSLOduration=3.082622603 podStartE2EDuration="3.082622603s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:30.082157282 +0000 UTC m=+3.605441515" watchObservedRunningTime="2026-04-24 23:53:30.082622603 +0000 UTC m=+3.605906829"
Apr 24 23:53:30.544506 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.544466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:30.544682 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.544616 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:30.544754 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.544685 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:32.544666079 +0000 UTC m=+6.067950284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:30.745843 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:30.745765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:30.746005 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.745965 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:30.746005 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.745983 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:30.746005 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.745995 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:30.746156 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:30.746051 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:32.746031986 +0000 UTC m=+6.269316207 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:31.017165 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:31.017129 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:31.017654 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:31.017275 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:31.096060 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:31.095762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" event={"ID":"180c36d91721dcc56c12d3f2d3227bee","Type":"ContainerStarted","Data":"f20002142e98bc0bf957bb91def9bd5712bb17be92cc698e53c0649e058b0af3"}
Apr 24 23:53:32.014894 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:32.014858 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:32.015096 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.014990 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:32.105254 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:32.104463 2567 generic.go:358] "Generic (PLEG): container finished" podID="180c36d91721dcc56c12d3f2d3227bee" containerID="f20002142e98bc0bf957bb91def9bd5712bb17be92cc698e53c0649e058b0af3" exitCode=0
Apr 24 23:53:32.105254 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:32.104520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" event={"ID":"180c36d91721dcc56c12d3f2d3227bee","Type":"ContainerDied","Data":"f20002142e98bc0bf957bb91def9bd5712bb17be92cc698e53c0649e058b0af3"}
Apr 24 23:53:32.559709 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:32.559669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:32.559915 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.559848 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:32.559985 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.559933 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:36.559912075 +0000 UTC m=+10.083196287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:32.761284 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:32.761188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:32.761480 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.761411 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:32.761480 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.761431 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:32.761480 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.761444 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:32.761642 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:32.761505 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:36.761485594 +0000 UTC m=+10.284769798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:33.016006 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:33.015458 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:33.016006 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:33.015606 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:34.014405 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:34.014358 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:34.014758 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:34.014480 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:35.016043 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:35.015534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:35.016043 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:35.015682 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:36.015077 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:36.014945 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:36.015287 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.015087 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:36.601030 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:36.594178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:36.601525 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.601162 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:36.601525 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.601274 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.601251788 +0000 UTC m=+18.124535993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:36.803047 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:36.802934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:36.803235 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.803137 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:36.803235 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.803164 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:36.803235 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.803180 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:36.803408 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:36.803246 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:44.803226033 +0000 UTC m=+18.326510260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:37.016170 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:37.016138 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:37.016377 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:37.016257 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:38.014913 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.014869 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:38.015405 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.015010 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:38.131987 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.131904 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6bc8h"]
Apr 24 23:53:38.134875 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.134848 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.135016 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.134931 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d"
Apr 24 23:53:38.215611 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.215561 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-dbus\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.215780 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.215621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-kubelet-config\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.215780 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.215702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.317157 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.317066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.317157 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.317116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-dbus\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.317157 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.317153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-kubelet-config\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.317453 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.317222 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:38.317453 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.317277 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.817263119 +0000 UTC m=+12.340547325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:38.317453 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.317354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-dbus\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.317453 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.317442 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-kubelet-config\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.821122 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:38.821087 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:38.821311 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.821226 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:38.821311 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:38.821302 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:39.821283109 +0000 UTC m=+13.344567311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:39.014891 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:39.014802 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:39.015048 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:39.015005 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:39.828453 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:39.828413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:39.828619 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:39.828558 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:39.828684 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:39.828620 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:41.828606264 +0000 UTC m=+15.351890466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:40.015155 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:40.015112 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:40.015644 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:40.015113 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:40.015644 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:40.015251 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:40.015644 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:40.015331 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d"
Apr 24 23:53:41.014713 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:41.014678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:41.014987 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:41.014817 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:41.840929 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:41.840886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:41.841410 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:41.841021 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:41.841410 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:41.841096 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:45.841077035 +0000 UTC m=+19.364361240 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:42.014418 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:42.014380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:42.014576 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:42.014380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:42.014576 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:42.014515 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d"
Apr 24 23:53:42.014701 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:42.014620 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:43.015026 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:43.014989 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:43.015436 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:43.015127 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8"
Apr 24 23:53:44.015111 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:44.015060 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:53:44.015111 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:44.015059 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:53:44.015785 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.015198 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d"
Apr 24 23:53:44.015785 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.015271 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03"
Apr 24 23:53:44.662658 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:44.662612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:53:44.662942 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.662778 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:44.662942 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.662854 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.662838218 +0000 UTC m=+34.186122419 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:44.864454 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:44.864416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:44.864650 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.864600 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:44.864650 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.864622 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:44.864650 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.864632 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ljwnq for pod openshift-network-diagnostics/network-check-target-f9vfv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:44.864815 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:44.864694 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq podName:c44399ed-9019-430d-83d7-8cde0e6f0d03 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:00.864676345 +0000 UTC m=+34.387960548 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljwnq" (UniqueName: "kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq") pod "network-check-target-f9vfv" (UID: "c44399ed-9019-430d-83d7-8cde0e6f0d03") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:45.015154 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:45.015066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:45.015611 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:45.015206 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:45.872609 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:45.872572 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:45.872785 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:45.872714 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:45.872785 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:45.872771 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:53.872758497 +0000 UTC m=+27.396042699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:46.014799 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:46.014774 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:46.014960 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:46.014817 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:46.014960 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:46.014901 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:46.015041 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:46.014971 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:47.003220 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:47.003184 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281d90a3_293d_45ac_8b4c_87bdc25f3882.slice/crio-conmon-d7436439c2d96b95f3885d7ea24a211b8f99e914482f232e41d76d495e94c012.scope\": RecentStats: unable to find data in memory cache]" Apr 24 23:53:47.016833 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.016809 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:47.016999 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:47.016959 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:47.133197 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.133059 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 24 23:53:47.133602 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.133578 2567 generic.go:358] "Generic (PLEG): container finished" podID="281d90a3-293d-45ac-8b4c-87bdc25f3882" containerID="d7436439c2d96b95f3885d7ea24a211b8f99e914482f232e41d76d495e94c012" exitCode=1 Apr 24 23:53:47.133724 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.133654 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"94394c463ea2af2c20503451db0827d4be2d86c15ba972bee04f40bfb6e84eb0"} Apr 24 23:53:47.133724 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.133697 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerDied","Data":"d7436439c2d96b95f3885d7ea24a211b8f99e914482f232e41d76d495e94c012"} Apr 24 23:53:47.133724 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.133717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" 
event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"940d668c9fd0f111a7a7c930ccb1581c7b075d53c6a313851630c51622eb8bb0"} Apr 24 23:53:47.135085 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.135058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" event={"ID":"e4504335-0abc-4d74-83b7-89d51ce42839","Type":"ContainerStarted","Data":"5831bd7042cccd56e8f5e109b4a87265c4ee2f3f8148bbfe3655b7b637ff54bb"} Apr 24 23:53:47.136533 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.136485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cjk8s" event={"ID":"49ea4681-f36b-4e20-a2b5-d76f46611b7a","Type":"ContainerStarted","Data":"fd7e36636239aaefe4af6e1c769f6d092a7a7d51b7ffae071aaa44aec5f63dca"} Apr 24 23:53:47.138054 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.138034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nb82k" event={"ID":"50d11a09-c912-418a-ab1b-e1f5272b1d2f","Type":"ContainerStarted","Data":"e894f353b8dd4355a44ea44801e214ace6f4f60a5de2f6ca9554ba380a955b80"} Apr 24 23:53:47.139417 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.139391 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x9qdz" event={"ID":"55dc1a60-4f7d-4366-aca1-303ce72f4d84","Type":"ContainerStarted","Data":"821df6719c8bc80641d08abb22b6dc81704ddf3bb84165977e2154b623bfdad3"} Apr 24 23:53:47.140901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.140880 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" event={"ID":"180c36d91721dcc56c12d3f2d3227bee","Type":"ContainerStarted","Data":"ffa628a139ca915595d58c9e5895a0c1a055990c4f6ad59c1c9d8f6c13236e55"} Apr 24 23:53:47.142139 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.142118 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ckn8x" event={"ID":"217a9b6d-2b5e-4ed8-87d7-5820bac053c5","Type":"ContainerStarted","Data":"a14482d802bce3b9e329590f48403a883807f4754de490227ec4d888b5017b2f"} Apr 24 23:53:47.143555 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.143535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75bcc" event={"ID":"323b58fb-a204-4400-a836-973ccf33cd8e","Type":"ContainerStarted","Data":"d36b26ad49d56cd8ab6f6278dca47d9cde8fa43186a40f61c520d9ded07438d7"} Apr 24 23:53:47.145067 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.145041 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="01e58a38c511c33bcb212056f7a97263a89db9cf5d6e48aaa2a18d174b7a80e6" exitCode=0 Apr 24 23:53:47.145147 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.145084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"01e58a38c511c33bcb212056f7a97263a89db9cf5d6e48aaa2a18d174b7a80e6"} Apr 24 23:53:47.150284 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.150238 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cjk8s" podStartSLOduration=3.429730353 podStartE2EDuration="20.150222755s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.684411821 +0000 UTC m=+3.207696029" lastFinishedPulling="2026-04-24 23:53:46.404904214 +0000 UTC m=+19.928188431" observedRunningTime="2026-04-24 23:53:47.149629834 +0000 UTC m=+20.672914058" watchObservedRunningTime="2026-04-24 23:53:47.150222755 +0000 UTC m=+20.673506981" Apr 24 23:53:47.180291 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.180247 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-x9qdz" podStartSLOduration=3.343224633 podStartE2EDuration="20.180232082s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.679256958 +0000 UTC m=+3.202541160" lastFinishedPulling="2026-04-24 23:53:46.516264392 +0000 UTC m=+20.039548609" observedRunningTime="2026-04-24 23:53:47.179871493 +0000 UTC m=+20.703155716" watchObservedRunningTime="2026-04-24 23:53:47.180232082 +0000 UTC m=+20.703516305" Apr 24 23:53:47.192402 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.192344 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ckn8x" podStartSLOduration=11.283782871 podStartE2EDuration="20.192331994s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.689709398 +0000 UTC m=+3.212993603" lastFinishedPulling="2026-04-24 23:53:38.598258521 +0000 UTC m=+12.121542726" observedRunningTime="2026-04-24 23:53:47.191953049 +0000 UTC m=+20.715237274" watchObservedRunningTime="2026-04-24 23:53:47.192331994 +0000 UTC m=+20.715616218" Apr 24 23:53:47.207705 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.207647 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nb82k" podStartSLOduration=3.371222902 podStartE2EDuration="20.207632627s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.680631688 +0000 UTC m=+3.203915891" lastFinishedPulling="2026-04-24 23:53:46.517041394 +0000 UTC m=+20.040325616" observedRunningTime="2026-04-24 23:53:47.207338504 +0000 UTC m=+20.730622721" watchObservedRunningTime="2026-04-24 23:53:47.207632627 +0000 UTC m=+20.730916848" Apr 24 23:53:47.230526 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.230465 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-201.ec2.internal" podStartSLOduration=20.230443478 podStartE2EDuration="20.230443478s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:47.230437027 +0000 UTC m=+20.753721252" watchObservedRunningTime="2026-04-24 23:53:47.230443478 +0000 UTC m=+20.753727717" Apr 24 23:53:47.264975 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:47.264916 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-75bcc" podStartSLOduration=3.549116046 podStartE2EDuration="20.264896037s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.689122738 +0000 UTC m=+3.212406944" lastFinishedPulling="2026-04-24 23:53:46.404902722 +0000 UTC m=+19.928186935" observedRunningTime="2026-04-24 23:53:47.264575516 +0000 UTC m=+20.787859740" watchObservedRunningTime="2026-04-24 23:53:47.264896037 +0000 UTC m=+20.788180258" Apr 24 23:53:48.014453 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.014409 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:48.014903 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.014413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:48.014903 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:48.014564 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:48.014903 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:48.014639 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:48.031992 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.031965 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:48.032800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.032778 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:48.150259 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.150230 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 24 23:53:48.150712 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.150686 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"e51d218a52c807fbb672a9ac8a67e4a7e73d25f47c152dc28c7a2608fa2036d6"} Apr 24 23:53:48.150806 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.150725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"5610562c7e2c3b7c1361602f5ebc6989b7e1a944c7297a4a769ef2771d7d6d7d"} Apr 24 23:53:48.150806 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.150741 
2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"7d24029171d9e45af6cd890ee131cc1f52d66342734a22fd615e345b3936f00d"} Apr 24 23:53:48.152198 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.152168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2m9kc" event={"ID":"119f77e1-fd03-4239-a927-07a04816a852","Type":"ContainerStarted","Data":"286bb0ced13d078f1709b65254da5c0bf98adf7ab7d7b64b719887e56cf099d8"} Apr 24 23:53:48.181126 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.181064 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2m9kc" podStartSLOduration=4.463002207 podStartE2EDuration="21.181047027s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.687080519 +0000 UTC m=+3.210364725" lastFinishedPulling="2026-04-24 23:53:46.40512533 +0000 UTC m=+19.928409545" observedRunningTime="2026-04-24 23:53:48.180615749 +0000 UTC m=+21.703899998" watchObservedRunningTime="2026-04-24 23:53:48.181047027 +0000 UTC m=+21.704331252" Apr 24 23:53:48.192731 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.192695 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 23:53:48.987483 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.987331 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:53:48.192714606Z","UUID":"344df7c0-f648-439e-8741-9e186ceb2b1b","Handler":null,"Name":"","Endpoint":""} Apr 24 23:53:48.990575 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.990550 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a 
new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 23:53:48.990724 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:48.990583 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 23:53:49.018304 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:49.018272 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:49.018789 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:49.018428 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:49.156984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:49.156836 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:53:49.156984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:49.156830 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" event={"ID":"e4504335-0abc-4d74-83b7-89d51ce42839","Type":"ContainerStarted","Data":"6a83772635aa4eef00e9a62f79042074fa89c16f2eb5b97c5e668a8fb3bb7d6e"} Apr 24 23:53:50.014848 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.014768 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:50.014848 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.014768 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:50.015069 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:50.014882 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:50.015069 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:50.014948 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:50.162160 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.162129 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 24 23:53:50.162608 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.162582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"6d4396eb574c60ecf5d5866ccadbe65302d6bd4c80c5d65a52fe707a37f07623"} Apr 24 23:53:50.164566 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.164520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" event={"ID":"e4504335-0abc-4d74-83b7-89d51ce42839","Type":"ContainerStarted","Data":"5fb47ca3c1684aa42bf9837ae0f50cddc7f5aab5888864f2e0b4543c2d071232"} Apr 24 
23:53:50.191629 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:50.191583 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qq9fz" podStartSLOduration=3.485538792 podStartE2EDuration="23.191570764s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.685729443 +0000 UTC m=+3.209013649" lastFinishedPulling="2026-04-24 23:53:49.391761403 +0000 UTC m=+22.915045621" observedRunningTime="2026-04-24 23:53:50.191473323 +0000 UTC m=+23.714757547" watchObservedRunningTime="2026-04-24 23:53:50.191570764 +0000 UTC m=+23.714854966" Apr 24 23:53:51.015058 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:51.015017 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:51.015292 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:51.015177 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:52.014654 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.014422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:52.014654 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.014422 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:52.015258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:52.014726 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:52.015258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:52.014648 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:52.170143 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170116 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 24 23:53:52.170472 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"95433c2a6b55288f084e7be81b4f7345218ec65f43e5b7303162f594954103fc"} Apr 24 23:53:52.170824 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170789 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:52.170824 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170825 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:52.170993 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170837 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:52.170993 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.170927 2567 scope.go:117] "RemoveContainer" containerID="d7436439c2d96b95f3885d7ea24a211b8f99e914482f232e41d76d495e94c012" Apr 24 23:53:52.172147 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.172109 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="f26d4192fd19a618acce5f8fa4ab1cc3a10cedba0d31d713c6ea8ed217063ee4" exitCode=0 Apr 24 23:53:52.172265 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.172140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"f26d4192fd19a618acce5f8fa4ab1cc3a10cedba0d31d713c6ea8ed217063ee4"} Apr 24 23:53:52.190166 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.188030 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:52.190166 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:52.188191 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" Apr 24 23:53:53.017666 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.017641 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:53.018007 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.017741 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:53.176934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.176856 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 24 23:53:53.177225 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.177198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" event={"ID":"281d90a3-293d-45ac-8b4c-87bdc25f3882","Type":"ContainerStarted","Data":"38d0731530f5fb889792c976690d6f7c97465e3480add123493890f9a6c17129"} Apr 24 23:53:53.179145 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.179116 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="c715cc4275247e0e26d34784e5a7f092c623727262b691faf7361692473d972b" exitCode=0 Apr 24 23:53:53.179259 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.179154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"c715cc4275247e0e26d34784e5a7f092c623727262b691faf7361692473d972b"} Apr 24 23:53:53.206739 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.206692 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p" 
podStartSLOduration=9.303904444 podStartE2EDuration="26.20667638s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.688295487 +0000 UTC m=+3.211579699" lastFinishedPulling="2026-04-24 23:53:46.591067426 +0000 UTC m=+20.114351635" observedRunningTime="2026-04-24 23:53:53.206499241 +0000 UTC m=+26.729783480" watchObservedRunningTime="2026-04-24 23:53:53.20667638 +0000 UTC m=+26.729960604" Apr 24 23:53:53.670326 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.670001 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f9vfv"] Apr 24 23:53:53.670326 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.670182 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:53.670326 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.670290 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:53.673475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.673442 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6bc8h"] Apr 24 23:53:53.673629 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.673582 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:53.673711 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.673688 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:53.674174 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.674151 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7wg4q"] Apr 24 23:53:53.674281 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.674268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:53.674432 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.674398 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:53.933712 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:53.933680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:53.933838 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.933783 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:53.933924 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:53.933845 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret podName:96c2ed0b-d4e5-4737-9f3e-52e6828f930d nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.933827794 +0000 UTC m=+43.457111997 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret") pod "global-pull-secret-syncer-6bc8h" (UID: "96c2ed0b-d4e5-4737-9f3e-52e6828f930d") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:54.182996 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:54.182965 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="0f2dddb7f6a895da9e8bf46b2cedd5c94335cb3854a37aa27766b7f8caaf32d9" exitCode=0 Apr 24 23:53:54.183452 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:54.183034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"0f2dddb7f6a895da9e8bf46b2cedd5c94335cb3854a37aa27766b7f8caaf32d9"} Apr 24 23:53:55.014709 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:55.014672 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:55.014988 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:55.014673 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:55.014988 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:55.014797 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:55.014988 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:55.014865 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:56.014710 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:56.014670 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:56.015167 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:56.014788 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:57.007230 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:57.007199 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:57.007436 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:57.007335 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:53:57.008274 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:57.008251 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ckn8x" Apr 24 23:53:57.014868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:57.014847 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:57.015273 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:57.015254 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:57.015403 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:57.015381 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:57.015749 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:57.015723 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:58.014828 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:58.014801 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:53:58.014993 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:58.014920 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f9vfv" podUID="c44399ed-9019-430d-83d7-8cde0e6f0d03" Apr 24 23:53:59.015163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.015133 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:53:59.015620 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.015143 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q" Apr 24 23:53:59.015620 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:59.015250 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6bc8h" podUID="96c2ed0b-d4e5-4737-9f3e-52e6828f930d" Apr 24 23:53:59.015620 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:53:59.015317 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7wg4q" podUID="4a7b82bd-bf6c-4091-8f48-64cea3e964a8" Apr 24 23:53:59.831228 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.831200 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-201.ec2.internal" event="NodeReady" Apr 24 23:53:59.831404 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.831337 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:53:59.876472 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.876433 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"] Apr 24 23:53:59.897503 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.897470 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"] Apr 24 23:53:59.897920 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.897720 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:53:59.900623 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.900598 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 23:53:59.902528 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.902496 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:59.902653 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.902534 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-j7qq8\"" Apr 24 23:53:59.903128 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.903104 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 23:53:59.903128 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.903124 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:59.918052 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.918028 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"] Apr 24 23:53:59.918164 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.918059 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"] Apr 24 23:53:59.918164 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.918109 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"] Apr 24 23:53:59.918282 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.918226 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:53:59.922755 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.922732 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:53:59.922937 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.922916 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:53:59.923057 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.922738 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:53:59.923230 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.923210 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n2vv4\"" Apr 24 23:53:59.931698 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.931676 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:53:59.943702 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.943677 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"] Apr 24 23:53:59.943881 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.943809 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:53:59.953896 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.953874 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 23:53:59.954033 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.953880 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7xpht\"" Apr 24 23:53:59.954337 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.954321 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 23:53:59.962828 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.962809 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7gvnb"] Apr 24 23:53:59.962964 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.962948 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn" Apr 24 23:53:59.967172 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.967155 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jgrj4\"" Apr 24 23:53:59.967463 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.967445 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:53:59.967763 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.967743 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:53:59.977006 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.976986 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f025110a-88f1-46e6-9779-d4a470fe4338-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:53:59.977109 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.977063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f025110a-88f1-46e6-9779-d4a470fe4338-config\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:53:59.977109 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.977086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzx8w\" (UniqueName: \"kubernetes.io/projected/f025110a-88f1-46e6-9779-d4a470fe4338-kube-api-access-dzx8w\") pod 
\"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:53:59.981290 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.981275 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wwswl"] Apr 24 23:53:59.981435 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.981420 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gvnb" Apr 24 23:53:59.984265 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.984248 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nb5d2\"" Apr 24 23:53:59.984523 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.984506 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:53:59.984523 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.984521 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:53:59.984785 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.984767 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:53:59.998558 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.998534 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"] Apr 24 23:53:59.998558 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.998561 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"] Apr 24 23:53:59.998680 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.998573 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-wwswl"] Apr 24 23:53:59.998680 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.998581 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gvnb"] Apr 24 23:53:59.998680 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:53:59.998672 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:00.003075 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.003057 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:00.003172 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.003112 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgj74\"" Apr 24 23:54:00.003172 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.003122 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:00.014475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.014449 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:54:00.017592 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.017568 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ghb4v\"" Apr 24 23:54:00.078300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f025110a-88f1-46e6-9779-d4a470fe4338-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078310 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wqf\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078352 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7461a6a-d0ad-4be6-93d1-2197570b77bb-config-volume\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078390 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: 
\"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078430 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078481 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpftd\" (UniqueName: \"kubernetes.io/projected/c6962ead-9647-463e-ad57-5aec7076f198-kube-api-access-vpftd\") pod \"network-check-source-8894fc9bd-s5dsn\" (UID: \"c6962ead-9647-463e-ad57-5aec7076f198\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn" Apr 24 23:54:00.078687 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:00.078687 ip-10-0-135-201 kubenswrapper[2567]: I0424 
23:54:00.078554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz94w\" (UniqueName: \"kubernetes.io/projected/f7461a6a-d0ad-4be6-93d1-2197570b77bb-kube-api-access-xz94w\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:00.078687 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078687 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078614 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:00.078687 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078653 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f025110a-88f1-46e6-9779-d4a470fe4338-config\") pod 
\"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:54:00.078856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.078856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb" Apr 24 23:54:00.078856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078758 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzx8w\" (UniqueName: \"kubernetes.io/projected/f025110a-88f1-46e6-9779-d4a470fe4338-kube-api-access-dzx8w\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" Apr 24 23:54:00.078856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:00.079015 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:54:00.078872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.079015 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgrk\" (UniqueName: \"kubernetes.io/projected/6522895a-2de1-4503-81b5-929a4a7a71b2-kube-api-access-drgrk\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:00.079015 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.078931 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7461a6a-d0ad-4be6-93d1-2197570b77bb-tmp-dir\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.079202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.079186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f025110a-88f1-46e6-9779-d4a470fe4338-config\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"
Apr 24 23:54:00.082394 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.082334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f025110a-88f1-46e6-9779-d4a470fe4338-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"
Apr 24 23:54:00.090449 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.090424 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzx8w\" (UniqueName: \"kubernetes.io/projected/f025110a-88f1-46e6-9779-d4a470fe4338-kube-api-access-dzx8w\") pod \"service-ca-operator-d6fc45fc5-lh69z\" (UID: \"f025110a-88f1-46e6-9779-d4a470fe4338\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"
Apr 24 23:54:00.179677 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.179677 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179672 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drgrk\" (UniqueName: \"kubernetes.io/projected/6522895a-2de1-4503-81b5-929a4a7a71b2-kube-api-access-drgrk\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179776 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7461a6a-d0ad-4be6-93d1-2197570b77bb-tmp-dir\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.179794 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:00.179901 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.179874 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert podName:6522895a-2de1-4503-81b5-929a4a7a71b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.679852581 +0000 UTC m=+34.203136786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert") pod "ingress-canary-7gvnb" (UID: "6522895a-2de1-4503-81b5-929a4a7a71b2") : secret "canary-serving-cert" not found
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82wqf\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7461a6a-d0ad-4be6-93d1-2197570b77bb-config-volume\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.179967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180059 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpftd\" (UniqueName: \"kubernetes.io/projected/c6962ead-9647-463e-ad57-5aec7076f198-kube-api-access-vpftd\") pod \"network-check-source-8894fc9bd-s5dsn\" (UID: \"c6962ead-9647-463e-ad57-5aec7076f198\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz94w\" (UniqueName: \"kubernetes.io/projected/f7461a6a-d0ad-4be6-93d1-2197570b77bb-kube-api-access-xz94w\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180215 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7461a6a-d0ad-4be6-93d1-2197570b77bb-tmp-dir\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180277 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180332 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls podName:f7461a6a-d0ad-4be6-93d1-2197570b77bb nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.680312848 +0000 UTC m=+34.203597049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls") pod "dns-default-wwswl" (UID: "f7461a6a-d0ad-4be6-93d1-2197570b77bb") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180448 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180475 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55769f4fc8-zfbsw: secret "image-registry-tls" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180532 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.180546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7461a6a-d0ad-4be6-93d1-2197570b77bb-config-volume\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180535 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls podName:59e59786-7ad7-4994-a0db-3d510271563f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.680520479 +0000 UTC m=+34.203804694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls") pod "image-registry-55769f4fc8-zfbsw" (UID: "59e59786-7ad7-4994-a0db-3d510271563f") : secret "image-registry-tls" not found
Apr 24 23:54:00.180653 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.180586 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.680575096 +0000 UTC m=+34.203859298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:00.181132 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.181026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.181132 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.181095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"
Apr 24 23:54:00.181422 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.181402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.182938 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.182909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.183036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.182914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.190034 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.190006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.190188 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.190171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wqf\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.190241 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.190225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpftd\" (UniqueName: \"kubernetes.io/projected/c6962ead-9647-463e-ad57-5aec7076f198-kube-api-access-vpftd\") pod \"network-check-source-8894fc9bd-s5dsn\" (UID: \"c6962ead-9647-463e-ad57-5aec7076f198\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"
Apr 24 23:54:00.190286 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.190230 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz94w\" (UniqueName: \"kubernetes.io/projected/f7461a6a-d0ad-4be6-93d1-2197570b77bb-kube-api-access-xz94w\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.190286 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.190276 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgrk\" (UniqueName: \"kubernetes.io/projected/6522895a-2de1-4503-81b5-929a4a7a71b2-kube-api-access-drgrk\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:00.208055 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.208022 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"
Apr 24 23:54:00.271284 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.271256 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"
Apr 24 23:54:00.449461 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.449213 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn"]
Apr 24 23:54:00.453640 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:00.453609 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6962ead_9647_463e_ad57_5aec7076f198.slice/crio-1cfed0cf2b7ae425f7d9a45eea37da555b2d84b5255c29d9785fdaeb8f6f084d WatchSource:0}: Error finding container 1cfed0cf2b7ae425f7d9a45eea37da555b2d84b5255c29d9785fdaeb8f6f084d: Status 404 returned error can't find the container with id 1cfed0cf2b7ae425f7d9a45eea37da555b2d84b5255c29d9785fdaeb8f6f084d
Apr 24 23:54:00.458769 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.458745 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z"]
Apr 24 23:54:00.472252 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:00.472228 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf025110a_88f1_46e6_9779_d4a470fe4338.slice/crio-c6889f99482e58b3c8ed6810cc9f13ea82b4ac096a80ae41ad681335537c9e43 WatchSource:0}: Error finding container c6889f99482e58b3c8ed6810cc9f13ea82b4ac096a80ae41ad681335537c9e43: Status 404 returned error can't find the container with id c6889f99482e58b3c8ed6810cc9f13ea82b4ac096a80ae41ad681335537c9e43
Apr 24 23:54:00.684801 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.684766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:54:00.684801 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.684807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.684846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.684875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.684893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684925 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684976 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:00.684984 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684980 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684990 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert podName:6522895a-2de1-4503-81b5-929a4a7a71b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.684974799 +0000 UTC m=+35.208259001 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert") pod "ingress-canary-7gvnb" (UID: "6522895a-2de1-4503-81b5-929a4a7a71b2") : secret "canary-serving-cert" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684993 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684991 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55769f4fc8-zfbsw: secret "image-registry-tls" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.685030 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls podName:f7461a6a-d0ad-4be6-93d1-2197570b77bb nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.685014874 +0000 UTC m=+35.208299093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls") pod "dns-default-wwswl" (UID: "f7461a6a-d0ad-4be6-93d1-2197570b77bb") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.684927 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.685047 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.685037789 +0000 UTC m=+35.208321991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.685062 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls podName:59e59786-7ad7-4994-a0db-3d510271563f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.685053894 +0000 UTC m=+35.208338102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls") pod "image-registry-55769f4fc8-zfbsw" (UID: "59e59786-7ad7-4994-a0db-3d510271563f") : secret "image-registry-tls" not found
Apr 24 23:54:00.685166 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:00.685075 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs podName:4a7b82bd-bf6c-4091-8f48-64cea3e964a8 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:32.685067889 +0000 UTC m=+66.208352093 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs") pod "network-metrics-daemon-7wg4q" (UID: "4a7b82bd-bf6c-4091-8f48-64cea3e964a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:00.887513 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.887435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:54:00.891317 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.891290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwnq\" (UniqueName: \"kubernetes.io/projected/c44399ed-9019-430d-83d7-8cde0e6f0d03-kube-api-access-ljwnq\") pod \"network-check-target-f9vfv\" (UID: \"c44399ed-9019-430d-83d7-8cde0e6f0d03\") " pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:54:00.925176 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:00.925149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:54:01.018415 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.018387 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:54:01.019083 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.018392 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h"
Apr 24 23:54:01.021725 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.021702 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:54:01.021871 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.021792 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\""
Apr 24 23:54:01.021871 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.021805 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:01.067403 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.067353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f9vfv"]
Apr 24 23:54:01.071990 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:01.071948 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44399ed_9019_430d_83d7_8cde0e6f0d03.slice/crio-bcfff074c1ba8090e7f31412e79dad4dc2471e2d224b97622081ab3ffa802bc4 WatchSource:0}: Error finding container bcfff074c1ba8090e7f31412e79dad4dc2471e2d224b97622081ab3ffa802bc4: Status 404 returned error can't find the container with id bcfff074c1ba8090e7f31412e79dad4dc2471e2d224b97622081ab3ffa802bc4
Apr 24 23:54:01.197711 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.197675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn" event={"ID":"c6962ead-9647-463e-ad57-5aec7076f198","Type":"ContainerStarted","Data":"1cfed0cf2b7ae425f7d9a45eea37da555b2d84b5255c29d9785fdaeb8f6f084d"}
Apr 24 23:54:01.200435 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.200407 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="31ce42e1b5763a681fabee5c441af1d19df18df6d49cbde55acf8df6969b67e3" exitCode=0
Apr 24 23:54:01.200579 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.200473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"31ce42e1b5763a681fabee5c441af1d19df18df6d49cbde55acf8df6969b67e3"}
Apr 24 23:54:01.201714 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.201692 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f9vfv" event={"ID":"c44399ed-9019-430d-83d7-8cde0e6f0d03","Type":"ContainerStarted","Data":"bcfff074c1ba8090e7f31412e79dad4dc2471e2d224b97622081ab3ffa802bc4"}
Apr 24 23:54:01.203002 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.202974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" event={"ID":"f025110a-88f1-46e6-9779-d4a470fe4338","Type":"ContainerStarted","Data":"c6889f99482e58b3c8ed6810cc9f13ea82b4ac096a80ae41ad681335537c9e43"}
Apr 24 23:54:01.694740 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.694700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"
Apr 24 23:54:01.694918 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.694760 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:01.694918 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.694791 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:01.694918 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:01.694851 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:01.694918 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694885 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694933 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694959 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.694939455 +0000 UTC m=+37.218223664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694966 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694990 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55769f4fc8-zfbsw: secret "image-registry-tls" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694997 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.694979 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert podName:6522895a-2de1-4503-81b5-929a4a7a71b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.694969222 +0000 UTC m=+37.218253424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert") pod "ingress-canary-7gvnb" (UID: "6522895a-2de1-4503-81b5-929a4a7a71b2") : secret "canary-serving-cert" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.695059 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls podName:f7461a6a-d0ad-4be6-93d1-2197570b77bb nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.695041823 +0000 UTC m=+37.218326025 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls") pod "dns-default-wwswl" (UID: "f7461a6a-d0ad-4be6-93d1-2197570b77bb") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:01.695127 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:01.695073 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls podName:59e59786-7ad7-4994-a0db-3d510271563f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.695066049 +0000 UTC m=+37.218350251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls") pod "image-registry-55769f4fc8-zfbsw" (UID: "59e59786-7ad7-4994-a0db-3d510271563f") : secret "image-registry-tls" not found
Apr 24 23:54:02.209897 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:02.209781 2567 generic.go:358] "Generic (PLEG): container finished" podID="1134624f-34d1-4f6e-8821-435df2b54c9b" containerID="463bec6516d4fcfe5230fcb9eee455a6e0a199603a0cb4d6703713bf98dc496a" exitCode=0
Apr 24 23:54:02.209897 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:02.209852 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerDied","Data":"463bec6516d4fcfe5230fcb9eee455a6e0a199603a0cb4d6703713bf98dc496a"}
Apr 24 23:54:03.215747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.215526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" event={"ID":"1134624f-34d1-4f6e-8821-435df2b54c9b","Type":"ContainerStarted","Data":"d21a67510558ee8d4398831623e8406a25fee3192852c2c5f8ec056a22b06691"}
Apr 24 23:54:03.239376 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.239307 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mmdsm" podStartSLOduration=5.644519269 podStartE2EDuration="36.239288282s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:53:29.684587881 +0000 UTC m=+3.207872086" lastFinishedPulling="2026-04-24 23:54:00.279356889 +0000 UTC m=+33.802641099" observedRunningTime="2026-04-24 23:54:03.237718249 +0000 UTC m=+36.761002472" watchObservedRunningTime="2026-04-24 23:54:03.239288282 +0000 UTC m=+36.762572538"
Apr 24 23:54:03.714049 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.714007 2567
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.714069 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.714097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:03.714159 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb" Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714173 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714195 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-55769f4fc8-zfbsw: secret "image-registry-tls" not found Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714171 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:03.714258 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714258 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls podName:59e59786-7ad7-4994-a0db-3d510271563f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.714237552 +0000 UTC m=+41.237521773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls") pod "image-registry-55769f4fc8-zfbsw" (UID: "59e59786-7ad7-4994-a0db-3d510271563f") : secret "image-registry-tls" not found Apr 24 23:54:03.714601 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714265 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:03.714601 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714246 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:03.714601 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714309 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert podName:6522895a-2de1-4503-81b5-929a4a7a71b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.714293843 +0000 UTC m=+41.237578062 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert") pod "ingress-canary-7gvnb" (UID: "6522895a-2de1-4503-81b5-929a4a7a71b2") : secret "canary-serving-cert" not found Apr 24 23:54:03.714601 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714327 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.714317686 +0000 UTC m=+41.237601895 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found Apr 24 23:54:03.714601 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:03.714344 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls podName:f7461a6a-d0ad-4be6-93d1-2197570b77bb nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.714337275 +0000 UTC m=+41.237621476 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls") pod "dns-default-wwswl" (UID: "f7461a6a-d0ad-4be6-93d1-2197570b77bb") : secret "dns-default-metrics-tls" not found Apr 24 23:54:05.221121 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.221082 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f9vfv" event={"ID":"c44399ed-9019-430d-83d7-8cde0e6f0d03","Type":"ContainerStarted","Data":"86e792a458e72ce51b0be794584a4832415ffb71bd7b94ef80ffa498cf2587cb"} Apr 24 23:54:05.221981 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.221236 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-f9vfv" Apr 24 23:54:05.222444 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.222423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" event={"ID":"f025110a-88f1-46e6-9779-d4a470fe4338","Type":"ContainerStarted","Data":"35abab28105bfc97f3abd414f71c83a2be48789ed71fd12313edb9864349466f"} Apr 24 23:54:05.223686 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.223665 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn" event={"ID":"c6962ead-9647-463e-ad57-5aec7076f198","Type":"ContainerStarted","Data":"efaa4577ba90e51484ee27b0d2a1e00fa1b6e1374d8ec342f0f7e75a2eb8e32d"} Apr 24 23:54:05.237413 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.237357 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-f9vfv" podStartSLOduration=35.051681844 podStartE2EDuration="38.237346021s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:54:01.074424239 +0000 UTC m=+34.597708448" 
lastFinishedPulling="2026-04-24 23:54:04.260088424 +0000 UTC m=+37.783372625" observedRunningTime="2026-04-24 23:54:05.236879876 +0000 UTC m=+38.760164100" watchObservedRunningTime="2026-04-24 23:54:05.237346021 +0000 UTC m=+38.760630244" Apr 24 23:54:05.254019 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.253972 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-s5dsn" podStartSLOduration=33.458008704 podStartE2EDuration="37.253958346s" podCreationTimestamp="2026-04-24 23:53:28 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.45564823 +0000 UTC m=+33.978932432" lastFinishedPulling="2026-04-24 23:54:04.251597872 +0000 UTC m=+37.774882074" observedRunningTime="2026-04-24 23:54:05.253114255 +0000 UTC m=+38.776398480" watchObservedRunningTime="2026-04-24 23:54:05.253958346 +0000 UTC m=+38.777242569" Apr 24 23:54:05.267343 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:05.267294 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" podStartSLOduration=34.490391956 podStartE2EDuration="38.267278851s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:54:00.474242696 +0000 UTC m=+33.997526898" lastFinishedPulling="2026-04-24 23:54:04.251129576 +0000 UTC m=+37.774413793" observedRunningTime="2026-04-24 23:54:05.26685409 +0000 UTC m=+38.790138315" watchObservedRunningTime="2026-04-24 23:54:05.267278851 +0000 UTC m=+38.790563074" Apr 24 23:54:07.747569 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.747533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb" Apr 24 23:54:07.748049 ip-10-0-135-201 
kubenswrapper[2567]: I0424 23:54:07.747589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747675 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747693 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.747713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747723 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.747708644 +0000 UTC m=+49.270992845 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747755 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert podName:6522895a-2de1-4503-81b5-929a4a7a71b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.747738485 +0000 UTC m=+49.271022688 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert") pod "ingress-canary-7gvnb" (UID: "6522895a-2de1-4503-81b5-929a4a7a71b2") : secret "canary-serving-cert" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747785 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747794 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55769f4fc8-zfbsw: secret "image-registry-tls" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.747792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747824 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls podName:59e59786-7ad7-4994-a0db-3d510271563f nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.747814135 +0000 UTC m=+49.271098337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls") pod "image-registry-55769f4fc8-zfbsw" (UID: "59e59786-7ad7-4994-a0db-3d510271563f") : secret "image-registry-tls" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747865 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:07.748049 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:07.747903 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls podName:f7461a6a-d0ad-4be6-93d1-2197570b77bb nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.747893028 +0000 UTC m=+49.271177229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls") pod "dns-default-wwswl" (UID: "f7461a6a-d0ad-4be6-93d1-2197570b77bb") : secret "dns-default-metrics-tls" not found Apr 24 23:54:07.819290 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.819258 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jjqj4"] Apr 24 23:54:07.825199 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.825178 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:07.827894 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.827864 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 23:54:07.828007 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.827948 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 23:54:07.829002 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.828983 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 23:54:07.829115 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.829019 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 23:54:07.829115 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.828988 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8dkpx\"" Apr 24 23:54:07.831709 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.831690 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jjqj4"] Apr 24 23:54:07.948732 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.948689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-key\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:07.948889 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.948750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpp94\" (UniqueName: 
\"kubernetes.io/projected/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-kube-api-access-gpp94\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:07.948889 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:07.948827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-cabundle\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.049440 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.049320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-key\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.049440 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.049396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpp94\" (UniqueName: \"kubernetes.io/projected/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-kube-api-access-gpp94\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.049440 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.049440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-cabundle\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.050686 ip-10-0-135-201 kubenswrapper[2567]: I0424 
23:54:08.050655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-cabundle\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.053592 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.053572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-signing-key\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.058739 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.058720 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpp94\" (UniqueName: \"kubernetes.io/projected/b64e7123-e174-4e34-bcc3-92c4c5ae53eb-kube-api-access-gpp94\") pod \"service-ca-865cb79987-jjqj4\" (UID: \"b64e7123-e174-4e34-bcc3-92c4c5ae53eb\") " pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.135652 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.135618 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jjqj4" Apr 24 23:54:08.252595 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:08.252469 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jjqj4"] Apr 24 23:54:08.254952 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:08.254925 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb64e7123_e174_4e34_bcc3_92c4c5ae53eb.slice/crio-1cc979ffb136ac6a8a6437c53740c69bfa9569f1d798efbdbafc68830413ed79 WatchSource:0}: Error finding container 1cc979ffb136ac6a8a6437c53740c69bfa9569f1d798efbdbafc68830413ed79: Status 404 returned error can't find the container with id 1cc979ffb136ac6a8a6437c53740c69bfa9569f1d798efbdbafc68830413ed79 Apr 24 23:54:09.233860 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:09.233818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jjqj4" event={"ID":"b64e7123-e174-4e34-bcc3-92c4c5ae53eb","Type":"ContainerStarted","Data":"5d651859c6ee3d72f7ed2d633c5b27ac06eec187a4d936e79f9637a6c5a390d6"} Apr 24 23:54:09.233860 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:09.233854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jjqj4" event={"ID":"b64e7123-e174-4e34-bcc3-92c4c5ae53eb","Type":"ContainerStarted","Data":"1cc979ffb136ac6a8a6437c53740c69bfa9569f1d798efbdbafc68830413ed79"} Apr 24 23:54:09.252535 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:09.252489 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jjqj4" podStartSLOduration=2.2524737679999998 podStartE2EDuration="2.252473768s" podCreationTimestamp="2026-04-24 23:54:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 
23:54:09.251420159 +0000 UTC m=+42.774704384" watchObservedRunningTime="2026-04-24 23:54:09.252473768 +0000 UTC m=+42.775757993" Apr 24 23:54:09.965830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:09.965785 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:54:09.968108 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:09.968090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/96c2ed0b-d4e5-4737-9f3e-52e6828f930d-original-pull-secret\") pod \"global-pull-secret-syncer-6bc8h\" (UID: \"96c2ed0b-d4e5-4737-9f3e-52e6828f930d\") " pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:54:10.037914 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:10.037877 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6bc8h" Apr 24 23:54:10.174716 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:10.174689 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6bc8h"] Apr 24 23:54:10.245038 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:10.244932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6bc8h" event={"ID":"96c2ed0b-d4e5-4737-9f3e-52e6828f930d","Type":"ContainerStarted","Data":"e96a1608b17ea9e766a8c2650458422645be73778a642904079d8a6f21c78e3b"} Apr 24 23:54:15.256720 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.256680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6bc8h" event={"ID":"96c2ed0b-d4e5-4737-9f3e-52e6828f930d","Type":"ContainerStarted","Data":"5190f64c62976b7ea1a46ffb10776d014815acd66bee0604c7ac9f5ec891d0ec"} Apr 24 23:54:15.271528 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.271476 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6bc8h" podStartSLOduration=32.901365651 podStartE2EDuration="37.27146008s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:54:10.183459305 +0000 UTC m=+43.706743515" lastFinishedPulling="2026-04-24 23:54:14.553553728 +0000 UTC m=+48.076837944" observedRunningTime="2026-04-24 23:54:15.271029028 +0000 UTC m=+48.794313251" watchObservedRunningTime="2026-04-24 23:54:15.27146008 +0000 UTC m=+48.794744304" Apr 24 23:54:15.826674 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.826642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " 
pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:54:15.826674 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.826678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl" Apr 24 23:54:15.826968 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.826730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb" Apr 24 23:54:15.826968 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.826786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:15.826968 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:15.826869 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 23:54:15.826968 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:54:15.826937 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert podName:a94b1d1a-a27a-4b4f-8bec-ad4468a49f04 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:31.826919193 +0000 UTC m=+65.350203398 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-7p88r" (UID: "a94b1d1a-a27a-4b4f-8bec-ad4468a49f04") : secret "networking-console-plugin-cert" not found
Apr 24 23:54:15.829207 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.829177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f7461a6a-d0ad-4be6-93d1-2197570b77bb-metrics-tls\") pod \"dns-default-wwswl\" (UID: \"f7461a6a-d0ad-4be6-93d1-2197570b77bb\") " pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:15.829316 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.829189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6522895a-2de1-4503-81b5-929a4a7a71b2-cert\") pod \"ingress-canary-7gvnb\" (UID: \"6522895a-2de1-4503-81b5-929a4a7a71b2\") " pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:15.829316 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.829189 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"image-registry-55769f4fc8-zfbsw\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:15.889109 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.889077 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7gvnb"
Apr 24 23:54:15.906866 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:15.906840 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:16.021988 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.021952 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7gvnb"]
Apr 24 23:54:16.036906 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:16.036876 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6522895a_2de1_4503_81b5_929a4a7a71b2.slice/crio-c88b042de3c276487740f40f690bc315b165884a8d973b1ce1634185d5fa2f67 WatchSource:0}: Error finding container c88b042de3c276487740f40f690bc315b165884a8d973b1ce1634185d5fa2f67: Status 404 returned error can't find the container with id c88b042de3c276487740f40f690bc315b165884a8d973b1ce1634185d5fa2f67
Apr 24 23:54:16.040202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.040180 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwswl"]
Apr 24 23:54:16.043445 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:16.043424 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7461a6a_d0ad_4be6_93d1_2197570b77bb.slice/crio-ff62e291c505f4aa99c6bc8c58d267ad907e39c8659d20c54affec4739687aae WatchSource:0}: Error finding container ff62e291c505f4aa99c6bc8c58d267ad907e39c8659d20c54affec4739687aae: Status 404 returned error can't find the container with id ff62e291c505f4aa99c6bc8c58d267ad907e39c8659d20c54affec4739687aae
Apr 24 23:54:16.128377 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.128279 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:16.262329 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.262287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwswl" event={"ID":"f7461a6a-d0ad-4be6-93d1-2197570b77bb","Type":"ContainerStarted","Data":"ff62e291c505f4aa99c6bc8c58d267ad907e39c8659d20c54affec4739687aae"}
Apr 24 23:54:16.262919 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.262881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"]
Apr 24 23:54:16.263894 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:16.263866 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gvnb" event={"ID":"6522895a-2de1-4503-81b5-929a4a7a71b2","Type":"ContainerStarted","Data":"c88b042de3c276487740f40f690bc315b165884a8d973b1ce1634185d5fa2f67"}
Apr 24 23:54:16.269939 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:16.269904 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e59786_7ad7_4994_a0db_3d510271563f.slice/crio-889beb5e46731e313aaa42b38dd2277193b8d466a695b394363695e2cf7177bc WatchSource:0}: Error finding container 889beb5e46731e313aaa42b38dd2277193b8d466a695b394363695e2cf7177bc: Status 404 returned error can't find the container with id 889beb5e46731e313aaa42b38dd2277193b8d466a695b394363695e2cf7177bc
Apr 24 23:54:17.268830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:17.268783 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" event={"ID":"59e59786-7ad7-4994-a0db-3d510271563f","Type":"ContainerStarted","Data":"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f"}
Apr 24 23:54:17.268830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:17.268826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" event={"ID":"59e59786-7ad7-4994-a0db-3d510271563f","Type":"ContainerStarted","Data":"889beb5e46731e313aaa42b38dd2277193b8d466a695b394363695e2cf7177bc"}
Apr 24 23:54:17.269323 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:17.269038 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:17.292291 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:17.291650 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" podStartSLOduration=50.291629616 podStartE2EDuration="50.291629616s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:17.290686754 +0000 UTC m=+50.813970978" watchObservedRunningTime="2026-04-24 23:54:17.291629616 +0000 UTC m=+50.814913841"
Apr 24 23:54:19.279934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.279896 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwswl" event={"ID":"f7461a6a-d0ad-4be6-93d1-2197570b77bb","Type":"ContainerStarted","Data":"bf74b98666f83948a4ff82d17b5bcd744974185bdbc3cfa8389d8dce5f6069d9"}
Apr 24 23:54:19.279934 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.279935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwswl" event={"ID":"f7461a6a-d0ad-4be6-93d1-2197570b77bb","Type":"ContainerStarted","Data":"ace4f51e9dea31859e31ec50f48e820c7b135e007a619d8b2a285c6b0751197f"}
Apr 24 23:54:19.280449 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.280092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:19.282983 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.282939 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7gvnb" event={"ID":"6522895a-2de1-4503-81b5-929a4a7a71b2","Type":"ContainerStarted","Data":"8c65d6380426a891af70c7ba5d7f5238f572c867f7577aa2b43ebaf711b9a94b"}
Apr 24 23:54:19.300314 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.300259 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wwswl" podStartSLOduration=17.721941995 podStartE2EDuration="20.300245242s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:16.045542133 +0000 UTC m=+49.568826336" lastFinishedPulling="2026-04-24 23:54:18.623845378 +0000 UTC m=+52.147129583" observedRunningTime="2026-04-24 23:54:19.299605507 +0000 UTC m=+52.822889731" watchObservedRunningTime="2026-04-24 23:54:19.300245242 +0000 UTC m=+52.823529465"
Apr 24 23:54:19.317334 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:19.317279 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7gvnb" podStartSLOduration=17.7272506 podStartE2EDuration="20.31726497s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:16.039020236 +0000 UTC m=+49.562304439" lastFinishedPulling="2026-04-24 23:54:18.629034598 +0000 UTC m=+52.152318809" observedRunningTime="2026-04-24 23:54:19.316953814 +0000 UTC m=+52.840238038" watchObservedRunningTime="2026-04-24 23:54:19.31726497 +0000 UTC m=+52.840549194"
Apr 24 23:54:24.193578 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:24.193548 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7hz6p"
Apr 24 23:54:29.288666 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:29.288636 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wwswl"
Apr 24 23:54:30.579030 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.578995 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"]
Apr 24 23:54:30.616522 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.616489 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"]
Apr 24 23:54:30.616678 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.616653 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.620064 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.620036 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-rmqww\""
Apr 24 23:54:30.621078 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.621060 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 23:54:30.621201 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.621159 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.621261 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.621221 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.637342 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.637315 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"]
Apr 24 23:54:30.637342 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.637342 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"]
Apr 24 23:54:30.637532 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.637447 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"
Apr 24 23:54:30.641516 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.641490 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-hv8kk\""
Apr 24 23:54:30.641685 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.641668 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.641821 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.641803 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.688833 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.688800 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"]
Apr 24 23:54:30.715289 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.715252 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vbnzj"]
Apr 24 23:54:30.715466 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.715446 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"
Apr 24 23:54:30.718320 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.718300 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.719499 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.719473 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.719606 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.719507 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 23:54:30.719606 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.719475 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-pww9j\""
Apr 24 23:54:30.719606 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.719575 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 23:54:30.727953 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.727926 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-546cdcb66d-p7rsx"]
Apr 24 23:54:30.728097 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.728079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:30.731631 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.731614 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-b6qbf\""
Apr 24 23:54:30.731974 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.731942 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.732079 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.731981 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 23:54:30.732079 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.732009 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 23:54:30.732917 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.732901 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.736035 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.736013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hkns\" (UniqueName: \"kubernetes.io/projected/6e0c7dfd-78db-4373-bdab-9fddbacaac5d-kube-api-access-9hkns\") pod \"volume-data-source-validator-7c6cbb6c87-j5w2w\" (UID: \"6e0c7dfd-78db-4373-bdab-9fddbacaac5d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"
Apr 24 23:54:30.736143 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.736047 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b593ed-1f3d-4acb-aab4-c488e413a3cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.736143 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.736086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwml7\" (UniqueName: \"kubernetes.io/projected/85b593ed-1f3d-4acb-aab4-c488e413a3cc-kube-api-access-wwml7\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.738900 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.738867 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 23:54:30.740901 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.740886 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"]
Apr 24 23:54:30.741051 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.741039 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.743555 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.743536 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 23:54:30.743852 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.743834 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 23:54:30.743930 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.743870 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 23:54:30.743930 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.743891 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-zqnp5\""
Apr 24 23:54:30.744036 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.743926 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.746232 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.746215 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 23:54:30.746301 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.746245 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.762355 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.762329 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lrmcq"]
Apr 24 23:54:30.762501 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.762469 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"
Apr 24 23:54:30.765380 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.765288 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 23:54:30.765380 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.765334 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 23:54:30.765598 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.765554 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.765598 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.765589 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.765868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.765851 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-h8nlp\""
Apr 24 23:54:30.780789 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780764 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"]
Apr 24 23:54:30.780789 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780789 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vbnzj"]
Apr 24 23:54:30.780928 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780797 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"]
Apr 24 23:54:30.780928 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780807 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-546cdcb66d-p7rsx"]
Apr 24 23:54:30.780928 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780815 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lrmcq"]
Apr 24 23:54:30.780928 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.780897 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.784113 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.784083 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 23:54:30.784242 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.784232 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 23:54:30.784299 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.784237 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 23:54:30.785107 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.785086 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-fkndn\""
Apr 24 23:54:30.785189 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.785123 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 23:54:30.788556 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.788536 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 23:54:30.837019 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.836933 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cfc1266-6758-44b5-9cbf-386bf602a3fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"
Apr 24 23:54:30.837019 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.836965 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-serving-cert\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.837019 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-stats-auth\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837075 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8g97\" (UniqueName: \"kubernetes.io/projected/14d409e9-bd22-416d-934a-018d672f2b6b-kube-api-access-k8g97\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837155 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/14d409e9-bd22-416d-934a-018d672f2b6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837181 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-trusted-ca\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b593ed-1f3d-4acb-aab4-c488e413a3cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-config\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837261 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cfc1266-6758-44b5-9cbf-386bf602a3fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"
Apr 24 23:54:30.837300 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-metrics-certs\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837323 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwml7\" (UniqueName: \"kubernetes.io/projected/85b593ed-1f3d-4acb-aab4-c488e413a3cc-kube-api-access-wwml7\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-snapshots\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr4m5\" (UniqueName: \"kubernetes.io/projected/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-kube-api-access-xr4m5\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-serving-cert\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837576 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-default-certificate\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-service-ca-bundle\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hkns\" (UniqueName: \"kubernetes.io/projected/6e0c7dfd-78db-4373-bdab-9fddbacaac5d-kube-api-access-9hkns\") pod \"volume-data-source-validator-7c6cbb6c87-j5w2w\" (UID: \"6e0c7dfd-78db-4373-bdab-9fddbacaac5d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"
Apr 24 23:54:30.837747 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-tmp\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.838046 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsg6c\" (UniqueName: \"kubernetes.io/projected/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-kube-api-access-nsg6c\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.838046 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837796 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vml\" (UniqueName: \"kubernetes.io/projected/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-kube-api-access-p5vml\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.838046 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pstp\" (UniqueName: \"kubernetes.io/projected/9cfc1266-6758-44b5-9cbf-386bf602a3fc-kube-api-access-7pstp\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"
Apr 24 23:54:30.838046 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.837834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/14d409e9-bd22-416d-934a-018d672f2b6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"
Apr 24 23:54:30.839863 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.839841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/85b593ed-1f3d-4acb-aab4-c488e413a3cc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.866100 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.866063 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hkns\" (UniqueName: \"kubernetes.io/projected/6e0c7dfd-78db-4373-bdab-9fddbacaac5d-kube-api-access-9hkns\") pod \"volume-data-source-validator-7c6cbb6c87-j5w2w\" (UID: \"6e0c7dfd-78db-4373-bdab-9fddbacaac5d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"
Apr 24 23:54:30.868625 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.868600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwml7\" (UniqueName: \"kubernetes.io/projected/85b593ed-1f3d-4acb-aab4-c488e413a3cc-kube-api-access-wwml7\") pod \"cluster-samples-operator-6dc5bdb6b4-ss7f9\" (UID: \"85b593ed-1f3d-4acb-aab4-c488e413a3cc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.924853 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.924821 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"
Apr 24 23:54:30.938520 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.938630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-tmp\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.938630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938556 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsg6c\" (UniqueName: \"kubernetes.io/projected/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-kube-api-access-nsg6c\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq"
Apr 24 23:54:30.938630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938582 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vml\" (UniqueName: \"kubernetes.io/projected/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-kube-api-access-p5vml\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:30.938630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pstp\" (UniqueName: \"kubernetes.io/projected/9cfc1266-6758-44b5-9cbf-386bf602a3fc-kube-api-access-7pstp\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"
Apr 24 23:54:30.938859 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/14d409e9-bd22-416d-934a-018d672f2b6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"
Apr 24 23:54:30.938938 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-tmp\") pod \"insights-operator-585dfdc468-lrmcq\" 
(UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.938938 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cfc1266-6758-44b5-9cbf-386bf602a3fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:30.939044 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-serving-cert\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.939044 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.938990 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-stats-auth\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.939044 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939018 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939045 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k8g97\" (UniqueName: \"kubernetes.io/projected/14d409e9-bd22-416d-934a-018d672f2b6b-kube-api-access-k8g97\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/14d409e9-bd22-416d-934a-018d672f2b6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939103 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-trusted-ca\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-config\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cfc1266-6758-44b5-9cbf-386bf602a3fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: 
\"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-metrics-certs\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.939240 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-snapshots\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.939596 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr4m5\" (UniqueName: \"kubernetes.io/projected/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-kube-api-access-xr4m5\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.939596 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939297 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-serving-cert\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.939596 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939336 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-default-certificate\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.939596 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-service-ca-bundle\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.939798 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.939617 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.940341 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.940012 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/14d409e9-bd22-416d-934a-018d672f2b6b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:30.940341 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.940104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-service-ca-bundle\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " 
pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.940341 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.940196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-service-ca-bundle\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.940736 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.940710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-snapshots\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.941845 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.941822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-trusted-ca\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.941953 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.941823 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-config\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.943662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.942777 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-serving-cert\") pod 
\"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.943662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.943219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-metrics-certs\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.943662 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.943621 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cfc1266-6758-44b5-9cbf-386bf602a3fc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:30.943858 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.943691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-default-certificate\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.943917 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.943879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-stats-auth\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.944352 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.944328 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/14d409e9-bd22-416d-934a-018d672f2b6b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:30.944702 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.944687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-serving-cert\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.946246 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.946233 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w" Apr 24 23:54:30.948637 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.948618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsg6c\" (UniqueName: \"kubernetes.io/projected/10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd-kube-api-access-nsg6c\") pod \"insights-operator-585dfdc468-lrmcq\" (UID: \"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd\") " pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:30.949059 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.949014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vml\" (UniqueName: \"kubernetes.io/projected/3ac1b4af-a2a2-4bcf-b2ea-9456a735084d-kube-api-access-p5vml\") pod \"router-default-546cdcb66d-p7rsx\" (UID: \"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d\") " pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:30.951142 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.951123 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k8g97\" (UniqueName: \"kubernetes.io/projected/14d409e9-bd22-416d-934a-018d672f2b6b-kube-api-access-k8g97\") pod \"cluster-monitoring-operator-75587bd455-ghjcw\" (UID: \"14d409e9-bd22-416d-934a-018d672f2b6b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:30.953275 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.953254 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr4m5\" (UniqueName: \"kubernetes.io/projected/a707c1f4-d5ea-444b-9ab4-37d50135c3c4-kube-api-access-xr4m5\") pod \"console-operator-9d4b6777b-vbnzj\" (UID: \"a707c1f4-d5ea-444b-9ab4-37d50135c3c4\") " pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:30.953712 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.953690 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cfc1266-6758-44b5-9cbf-386bf602a3fc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:30.955381 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:30.955344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pstp\" (UniqueName: \"kubernetes.io/projected/9cfc1266-6758-44b5-9cbf-386bf602a3fc-kube-api-access-7pstp\") pod \"kube-storage-version-migrator-operator-6769c5d45-xf6dv\" (UID: \"9cfc1266-6758-44b5-9cbf-386bf602a3fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:31.025256 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.025222 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" Apr 24 23:54:31.040132 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.039551 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" Apr 24 23:54:31.055438 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.054736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-546cdcb66d-p7rsx" Apr 24 23:54:31.071691 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.071658 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" Apr 24 23:54:31.089653 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.088072 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9"] Apr 24 23:54:31.089933 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.089654 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-lrmcq" Apr 24 23:54:31.113597 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.113525 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w"] Apr 24 23:54:31.120106 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.118941 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0c7dfd_78db_4373_bdab_9fddbacaac5d.slice/crio-acc8fc3a7793e865f92f28b42fd421144eb1da64f69bdf6e630205c13f42fbdc WatchSource:0}: Error finding container acc8fc3a7793e865f92f28b42fd421144eb1da64f69bdf6e630205c13f42fbdc: Status 404 returned error can't find the container with id acc8fc3a7793e865f92f28b42fd421144eb1da64f69bdf6e630205c13f42fbdc Apr 24 23:54:31.239549 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.239432 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw"] Apr 24 23:54:31.243517 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.243488 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d409e9_bd22_416d_934a_018d672f2b6b.slice/crio-983e8f13b829dbe865fbd25924abde7a6a0b9d8d92990d425a5948aa5c803021 WatchSource:0}: Error finding container 983e8f13b829dbe865fbd25924abde7a6a0b9d8d92990d425a5948aa5c803021: Status 404 returned error can't find the container with id 983e8f13b829dbe865fbd25924abde7a6a0b9d8d92990d425a5948aa5c803021 Apr 24 23:54:31.263508 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.263476 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vbnzj"] Apr 24 23:54:31.266790 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.266761 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda707c1f4_d5ea_444b_9ab4_37d50135c3c4.slice/crio-91d6baf514f51199306a8b6a4f85870771c38dbed5808c5672e9dcac18f579f5 WatchSource:0}: Error finding container 91d6baf514f51199306a8b6a4f85870771c38dbed5808c5672e9dcac18f579f5: Status 404 returned error can't find the container with id 91d6baf514f51199306a8b6a4f85870771c38dbed5808c5672e9dcac18f579f5 Apr 24 23:54:31.306535 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.306501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-lrmcq"] Apr 24 23:54:31.309523 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.309496 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b8e145_3ffa_4b6b_bc9e_ca257d2ed1bd.slice/crio-a88bd4abe2882b67cc5e8fd8495590a3a3dee2b2c7e21578216bf72c9560afb1 WatchSource:0}: Error finding container a88bd4abe2882b67cc5e8fd8495590a3a3dee2b2c7e21578216bf72c9560afb1: Status 404 returned error can't find the container with id a88bd4abe2882b67cc5e8fd8495590a3a3dee2b2c7e21578216bf72c9560afb1 Apr 24 23:54:31.315144 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.315064 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w" event={"ID":"6e0c7dfd-78db-4373-bdab-9fddbacaac5d","Type":"ContainerStarted","Data":"acc8fc3a7793e865f92f28b42fd421144eb1da64f69bdf6e630205c13f42fbdc"} Apr 24 23:54:31.316090 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.316068 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" event={"ID":"14d409e9-bd22-416d-934a-018d672f2b6b","Type":"ContainerStarted","Data":"983e8f13b829dbe865fbd25924abde7a6a0b9d8d92990d425a5948aa5c803021"} Apr 24 23:54:31.317041 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.317019 2567 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9" event={"ID":"85b593ed-1f3d-4acb-aab4-c488e413a3cc","Type":"ContainerStarted","Data":"be7124541f8d4c1a8eccac3b34953fee6f12cf4eff0a3387b11483efa92c2bef"} Apr 24 23:54:31.318011 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.317993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" event={"ID":"a707c1f4-d5ea-444b-9ab4-37d50135c3c4","Type":"ContainerStarted","Data":"91d6baf514f51199306a8b6a4f85870771c38dbed5808c5672e9dcac18f579f5"} Apr 24 23:54:31.477714 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.477682 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv"] Apr 24 23:54:31.480318 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.480291 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfc1266_6758_44b5_9cbf_386bf602a3fc.slice/crio-17197e17a180664e93ec25175b0d19f8381893f77d8ab09fe226ef09fa8e7997 WatchSource:0}: Error finding container 17197e17a180664e93ec25175b0d19f8381893f77d8ab09fe226ef09fa8e7997: Status 404 returned error can't find the container with id 17197e17a180664e93ec25175b0d19f8381893f77d8ab09fe226ef09fa8e7997 Apr 24 23:54:31.481584 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.481557 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-546cdcb66d-p7rsx"] Apr 24 23:54:31.484481 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:31.484456 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac1b4af_a2a2_4bcf_b2ea_9456a735084d.slice/crio-ade415c2f2e8d935cc8f683a3ab045fcccff648c02d130bc84c1d91d5e6a93d1 WatchSource:0}: Error finding 
container ade415c2f2e8d935cc8f683a3ab045fcccff648c02d130bc84c1d91d5e6a93d1: Status 404 returned error can't find the container with id ade415c2f2e8d935cc8f683a3ab045fcccff648c02d130bc84c1d91d5e6a93d1 Apr 24 23:54:31.847594 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.847043 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:31.861587 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:31.861462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a94b1d1a-a27a-4b4f-8bec-ad4468a49f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7p88r\" (UID: \"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:32.055846 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.055813 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7xpht\"" Apr 24 23:54:32.063308 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.063265 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" Apr 24 23:54:32.245812 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.245749 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7p88r"] Apr 24 23:54:32.262600 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:32.262564 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda94b1d1a_a27a_4b4f_8bec_ad4468a49f04.slice/crio-a4e2008fa2ad6347f08edd96e0bf3eac02341f526be5a5364b3924937b69dbcc WatchSource:0}: Error finding container a4e2008fa2ad6347f08edd96e0bf3eac02341f526be5a5364b3924937b69dbcc: Status 404 returned error can't find the container with id a4e2008fa2ad6347f08edd96e0bf3eac02341f526be5a5364b3924937b69dbcc Apr 24 23:54:32.331446 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.331407 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" event={"ID":"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04","Type":"ContainerStarted","Data":"a4e2008fa2ad6347f08edd96e0bf3eac02341f526be5a5364b3924937b69dbcc"} Apr 24 23:54:32.334057 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.334025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" event={"ID":"9cfc1266-6758-44b5-9cbf-386bf602a3fc","Type":"ContainerStarted","Data":"17197e17a180664e93ec25175b0d19f8381893f77d8ab09fe226ef09fa8e7997"} Apr 24 23:54:32.336001 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.335968 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lrmcq" event={"ID":"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd","Type":"ContainerStarted","Data":"a88bd4abe2882b67cc5e8fd8495590a3a3dee2b2c7e21578216bf72c9560afb1"} Apr 24 
23:54:32.340865 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.340830 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-546cdcb66d-p7rsx" event={"ID":"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d","Type":"ContainerStarted","Data":"cf26d1d5fb72904334ebb2da7764d3897ee717f9335f7b9f2091ef8c33b3be2c"}
Apr 24 23:54:32.340865 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.340872 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-546cdcb66d-p7rsx" event={"ID":"3ac1b4af-a2a2-4bcf-b2ea-9456a735084d","Type":"ContainerStarted","Data":"ade415c2f2e8d935cc8f683a3ab045fcccff648c02d130bc84c1d91d5e6a93d1"}
Apr 24 23:54:32.360610 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.360544 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-546cdcb66d-p7rsx" podStartSLOduration=2.360524689 podStartE2EDuration="2.360524689s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:32.359624029 +0000 UTC m=+65.882908254" watchObservedRunningTime="2026-04-24 23:54:32.360524689 +0000 UTC m=+65.883808913"
Apr 24 23:54:32.757082 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.756638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:54:32.759490 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.759461 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:32.771561 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.771495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a7b82bd-bf6c-4091-8f48-64cea3e964a8-metrics-certs\") pod \"network-metrics-daemon-7wg4q\" (UID: \"4a7b82bd-bf6c-4091-8f48-64cea3e964a8\") " pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:54:32.833409 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.832993 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\""
Apr 24 23:54:32.847760 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:32.847309 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7wg4q"
Apr 24 23:54:33.055670 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:33.055595 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:33.060753 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:33.060528 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:33.344247 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:33.344149 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:33.345709 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:33.345683 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-546cdcb66d-p7rsx"
Apr 24 23:54:35.662448 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:35.662414 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7wg4q"]
Apr 24 23:54:35.793314 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:35.793034 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7b82bd_bf6c_4091_8f48_64cea3e964a8.slice/crio-0c52fe026897bdcaee35ee60a9466dd3431217745ff12717e6eea0345d806ded WatchSource:0}: Error finding container 0c52fe026897bdcaee35ee60a9466dd3431217745ff12717e6eea0345d806ded: Status 404 returned error can't find the container with id 0c52fe026897bdcaee35ee60a9466dd3431217745ff12717e6eea0345d806ded
Apr 24 23:54:36.134171 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.134130 2567 patch_prober.go:28] interesting pod/image-registry-55769f4fc8-zfbsw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 23:54:36.134336 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.134197 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 23:54:36.227879 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.227504 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-f9vfv"
Apr 24 23:54:36.370609 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.370526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w" event={"ID":"6e0c7dfd-78db-4373-bdab-9fddbacaac5d","Type":"ContainerStarted","Data":"3b0f7c97413d74ccd2c45b21b204fe44897ad54deb7aa817bfd0363cb358fc06"}
Apr 24 23:54:36.373561 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.373521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" event={"ID":"a94b1d1a-a27a-4b4f-8bec-ad4468a49f04","Type":"ContainerStarted","Data":"581092d95259ce4903879a83b495ad66935ee34c51e131ee4a7ff44f26b61bc4"}
Apr 24 23:54:36.377163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.377107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" event={"ID":"9cfc1266-6758-44b5-9cbf-386bf602a3fc","Type":"ContainerStarted","Data":"64c16a088293ca8b17189f2bf1f0a5002e78126802f470fa5583e20b155eea71"}
Apr 24 23:54:36.385163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.379759 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" event={"ID":"14d409e9-bd22-416d-934a-018d672f2b6b","Type":"ContainerStarted","Data":"b4b2a2615845f9676f8c1cc57f633be843930af6d37a714254a8f1fd9db86504"}
Apr 24 23:54:36.385163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.381747 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-lrmcq" event={"ID":"10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd","Type":"ContainerStarted","Data":"be52f6650f2009fb35ca7ed3086523391380214e3e664c75e2fa78c43acab5d8"}
Apr 24 23:54:36.385163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.383574 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wg4q" event={"ID":"4a7b82bd-bf6c-4091-8f48-64cea3e964a8","Type":"ContainerStarted","Data":"0c52fe026897bdcaee35ee60a9466dd3431217745ff12717e6eea0345d806ded"}
Apr 24 23:54:36.385900 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.385877 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9" event={"ID":"85b593ed-1f3d-4acb-aab4-c488e413a3cc","Type":"ContainerStarted","Data":"c8d7f2041fa946d199e21e38c7d6b208b5e050d2e2a889a3f04f8e84831b0b6b"}
Apr 24 23:54:36.386061 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.386047 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9" event={"ID":"85b593ed-1f3d-4acb-aab4-c488e413a3cc","Type":"ContainerStarted","Data":"90826897500ab528d7d2adfe6116986946cd98ab74b5bf2aadd403feaeda69ae"}
Apr 24 23:54:36.388202 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.388184 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" event={"ID":"a707c1f4-d5ea-444b-9ab4-37d50135c3c4","Type":"ContainerStarted","Data":"9b9ae62d86a573681684f0c4be39defab64b0d6942a2aa80fedc64dd45bce6d0"}
Apr 24 23:54:36.389035 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.389014 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:36.392888 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.391485 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j5w2w" podStartSLOduration=1.721162651 podStartE2EDuration="6.391472251s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.12288 +0000 UTC m=+64.646164215" lastFinishedPulling="2026-04-24 23:54:35.793189601 +0000 UTC m=+69.316473815" observedRunningTime="2026-04-24 23:54:36.390986083 +0000 UTC m=+69.914270310" watchObservedRunningTime="2026-04-24 23:54:36.391472251 +0000 UTC m=+69.914756471"
Apr 24 23:54:36.413163 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.412630 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7p88r" podStartSLOduration=62.840142097 podStartE2EDuration="1m6.412612154s" podCreationTimestamp="2026-04-24 23:53:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:32.265314271 +0000 UTC m=+65.788598477" lastFinishedPulling="2026-04-24 23:54:35.83778429 +0000 UTC m=+69.361068534" observedRunningTime="2026-04-24 23:54:36.412169634 +0000 UTC m=+69.935453857" watchObservedRunningTime="2026-04-24 23:54:36.412612154 +0000 UTC m=+69.935896379"
Apr 24 23:54:36.444560 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.444502 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ss7f9" podStartSLOduration=1.822671087 podStartE2EDuration="6.444483186s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.171414024 +0000 UTC m=+64.694698242" lastFinishedPulling="2026-04-24 23:54:35.793226139 +0000 UTC m=+69.316510341" observedRunningTime="2026-04-24 23:54:36.442717107 +0000 UTC m=+69.966001333" watchObservedRunningTime="2026-04-24 23:54:36.444483186 +0000 UTC m=+69.967767412"
Apr 24 23:54:36.495798 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.495030 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj" podStartSLOduration=1.97004479 podStartE2EDuration="6.495011873s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.268763245 +0000 UTC m=+64.792047447" lastFinishedPulling="2026-04-24 23:54:35.79373031 +0000 UTC m=+69.317014530" observedRunningTime="2026-04-24 23:54:36.494688106 +0000 UTC m=+70.017972332" watchObservedRunningTime="2026-04-24 23:54:36.495011873 +0000 UTC m=+70.018296098"
Apr 24 23:54:36.496170 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.496132 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ghjcw" podStartSLOduration=1.9475381980000002 podStartE2EDuration="6.496123544s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.245118301 +0000 UTC m=+64.768402504" lastFinishedPulling="2026-04-24 23:54:35.793703648 +0000 UTC m=+69.316987850" observedRunningTime="2026-04-24 23:54:36.471414448 +0000 UTC m=+69.994698674" watchObservedRunningTime="2026-04-24 23:54:36.496123544 +0000 UTC m=+70.019407770"
Apr 24 23:54:36.524805 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.523393 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xf6dv" podStartSLOduration=2.212775979 podStartE2EDuration="6.523355139s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.482698129 +0000 UTC m=+65.005982331" lastFinishedPulling="2026-04-24 23:54:35.793277277 +0000 UTC m=+69.316561491" observedRunningTime="2026-04-24 23:54:36.522735258 +0000 UTC m=+70.046019482" watchObservedRunningTime="2026-04-24 23:54:36.523355139 +0000 UTC m=+70.046639366"
Apr 24 23:54:36.557383 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:36.553794 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-lrmcq" podStartSLOduration=2.075343688 podStartE2EDuration="6.553771911s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.315293035 +0000 UTC m=+64.838577238" lastFinishedPulling="2026-04-24 23:54:35.793721246 +0000 UTC m=+69.317005461" observedRunningTime="2026-04-24 23:54:36.552814183 +0000 UTC m=+70.076098409" watchObservedRunningTime="2026-04-24 23:54:36.553771911 +0000 UTC m=+70.077056136"
Apr 24 23:54:37.241576 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:37.241546 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-vbnzj"
Apr 24 23:54:38.145967 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.145847 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwswl_f7461a6a-d0ad-4be6-93d1-2197570b77bb/dns/0.log"
Apr 24 23:54:38.276051 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.276024 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw"
Apr 24 23:54:38.324917 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.324890 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwswl_f7461a6a-d0ad-4be6-93d1-2197570b77bb/kube-rbac-proxy/0.log"
Apr 24 23:54:38.395612 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.395574 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wg4q" event={"ID":"4a7b82bd-bf6c-4091-8f48-64cea3e964a8","Type":"ContainerStarted","Data":"4fcc8b7a01b508cf04a9d628043a0b11d34e752114491cf301464fdafdfe4528"}
Apr 24 23:54:38.395809 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.395624 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7wg4q" event={"ID":"4a7b82bd-bf6c-4091-8f48-64cea3e964a8","Type":"ContainerStarted","Data":"b2b1599097ba62236e5ece2eb407846a0aeebb17f42a6669bca959e3be7b4330"}
Apr 24 23:54:38.413195 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.413094 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7wg4q" podStartSLOduration=69.722907349 podStartE2EDuration="1m11.413079145s" podCreationTimestamp="2026-04-24 23:53:27 +0000 UTC" firstStartedPulling="2026-04-24 23:54:35.794967739 +0000 UTC m=+69.318251941" lastFinishedPulling="2026-04-24 23:54:37.485139522 +0000 UTC m=+71.008423737" observedRunningTime="2026-04-24 23:54:38.411881896 +0000 UTC m=+71.935166146" watchObservedRunningTime="2026-04-24 23:54:38.413079145 +0000 UTC m=+71.936363406"
Apr 24 23:54:38.522056 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:38.522033 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-75bcc_323b58fb-a204-4400-a836-973ccf33cd8e/dns-node-resolver/0.log"
Apr 24 23:54:39.123578 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:39.123472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-55769f4fc8-zfbsw_59e59786-7ad7-4994-a0db-3d510271563f/registry/0.log"
Apr 24 23:54:39.722965 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:39.722933 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cjk8s_49ea4681-f36b-4e20-a2b5-d76f46611b7a/node-ca/0.log"
Apr 24 23:54:40.122640 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:40.122560 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-546cdcb66d-p7rsx_3ac1b4af-a2a2-4bcf-b2ea-9456a735084d/router/0.log"
Apr 24 23:54:40.321544 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:40.321511 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7gvnb_6522895a-2de1-4503-81b5-929a4a7a71b2/serve-healthcheck-canary/0.log"
Apr 24 23:54:44.917489 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.917446 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wfnsq"]
Apr 24 23:54:44.941257 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.941230 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:44.944131 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.944103 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:54:44.944266 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.944154 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:54:44.944266 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.944154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m6xfv\""
Apr 24 23:54:44.944266 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.944210 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:54:44.944266 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:44.944208 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:54:45.064825 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-textfile\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.064982 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.064982 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnj2\" (UniqueName: \"kubernetes.io/projected/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-kube-api-access-zcnj2\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.064982 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064926 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.064982 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-root\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.065183 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.064990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-wtmp\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.065183 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.065074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-tls\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.065183 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.065134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-metrics-client-ca\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.065183 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.065178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-sys\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166052 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-sys\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-textfile\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166116 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-sys\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166118 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166178 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnj2\" (UniqueName: \"kubernetes.io/projected/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-kube-api-access-zcnj2\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-root\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-root\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166398 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-wtmp\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-tls\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-metrics-client-ca\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.166573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.166558 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-wtmp\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.177257 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.177196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-textfile\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.180120 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.180078 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.180264 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.180154 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-metrics-client-ca\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.181904 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.181875 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.182120 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.182098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-node-exporter-tls\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.182205 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.182191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnj2\" (UniqueName: \"kubernetes.io/projected/59f163d5-53f4-4bdf-b71d-c0dd23cc0261-kube-api-access-zcnj2\") pod \"node-exporter-wfnsq\" (UID: \"59f163d5-53f4-4bdf-b71d-c0dd23cc0261\") " pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.251210 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.251171 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wfnsq"
Apr 24 23:54:45.260248 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:45.260216 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59f163d5_53f4_4bdf_b71d_c0dd23cc0261.slice/crio-c4900155ab322d0658f40d1b51955a9e72e21eda7e3efd154cdd9ca837d974ac WatchSource:0}: Error finding container c4900155ab322d0658f40d1b51955a9e72e21eda7e3efd154cdd9ca837d974ac: Status 404 returned error can't find the container with id c4900155ab322d0658f40d1b51955a9e72e21eda7e3efd154cdd9ca837d974ac
Apr 24 23:54:45.420298 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:45.420263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfnsq" event={"ID":"59f163d5-53f4-4bdf-b71d-c0dd23cc0261","Type":"ContainerStarted","Data":"c4900155ab322d0658f40d1b51955a9e72e21eda7e3efd154cdd9ca837d974ac"}
Apr 24 23:54:46.425075 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:46.424987 2567 generic.go:358] "Generic (PLEG): container finished" podID="59f163d5-53f4-4bdf-b71d-c0dd23cc0261" containerID="741105c54a8f1952f56330f3c64a2720435550d12c7a10de94d73f4d209ce54b" exitCode=0
Apr 24 23:54:46.425459 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:46.425071 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfnsq" event={"ID":"59f163d5-53f4-4bdf-b71d-c0dd23cc0261","Type":"ContainerDied","Data":"741105c54a8f1952f56330f3c64a2720435550d12c7a10de94d73f4d209ce54b"}
Apr 24 23:54:47.205734 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.205701 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vv7gt"]
Apr 24 23:54:47.210325 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.210300 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.213127 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.213105 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 23:54:47.214598 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.214576 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r9wdl\""
Apr 24 23:54:47.214705 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.214582 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 23:54:47.219340 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.219314 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vv7gt"]
Apr 24 23:54:47.281616 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.281577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/905b783e-82e9-4f73-8977-416326429b44-data-volume\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.281616 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.281615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/905b783e-82e9-4f73-8977-416326429b44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.281830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.281672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/905b783e-82e9-4f73-8977-416326429b44-crio-socket\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.281830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.281728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/905b783e-82e9-4f73-8977-416326429b44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.281830 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.281779 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfk6b\" (UniqueName: \"kubernetes.io/projected/905b783e-82e9-4f73-8977-416326429b44-kube-api-access-jfk6b\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382635 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/905b783e-82e9-4f73-8977-416326429b44-data-volume\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382635 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/905b783e-82e9-4f73-8977-416326429b44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382882 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/905b783e-82e9-4f73-8977-416326429b44-crio-socket\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382882 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/905b783e-82e9-4f73-8977-416326429b44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382882 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfk6b\" (UniqueName: \"kubernetes.io/projected/905b783e-82e9-4f73-8977-416326429b44-kube-api-access-jfk6b\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.382882 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.382821 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/905b783e-82e9-4f73-8977-416326429b44-crio-socket\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.383082 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.383050 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/905b783e-82e9-4f73-8977-416326429b44-data-volume\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.383226 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.383208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/905b783e-82e9-4f73-8977-416326429b44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.385075 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.385052 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/905b783e-82e9-4f73-8977-416326429b44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vv7gt\" (UID: \"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt"
Apr 24 23:54:47.391463 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.391443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfk6b\" (UniqueName: \"kubernetes.io/projected/905b783e-82e9-4f73-8977-416326429b44-kube-api-access-jfk6b\") pod \"insights-runtime-extractor-vv7gt\" (UID:
\"905b783e-82e9-4f73-8977-416326429b44\") " pod="openshift-insights/insights-runtime-extractor-vv7gt" Apr 24 23:54:47.429533 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.429495 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfnsq" event={"ID":"59f163d5-53f4-4bdf-b71d-c0dd23cc0261","Type":"ContainerStarted","Data":"d9589c9e000543ecc047232c38b11acc8e29bcac0bdcf042e8d3db3dd7fc2695"} Apr 24 23:54:47.429533 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.429537 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfnsq" event={"ID":"59f163d5-53f4-4bdf-b71d-c0dd23cc0261","Type":"ContainerStarted","Data":"2a2c39c281cbfd3414522a17637b2f04d4cf72f46ce64e64856e30b07f9d4f21"} Apr 24 23:54:47.448759 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.448712 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wfnsq" podStartSLOduration=2.644414409 podStartE2EDuration="3.448697148s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.261847799 +0000 UTC m=+78.785132015" lastFinishedPulling="2026-04-24 23:54:46.066130536 +0000 UTC m=+79.589414754" observedRunningTime="2026-04-24 23:54:47.447581961 +0000 UTC m=+80.970866211" watchObservedRunningTime="2026-04-24 23:54:47.448697148 +0000 UTC m=+80.971981372" Apr 24 23:54:47.521006 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.520929 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vv7gt" Apr 24 23:54:47.645339 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:47.645315 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vv7gt"] Apr 24 23:54:47.649291 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:54:47.649261 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod905b783e_82e9_4f73_8977_416326429b44.slice/crio-addc6d40972a96f60e9df7be3b6b8a3a453c15931b123e4fc3f9b6c808bb3d25 WatchSource:0}: Error finding container addc6d40972a96f60e9df7be3b6b8a3a453c15931b123e4fc3f9b6c808bb3d25: Status 404 returned error can't find the container with id addc6d40972a96f60e9df7be3b6b8a3a453c15931b123e4fc3f9b6c808bb3d25 Apr 24 23:54:48.434853 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:48.434818 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vv7gt" event={"ID":"905b783e-82e9-4f73-8977-416326429b44","Type":"ContainerStarted","Data":"f354be454ee7ad6b53dd87256737cd2c3b23bf989408ab14e8bc0b972100ab96"} Apr 24 23:54:48.434853 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:48.434857 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vv7gt" event={"ID":"905b783e-82e9-4f73-8977-416326429b44","Type":"ContainerStarted","Data":"2aab91038488ad40e72a30651f0808c492cb963d8f75b83be5bcd17bd51dbda7"} Apr 24 23:54:48.435255 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:48.434866 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vv7gt" event={"ID":"905b783e-82e9-4f73-8977-416326429b44","Type":"ContainerStarted","Data":"addc6d40972a96f60e9df7be3b6b8a3a453c15931b123e4fc3f9b6c808bb3d25"} Apr 24 23:54:50.442693 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:50.442652 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-vv7gt" event={"ID":"905b783e-82e9-4f73-8977-416326429b44","Type":"ContainerStarted","Data":"499a2959230b150712c073ce691f3a81e6c2588b1598c2627091d43550b415a1"} Apr 24 23:54:50.462664 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:50.462609 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vv7gt" podStartSLOduration=1.394967105 podStartE2EDuration="3.462593719s" podCreationTimestamp="2026-04-24 23:54:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.72799093 +0000 UTC m=+81.251275145" lastFinishedPulling="2026-04-24 23:54:49.795617538 +0000 UTC m=+83.318901759" observedRunningTime="2026-04-24 23:54:50.462559592 +0000 UTC m=+83.985843817" watchObservedRunningTime="2026-04-24 23:54:50.462593719 +0000 UTC m=+83.985877943" Apr 24 23:54:53.235131 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:54:53.235088 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"] Apr 24 23:55:18.254077 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.254028 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry" containerID="cri-o://7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f" gracePeriod=30 Apr 24 23:55:18.272058 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.272034 2567 patch_prober.go:28] interesting pod/image-registry-55769f4fc8-zfbsw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.132.0.7:5000/healthz\": dial tcp 10.132.0.7:5000: connect: connection refused" start-of-body= Apr 24 23:55:18.272155 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.272091 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry" probeResult="failure" output="Get \"https://10.132.0.7:5000/healthz\": dial tcp 10.132.0.7:5000: connect: connection refused" Apr 24 23:55:18.488757 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.488732 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:55:18.522622 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.522548 2567 generic.go:358] "Generic (PLEG): container finished" podID="59e59786-7ad7-4994-a0db-3d510271563f" containerID="7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f" exitCode=0 Apr 24 23:55:18.522622 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.522598 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" event={"ID":"59e59786-7ad7-4994-a0db-3d510271563f","Type":"ContainerDied","Data":"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f"} Apr 24 23:55:18.522800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.522626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" event={"ID":"59e59786-7ad7-4994-a0db-3d510271563f","Type":"ContainerDied","Data":"889beb5e46731e313aaa42b38dd2277193b8d466a695b394363695e2cf7177bc"} Apr 24 23:55:18.522800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.522642 2567 scope.go:117] "RemoveContainer" containerID="7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f" Apr 24 23:55:18.522800 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.522606 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55769f4fc8-zfbsw" Apr 24 23:55:18.531054 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.531034 2567 scope.go:117] "RemoveContainer" containerID="7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f" Apr 24 23:55:18.531318 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:55:18.531297 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f\": container with ID starting with 7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f not found: ID does not exist" containerID="7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f" Apr 24 23:55:18.531426 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.531329 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f"} err="failed to get container status \"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f\": rpc error: code = NotFound desc = could not find container \"7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f\": container with ID starting with 7de9e2282817be7c1388ac4a2c47d1020c529d7ca356b2b32c76e5382167021f not found: ID does not exist" Apr 24 23:55:18.543650 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543625 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543750 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543671 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543750 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543690 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543817 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543795 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543868 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543846 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543922 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543906 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.543991 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.543969 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wqf\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf\") 
pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.544063 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.544002 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets\") pod \"59e59786-7ad7-4994-a0db-3d510271563f\" (UID: \"59e59786-7ad7-4994-a0db-3d510271563f\") " Apr 24 23:55:18.544118 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.544056 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:18.544264 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.544223 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-registry-certificates\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.544264 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.544249 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:18.546075 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.546027 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:18.546437 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.546396 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:18.546541 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.546486 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf" (OuterVolumeSpecName: "kube-api-access-82wqf") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "kube-api-access-82wqf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:18.546541 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.546505 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:18.546681 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.546658 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:18.555591 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.555549 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "59e59786-7ad7-4994-a0db-3d510271563f" (UID: "59e59786-7ad7-4994-a0db-3d510271563f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:55:18.644667 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644621 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e59786-7ad7-4994-a0db-3d510271563f-trusted-ca\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644667 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644663 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e59786-7ad7-4994-a0db-3d510271563f-ca-trust-extracted\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644667 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644678 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-image-registry-private-configuration\") on node 
\"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644898 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644688 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82wqf\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-kube-api-access-82wqf\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644898 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644697 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e59786-7ad7-4994-a0db-3d510271563f-installation-pull-secrets\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644898 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644706 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-bound-sa-token\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.644898 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.644714 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e59786-7ad7-4994-a0db-3d510271563f-registry-tls\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:55:18.847338 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.847304 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"] Apr 24 23:55:18.850930 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:18.850906 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55769f4fc8-zfbsw"] Apr 24 23:55:19.019055 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:19.019020 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e59786-7ad7-4994-a0db-3d510271563f" path="/var/lib/kubelet/pods/59e59786-7ad7-4994-a0db-3d510271563f/volumes" 
Apr 24 23:55:35.572932 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:35.572891 2567 generic.go:358] "Generic (PLEG): container finished" podID="f025110a-88f1-46e6-9779-d4a470fe4338" containerID="35abab28105bfc97f3abd414f71c83a2be48789ed71fd12313edb9864349466f" exitCode=0
Apr 24 23:55:35.573340 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:35.572966 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" event={"ID":"f025110a-88f1-46e6-9779-d4a470fe4338","Type":"ContainerDied","Data":"35abab28105bfc97f3abd414f71c83a2be48789ed71fd12313edb9864349466f"}
Apr 24 23:55:35.573340 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:35.573288 2567 scope.go:117] "RemoveContainer" containerID="35abab28105bfc97f3abd414f71c83a2be48789ed71fd12313edb9864349466f"
Apr 24 23:55:36.577562 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:55:36.577526 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lh69z" event={"ID":"f025110a-88f1-46e6-9779-d4a470fe4338","Type":"ContainerStarted","Data":"a19af755ef9007635b9cfccd05642232e51db6f9c08fdfff6ae033fc14182add"}
Apr 24 23:57:11.019402 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.019350 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"]
Apr 24 23:57:11.019811 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.019604 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry"
Apr 24 23:57:11.019811 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.019616 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry"
Apr 24 23:57:11.019811 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.019667 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="59e59786-7ad7-4994-a0db-3d510271563f" containerName="registry"
Apr 24 23:57:11.022468 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.022447 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.025166 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.025144 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hhz8h\""
Apr 24 23:57:11.025256 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.025148 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 23:57:11.026260 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.026246 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 23:57:11.031081 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.031058 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"]
Apr 24 23:57:11.142570 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.142520 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.142781 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.142646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.142781 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.142741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2pnd\" (UniqueName: \"kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.243854 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.243815 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.244014 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.243873 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.244014 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.243908 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2pnd\" (UniqueName: \"kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.244237 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.244216 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.244293 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.244277 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.253539 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.253516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2pnd\" (UniqueName: \"kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.332162 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.332080 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:11.457145 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.457118 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"]
Apr 24 23:57:11.459672 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:57:11.459641 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2210bd8e_f9b5_4448_9e96_614adb07bff3.slice/crio-65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704 WatchSource:0}: Error finding container 65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704: Status 404 returned error can't find the container with id 65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704
Apr 24 23:57:11.833199 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:11.833165 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerStarted","Data":"65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704"}
Apr 24 23:57:17.853016 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:17.852976 2567 generic.go:358] "Generic (PLEG): container finished" podID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerID="c5e1a05c26310d9400524f3091d37d3936ba56ffa3e1fd7dd2cc1cbdec1d5a7e" exitCode=0
Apr 24 23:57:17.853421 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:17.853063 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerDied","Data":"c5e1a05c26310d9400524f3091d37d3936ba56ffa3e1fd7dd2cc1cbdec1d5a7e"}
Apr 24 23:57:20.868698 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:20.868657 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerStarted","Data":"917b3d5097b476dceb02da71750e5b3dbdc2aaed83384400c4c8d4c224d4484e"}
Apr 24 23:57:21.872983 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:21.872949 2567 generic.go:358] "Generic (PLEG): container finished" podID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerID="917b3d5097b476dceb02da71750e5b3dbdc2aaed83384400c4c8d4c224d4484e" exitCode=0
Apr 24 23:57:21.873351 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:21.872987 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerDied","Data":"917b3d5097b476dceb02da71750e5b3dbdc2aaed83384400c4c8d4c224d4484e"}
Apr 24 23:57:28.894445 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:28.894326 2567 generic.go:358] "Generic (PLEG): container finished" podID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerID="c3e1915090abcfdbff07e0b10a550cb140d4b326fadad9b1b2f76741b675cd95" exitCode=0
Apr 24 23:57:28.894445 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:28.894408 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerDied","Data":"c3e1915090abcfdbff07e0b10a550cb140d4b326fadad9b1b2f76741b675cd95"}
Apr 24 23:57:30.018035 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.018002 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q"
Apr 24 23:57:30.085992 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.085944 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util\") pod \"2210bd8e-f9b5-4448-9e96-614adb07bff3\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") "
Apr 24 23:57:30.085992 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.085995 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2pnd\" (UniqueName: \"kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd\") pod \"2210bd8e-f9b5-4448-9e96-614adb07bff3\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") "
Apr 24 23:57:30.086245 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.086066 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle\") pod \"2210bd8e-f9b5-4448-9e96-614adb07bff3\" (UID: \"2210bd8e-f9b5-4448-9e96-614adb07bff3\") "
Apr 24 23:57:30.086748 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.086725 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle" (OuterVolumeSpecName: "bundle") pod "2210bd8e-f9b5-4448-9e96-614adb07bff3" (UID: "2210bd8e-f9b5-4448-9e96-614adb07bff3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:30.088257 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.088229 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd" (OuterVolumeSpecName: "kube-api-access-l2pnd") pod "2210bd8e-f9b5-4448-9e96-614adb07bff3" (UID: "2210bd8e-f9b5-4448-9e96-614adb07bff3"). InnerVolumeSpecName "kube-api-access-l2pnd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:30.091584 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.091557 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util" (OuterVolumeSpecName: "util") pod "2210bd8e-f9b5-4448-9e96-614adb07bff3" (UID: "2210bd8e-f9b5-4448-9e96-614adb07bff3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:30.187419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.187389 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-util\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.187419 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.187420 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2pnd\" (UniqueName: \"kubernetes.io/projected/2210bd8e-f9b5-4448-9e96-614adb07bff3-kube-api-access-l2pnd\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.187612 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.187430 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2210bd8e-f9b5-4448-9e96-614adb07bff3-bundle\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\""
Apr 24 23:57:30.901836 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.901801 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" event={"ID":"2210bd8e-f9b5-4448-9e96-614adb07bff3","Type":"ContainerDied","Data":"65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704"} Apr 24 23:57:30.901836 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.901835 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65a40e593c391500470b5c21807e10e0b35bf7b75549094ecf30c281d470c704" Apr 24 23:57:30.902039 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:30.901845 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c77f9q" Apr 24 23:57:32.658105 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658073 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl"] Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658404 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="util" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658416 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="util" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658432 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="pull" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658440 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="pull" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658453 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="extract" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658461 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="extract" Apr 24 23:57:32.658548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.658512 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2210bd8e-f9b5-4448-9e96-614adb07bff3" containerName="extract" Apr 24 23:57:32.665182 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.665165 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.667791 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.667765 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 23:57:32.667791 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.667780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 23:57:32.668750 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.668723 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-l7g6j\"" Apr 24 23:57:32.668910 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.668891 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 23:57:32.669483 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.669460 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl"] Apr 24 23:57:32.705376 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.705339 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mccz\" (UniqueName: 
\"kubernetes.io/projected/c60182ec-13c1-4067-b2e3-14c59852625a-kube-api-access-7mccz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: \"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.705520 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.705405 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c60182ec-13c1-4067-b2e3-14c59852625a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: \"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.806461 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.806424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mccz\" (UniqueName: \"kubernetes.io/projected/c60182ec-13c1-4067-b2e3-14c59852625a-kube-api-access-7mccz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: \"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.806656 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.806468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c60182ec-13c1-4067-b2e3-14c59852625a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: \"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.808923 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.808896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c60182ec-13c1-4067-b2e3-14c59852625a-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: 
\"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.815773 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.815742 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mccz\" (UniqueName: \"kubernetes.io/projected/c60182ec-13c1-4067-b2e3-14c59852625a-kube-api-access-7mccz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl\" (UID: \"c60182ec-13c1-4067-b2e3-14c59852625a\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:32.976201 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:32.976158 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:33.112460 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:33.112428 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl"] Apr 24 23:57:33.116261 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:57:33.116231 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60182ec_13c1_4067_b2e3_14c59852625a.slice/crio-6cece492e5056c0a182ffc16f2d1b47bc776c1b7c1e91be643dbf7cc591939de WatchSource:0}: Error finding container 6cece492e5056c0a182ffc16f2d1b47bc776c1b7c1e91be643dbf7cc591939de: Status 404 returned error can't find the container with id 6cece492e5056c0a182ffc16f2d1b47bc776c1b7c1e91be643dbf7cc591939de Apr 24 23:57:33.913664 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:33.913623 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" event={"ID":"c60182ec-13c1-4067-b2e3-14c59852625a","Type":"ContainerStarted","Data":"6cece492e5056c0a182ffc16f2d1b47bc776c1b7c1e91be643dbf7cc591939de"} Apr 24 23:57:39.932844 ip-10-0-135-201 kubenswrapper[2567]: 
I0424 23:57:39.932811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" event={"ID":"c60182ec-13c1-4067-b2e3-14c59852625a","Type":"ContainerStarted","Data":"28f7bc062032b2511b194afc3a837dc0e7ec079408ef15c39451afcffe0cc2ee"} Apr 24 23:57:39.933288 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:39.932962 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" Apr 24 23:57:39.957095 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:39.957042 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl" podStartSLOduration=1.48702921 podStartE2EDuration="7.957027847s" podCreationTimestamp="2026-04-24 23:57:32 +0000 UTC" firstStartedPulling="2026-04-24 23:57:33.118159775 +0000 UTC m=+246.641443995" lastFinishedPulling="2026-04-24 23:57:39.58815843 +0000 UTC m=+253.111442632" observedRunningTime="2026-04-24 23:57:39.955565236 +0000 UTC m=+253.478849457" watchObservedRunningTime="2026-04-24 23:57:39.957027847 +0000 UTC m=+253.480312071" Apr 24 23:57:40.105027 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.104983 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-szzgk"] Apr 24 23:57:40.108333 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.108313 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.111228 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.111206 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 23:57:40.111334 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.111211 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 23:57:40.111497 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.111484 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-crdjw\"" Apr 24 23:57:40.116307 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.116284 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-szzgk"] Apr 24 23:57:40.168249 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.168214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jq9\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-kube-api-access-d6jq9\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.168433 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.168264 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.168433 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.168337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/2b391a4e-09c6-479b-956c-160bdacfada0-cabundle0\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.269160 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.269063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2b391a4e-09c6-479b-956c-160bdacfada0-cabundle0\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.269160 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.269128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jq9\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-kube-api-access-d6jq9\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.269171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.269291 2567 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.269309 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.269319 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret 
key: ca.crt Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.269335 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-szzgk: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 23:57:40.269420 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.269405 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates podName:2b391a4e-09c6-479b-956c-160bdacfada0 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:40.769385571 +0000 UTC m=+254.292669777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates") pod "keda-operator-ffbb595cb-szzgk" (UID: "2b391a4e-09c6-479b-956c-160bdacfada0") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 23:57:40.269888 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.269865 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2b391a4e-09c6-479b-956c-160bdacfada0-cabundle0\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.280704 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.280677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jq9\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-kube-api-access-d6jq9\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.513273 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.513232 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"] Apr 24 23:57:40.516628 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.516609 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.519754 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.519702 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 23:57:40.525859 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.525834 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"] Apr 24 23:57:40.571770 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.571733 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptsfr\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-kube-api-access-ptsfr\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.571957 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.571824 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.571957 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.571884 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a93aa261-066c-4b8b-8140-c2ab0b175190-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.673267 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.673196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.673290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a93aa261-066c-4b8b-8140-c2ab0b175190-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.673353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptsfr\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-kube-api-access-ptsfr\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.673388 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.673412 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.673432 2567 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 24 23:57:40.673465 ip-10-0-135-201 kubenswrapper[2567]: 
E0424 23:57:40.673454 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 23:57:40.673790 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.673524 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates podName:a93aa261-066c-4b8b-8140-c2ab0b175190 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:41.173502622 +0000 UTC m=+254.696786891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates") pod "keda-metrics-apiserver-7c9f485588-2gxdv" (UID: "a93aa261-066c-4b8b-8140-c2ab0b175190") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 23:57:40.673790 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.673713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a93aa261-066c-4b8b-8140-c2ab0b175190-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.685259 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.685224 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptsfr\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-kube-api-access-ptsfr\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" Apr 24 23:57:40.730057 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.730025 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-698zl"] Apr 24 23:57:40.733245 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.733227 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-698zl" Apr 24 23:57:40.738775 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.738753 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 23:57:40.744415 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.744390 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-698zl"] Apr 24 23:57:40.774195 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.774117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdpv\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-kube-api-access-lfdpv\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl" Apr 24 23:57:40.774195 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.774190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk" Apr 24 23:57:40.774390 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.774274 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl" Apr 24 23:57:40.774442 
ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.774381 2567 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 23:57:40.774442 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.774408 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 23:57:40.774442 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.774418 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 23:57:40.774442 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.774432 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-szzgk: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 23:57:40.774571 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.774486 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates podName:2b391a4e-09c6-479b-956c-160bdacfada0 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:41.774467212 +0000 UTC m=+255.297751421 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates") pod "keda-operator-ffbb595cb-szzgk" (UID: "2b391a4e-09c6-479b-956c-160bdacfada0") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 23:57:40.875692 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.875653 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl" Apr 24 23:57:40.875888 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.875727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdpv\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-kube-api-access-lfdpv\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl" Apr 24 23:57:40.875888 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.875829 2567 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 23:57:40.875888 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.875862 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-698zl: secret "keda-admission-webhooks-certs" not found Apr 24 23:57:40.876048 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:40.875926 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates podName:0d665b64-e024-4ae3-a8c4-9676eebc8cef nodeName:}" failed. 
No retries permitted until 2026-04-24 23:57:41.375905512 +0000 UTC m=+254.899189717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates") pod "keda-admission-cf49989db-698zl" (UID: "0d665b64-e024-4ae3-a8c4-9676eebc8cef") : secret "keda-admission-webhooks-certs" not found
Apr 24 23:57:40.886770 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:40.886740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdpv\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-kube-api-access-lfdpv\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:57:41.178529 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.178499 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:41.178909 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.178649 2567 secret.go:281] references non-existent secret key: tls.crt
Apr 24 23:57:41.178909 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.178666 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 23:57:41.178909 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.178684 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv: references non-existent secret key: tls.crt
Apr 24 23:57:41.178909 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.178738 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates podName:a93aa261-066c-4b8b-8140-c2ab0b175190 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:42.17872096 +0000 UTC m=+255.702005181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates") pod "keda-metrics-apiserver-7c9f485588-2gxdv" (UID: "a93aa261-066c-4b8b-8140-c2ab0b175190") : references non-existent secret key: tls.crt
Apr 24 23:57:41.380269 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.380232 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:57:41.383031 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.382995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0d665b64-e024-4ae3-a8c4-9676eebc8cef-certificates\") pod \"keda-admission-cf49989db-698zl\" (UID: \"0d665b64-e024-4ae3-a8c4-9676eebc8cef\") " pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:57:41.644254 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.644154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:57:41.767880 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.767791 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-698zl"]
Apr 24 23:57:41.770515 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:57:41.770480 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d665b64_e024_4ae3_a8c4_9676eebc8cef.slice/crio-4f94909c03e5da30b9a4aea61352c3cbea982d92d5e0fc525da425737b4973f9 WatchSource:0}: Error finding container 4f94909c03e5da30b9a4aea61352c3cbea982d92d5e0fc525da425737b4973f9: Status 404 returned error can't find the container with id 4f94909c03e5da30b9a4aea61352c3cbea982d92d5e0fc525da425737b4973f9
Apr 24 23:57:41.784136 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.784102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:41.784284 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.784258 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 24 23:57:41.784284 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.784279 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 23:57:41.784410 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.784292 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-szzgk: references non-existent secret key: ca.crt
Apr 24 23:57:41.784410 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:41.784356 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates podName:2b391a4e-09c6-479b-956c-160bdacfada0 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:43.784336761 +0000 UTC m=+257.307620977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates") pod "keda-operator-ffbb595cb-szzgk" (UID: "2b391a4e-09c6-479b-956c-160bdacfada0") : references non-existent secret key: ca.crt
Apr 24 23:57:41.944294 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:41.944253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-698zl" event={"ID":"0d665b64-e024-4ae3-a8c4-9676eebc8cef","Type":"ContainerStarted","Data":"4f94909c03e5da30b9a4aea61352c3cbea982d92d5e0fc525da425737b4973f9"}
Apr 24 23:57:42.186874 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:42.186839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:42.187233 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:42.186938 2567 secret.go:281] references non-existent secret key: tls.crt
Apr 24 23:57:42.187233 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:42.186951 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 23:57:42.187233 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:42.186969 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv: references non-existent secret key: tls.crt
Apr 24 23:57:42.187233 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:42.187024 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates podName:a93aa261-066c-4b8b-8140-c2ab0b175190 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:44.187007831 +0000 UTC m=+257.710292039 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates") pod "keda-metrics-apiserver-7c9f485588-2gxdv" (UID: "a93aa261-066c-4b8b-8140-c2ab0b175190") : references non-existent secret key: tls.crt
Apr 24 23:57:43.802519 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:43.802483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:43.802983 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:43.802625 2567 secret.go:281] references non-existent secret key: ca.crt
Apr 24 23:57:43.802983 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:43.802644 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 23:57:43.802983 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:43.802655 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-szzgk: references non-existent secret key: ca.crt
Apr 24 23:57:43.802983 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:57:43.802727 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates podName:2b391a4e-09c6-479b-956c-160bdacfada0 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:47.802707056 +0000 UTC m=+261.325991276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates") pod "keda-operator-ffbb595cb-szzgk" (UID: "2b391a4e-09c6-479b-956c-160bdacfada0") : references non-existent secret key: ca.crt
Apr 24 23:57:44.205593 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.205562 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:44.208206 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.208182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a93aa261-066c-4b8b-8140-c2ab0b175190-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2gxdv\" (UID: \"a93aa261-066c-4b8b-8140-c2ab0b175190\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:44.430502 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.430465 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:44.780677 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.780645 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"]
Apr 24 23:57:44.783107 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:57:44.783077 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda93aa261_066c_4b8b_8140_c2ab0b175190.slice/crio-20f4f1e912aafae20afb56dc369381f3f1f694446882516946ef0e5a5fbf771b WatchSource:0}: Error finding container 20f4f1e912aafae20afb56dc369381f3f1f694446882516946ef0e5a5fbf771b: Status 404 returned error can't find the container with id 20f4f1e912aafae20afb56dc369381f3f1f694446882516946ef0e5a5fbf771b
Apr 24 23:57:44.955254 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.955221 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-698zl" event={"ID":"0d665b64-e024-4ae3-a8c4-9676eebc8cef","Type":"ContainerStarted","Data":"1319fe28e1d3975a57901d93e8a8330f1708581c0a389c8b6bb1f123f5ff2562"}
Apr 24 23:57:44.955709 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.955324 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:57:44.956308 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.956289 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" event={"ID":"a93aa261-066c-4b8b-8140-c2ab0b175190","Type":"ContainerStarted","Data":"20f4f1e912aafae20afb56dc369381f3f1f694446882516946ef0e5a5fbf771b"}
Apr 24 23:57:44.972598 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:44.972548 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-698zl" podStartSLOduration=2.020268768 podStartE2EDuration="4.97253441s" podCreationTimestamp="2026-04-24 23:57:40 +0000 UTC" firstStartedPulling="2026-04-24 23:57:41.771862088 +0000 UTC m=+255.295146290" lastFinishedPulling="2026-04-24 23:57:44.724127724 +0000 UTC m=+258.247411932" observedRunningTime="2026-04-24 23:57:44.971473489 +0000 UTC m=+258.494757714" watchObservedRunningTime="2026-04-24 23:57:44.97253441 +0000 UTC m=+258.495818634"
Apr 24 23:57:47.835038 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:47.835001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:47.837706 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:47.837679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2b391a4e-09c6-479b-956c-160bdacfada0-certificates\") pod \"keda-operator-ffbb595cb-szzgk\" (UID: \"2b391a4e-09c6-479b-956c-160bdacfada0\") " pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:47.919938 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:47.919894 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:48.614125 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:48.614103 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-szzgk"]
Apr 24 23:57:48.623448 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:57:48.623419 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b391a4e_09c6_479b_956c_160bdacfada0.slice/crio-3c73899275b14428bd635661ce0fb38786de2b32cb2d787c1cb3774066b7060d WatchSource:0}: Error finding container 3c73899275b14428bd635661ce0fb38786de2b32cb2d787c1cb3774066b7060d: Status 404 returned error can't find the container with id 3c73899275b14428bd635661ce0fb38786de2b32cb2d787c1cb3774066b7060d
Apr 24 23:57:48.969508 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:48.969472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-szzgk" event={"ID":"2b391a4e-09c6-479b-956c-160bdacfada0","Type":"ContainerStarted","Data":"3c73899275b14428bd635661ce0fb38786de2b32cb2d787c1cb3774066b7060d"}
Apr 24 23:57:48.970816 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:48.970781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" event={"ID":"a93aa261-066c-4b8b-8140-c2ab0b175190","Type":"ContainerStarted","Data":"3cebbb8771235a9fa96069285cb39a2309563bf4200a4b33dba09b56c4aa18b5"}
Apr 24 23:57:48.971030 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:48.971015 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:57:48.988550 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:48.988495 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv" podStartSLOduration=5.216568312 podStartE2EDuration="8.988478234s" podCreationTimestamp="2026-04-24 23:57:40 +0000 UTC" firstStartedPulling="2026-04-24 23:57:44.784439769 +0000 UTC m=+258.307723972" lastFinishedPulling="2026-04-24 23:57:48.556349661 +0000 UTC m=+262.079633894" observedRunningTime="2026-04-24 23:57:48.98742616 +0000 UTC m=+262.510710382" watchObservedRunningTime="2026-04-24 23:57:48.988478234 +0000 UTC m=+262.511762457"
Apr 24 23:57:52.986142 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:52.986102 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-szzgk" event={"ID":"2b391a4e-09c6-479b-956c-160bdacfada0","Type":"ContainerStarted","Data":"4d6728adb2e6a8719de4a97202d57955ba48cfa95121ca9faf47ef7c2846d6b9"}
Apr 24 23:57:52.986570 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:52.986215 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:57:53.005310 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:53.005256 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-szzgk" podStartSLOduration=9.344373218 podStartE2EDuration="13.005243501s" podCreationTimestamp="2026-04-24 23:57:40 +0000 UTC" firstStartedPulling="2026-04-24 23:57:48.624828615 +0000 UTC m=+262.148112818" lastFinishedPulling="2026-04-24 23:57:52.285698897 +0000 UTC m=+265.808983101" observedRunningTime="2026-04-24 23:57:53.003465385 +0000 UTC m=+266.526749609" watchObservedRunningTime="2026-04-24 23:57:53.005243501 +0000 UTC m=+266.528527726"
Apr 24 23:57:59.979349 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:57:59.979315 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2gxdv"
Apr 24 23:58:00.938020 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:00.937985 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2bmfl"
Apr 24 23:58:05.962338 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:05.962303 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-698zl"
Apr 24 23:58:13.992426 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:13.992394 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-szzgk"
Apr 24 23:58:26.931500 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:26.931469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log"
Apr 24 23:58:26.931950 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:26.931927 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log"
Apr 24 23:58:26.938666 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:26.938642 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 23:58:43.186180 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.186141 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"]
Apr 24 23:58:43.189564 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.189546 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.192212 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.192180 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 24 23:58:43.192212 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.192191 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 23:58:43.193172 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.193151 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-t76xb\""
Apr 24 23:58:43.193243 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.193174 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 23:58:43.200636 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.200614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"]
Apr 24 23:58:43.221225 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.221189 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7j9xt"]
Apr 24 23:58:43.224314 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.224297 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.227192 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.227169 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6c5nr\""
Apr 24 23:58:43.227324 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.227307 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 23:58:43.233972 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.233950 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7j9xt"]
Apr 24 23:58:43.274298 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.274269 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.274495 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.274353 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e264160c-0e57-4a81-b81f-19161f89fec1-data\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.274495 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.274424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-595pv\" (UniqueName: \"kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.274607 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.274492 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmckf\" (UniqueName: \"kubernetes.io/projected/e264160c-0e57-4a81-b81f-19161f89fec1-kube-api-access-jmckf\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.374930 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.374893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e264160c-0e57-4a81-b81f-19161f89fec1-data\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.375098 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.374945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-595pv\" (UniqueName: \"kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.375098 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.374984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmckf\" (UniqueName: \"kubernetes.io/projected/e264160c-0e57-4a81-b81f-19161f89fec1-kube-api-access-jmckf\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.375098 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.375030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.375402 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.375347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e264160c-0e57-4a81-b81f-19161f89fec1-data\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.377418 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.377396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.383385 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.383349 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmckf\" (UniqueName: \"kubernetes.io/projected/e264160c-0e57-4a81-b81f-19161f89fec1-kube-api-access-jmckf\") pod \"seaweedfs-86cc847c5c-7j9xt\" (UID: \"e264160c-0e57-4a81-b81f-19161f89fec1\") " pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.383706 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.383684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-595pv\" (UniqueName: \"kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv\") pod \"kserve-controller-manager-64c4d9588d-jk6ql\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") " pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.501461 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.501380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:43.535858 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.535824 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:43.629548 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.629520 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"]
Apr 24 23:58:43.631971 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:58:43.631931 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a4ed91_83cd_4c2e_8861_b0e410a2e2e7.slice/crio-acab37659d517a19f1e66f109b3b9abc64c55f7b5a210a344c668c41c1192a7d WatchSource:0}: Error finding container acab37659d517a19f1e66f109b3b9abc64c55f7b5a210a344c668c41c1192a7d: Status 404 returned error can't find the container with id acab37659d517a19f1e66f109b3b9abc64c55f7b5a210a344c668c41c1192a7d
Apr 24 23:58:43.633248 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.633230 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:58:43.664437 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:43.664401 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7j9xt"]
Apr 24 23:58:43.667847 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:58:43.667817 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode264160c_0e57_4a81_b81f_19161f89fec1.slice/crio-c6d0638ff322e65690a21220ad4d660b954b522d9477b3675190981c329bb8e4 WatchSource:0}: Error finding container c6d0638ff322e65690a21220ad4d660b954b522d9477b3675190981c329bb8e4: Status 404 returned error can't find the container with id c6d0638ff322e65690a21220ad4d660b954b522d9477b3675190981c329bb8e4
Apr 24 23:58:44.143827 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:44.143785 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" event={"ID":"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7","Type":"ContainerStarted","Data":"acab37659d517a19f1e66f109b3b9abc64c55f7b5a210a344c668c41c1192a7d"}
Apr 24 23:58:44.145041 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:44.145013 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7j9xt" event={"ID":"e264160c-0e57-4a81-b81f-19161f89fec1","Type":"ContainerStarted","Data":"c6d0638ff322e65690a21220ad4d660b954b522d9477b3675190981c329bb8e4"}
Apr 24 23:58:48.161116 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.161075 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7j9xt" event={"ID":"e264160c-0e57-4a81-b81f-19161f89fec1","Type":"ContainerStarted","Data":"7ff7d0751cd02f467b2764ce77de0654913f84c3f58c4f0f81019cbda50cfa4e"}
Apr 24 23:58:48.161606 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.161150 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:58:48.162439 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.162412 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" event={"ID":"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7","Type":"ContainerStarted","Data":"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424"}
Apr 24 23:58:48.162573 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.162534 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:58:48.179094 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.179045 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-7j9xt" podStartSLOduration=1.519480255 podStartE2EDuration="5.179032461s" podCreationTimestamp="2026-04-24 23:58:43 +0000 UTC" firstStartedPulling="2026-04-24 23:58:43.66913031 +0000 UTC m=+317.192414511" lastFinishedPulling="2026-04-24 23:58:47.328682496 +0000 UTC m=+320.851966717" observedRunningTime="2026-04-24 23:58:48.176964996 +0000 UTC m=+321.700249219" watchObservedRunningTime="2026-04-24 23:58:48.179032461 +0000 UTC m=+321.702316682"
Apr 24 23:58:48.192120 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:48.192068 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" podStartSLOduration=1.592805104 podStartE2EDuration="5.192054049s" podCreationTimestamp="2026-04-24 23:58:43 +0000 UTC" firstStartedPulling="2026-04-24 23:58:43.633375379 +0000 UTC m=+317.156659594" lastFinishedPulling="2026-04-24 23:58:47.232624335 +0000 UTC m=+320.755908539" observedRunningTime="2026-04-24 23:58:48.19096777 +0000 UTC m=+321.714251994" watchObservedRunningTime="2026-04-24 23:58:48.192054049 +0000 UTC m=+321.715338273"
Apr 24 23:58:54.168497 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:58:54.168465 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7j9xt"
Apr 24 23:59:18.296813 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.296730 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"]
Apr 24 23:59:18.297336 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.296993 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" podUID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" containerName="manager" containerID="cri-o://65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424" gracePeriod=10
Apr 24 23:59:18.301615 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.301591 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:59:18.317630 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.317605 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-z25pd"]
Apr 24 23:59:18.320626 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.320611 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.330249 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.330223 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-z25pd"]
Apr 24 23:59:18.456492 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.456448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2z4\" (UniqueName: \"kubernetes.io/projected/9804cf9b-0b68-451a-8af2-cf218e988412-kube-api-access-vw2z4\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.456723 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.456616 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9804cf9b-0b68-451a-8af2-cf218e988412-cert\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.534884 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.534862 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql"
Apr 24 23:59:18.557647 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.557566 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2z4\" (UniqueName: \"kubernetes.io/projected/9804cf9b-0b68-451a-8af2-cf218e988412-kube-api-access-vw2z4\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.557798 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.557654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9804cf9b-0b68-451a-8af2-cf218e988412-cert\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.560351 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.560324 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9804cf9b-0b68-451a-8af2-cf218e988412-cert\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.566124 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.566100 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2z4\" (UniqueName: \"kubernetes.io/projected/9804cf9b-0b68-451a-8af2-cf218e988412-kube-api-access-vw2z4\") pod \"kserve-controller-manager-64c4d9588d-z25pd\" (UID: \"9804cf9b-0b68-451a-8af2-cf218e988412\") " pod="kserve/kserve-controller-manager-64c4d9588d-z25pd"
Apr 24 23:59:18.658687 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.658647 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-595pv\" (UniqueName: \"kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv\") pod \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") "
Apr 24 23:59:18.658848 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.658725 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert\") pod \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\" (UID: \"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7\") "
Apr 24 23:59:18.660809 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.660779 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert" (OuterVolumeSpecName: "cert") pod "c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" (UID: "c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:59:18.660910 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.660819 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv" (OuterVolumeSpecName: "kube-api-access-595pv") pod "c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" (UID: "c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7"). InnerVolumeSpecName "kube-api-access-595pv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:59:18.684023 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.683996 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" Apr 24 23:59:18.760286 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.759991 2567 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-cert\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:59:18.760286 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.760068 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-595pv\" (UniqueName: \"kubernetes.io/projected/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7-kube-api-access-595pv\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 24 23:59:18.804043 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:18.804022 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-z25pd"] Apr 24 23:59:18.806647 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:59:18.806623 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9804cf9b_0b68_451a_8af2_cf218e988412.slice/crio-dd647dbe281a1dd3b5b8008cc7800036bb2dcc6eb9ab9da3f3fcccc880c57df6 WatchSource:0}: Error finding container dd647dbe281a1dd3b5b8008cc7800036bb2dcc6eb9ab9da3f3fcccc880c57df6: Status 404 returned error can't find the container with id dd647dbe281a1dd3b5b8008cc7800036bb2dcc6eb9ab9da3f3fcccc880c57df6 Apr 24 23:59:19.261270 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.261241 2567 generic.go:358] "Generic (PLEG): container finished" podID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" containerID="65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424" exitCode=0 Apr 24 23:59:19.261475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.261300 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" Apr 24 23:59:19.261475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.261312 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" event={"ID":"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7","Type":"ContainerDied","Data":"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424"} Apr 24 23:59:19.261475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.261345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-jk6ql" event={"ID":"c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7","Type":"ContainerDied","Data":"acab37659d517a19f1e66f109b3b9abc64c55f7b5a210a344c668c41c1192a7d"} Apr 24 23:59:19.261475 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.261382 2567 scope.go:117] "RemoveContainer" containerID="65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424" Apr 24 23:59:19.262856 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.262833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" event={"ID":"9804cf9b-0b68-451a-8af2-cf218e988412","Type":"ContainerStarted","Data":"d09161bc3a5a4e2773732c7a6d4a5ad06538d4dfd931768add17f91d6dcd98e5"} Apr 24 23:59:19.262957 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.262867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" event={"ID":"9804cf9b-0b68-451a-8af2-cf218e988412","Type":"ContainerStarted","Data":"dd647dbe281a1dd3b5b8008cc7800036bb2dcc6eb9ab9da3f3fcccc880c57df6"} Apr 24 23:59:19.262957 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.262941 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" Apr 24 23:59:19.269204 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.269185 2567 scope.go:117] "RemoveContainer" 
containerID="65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424" Apr 24 23:59:19.269516 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:59:19.269489 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424\": container with ID starting with 65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424 not found: ID does not exist" containerID="65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424" Apr 24 23:59:19.269589 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.269520 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424"} err="failed to get container status \"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424\": rpc error: code = NotFound desc = could not find container \"65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424\": container with ID starting with 65d9709fe3399e11c22a2a2c4da0dcaa4c9222f448530563f2bc76c009707424 not found: ID does not exist" Apr 24 23:59:19.283700 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.283659 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" podStartSLOduration=0.947715299 podStartE2EDuration="1.283648401s" podCreationTimestamp="2026-04-24 23:59:18 +0000 UTC" firstStartedPulling="2026-04-24 23:59:18.807791974 +0000 UTC m=+352.331076175" lastFinishedPulling="2026-04-24 23:59:19.143725071 +0000 UTC m=+352.667009277" observedRunningTime="2026-04-24 23:59:19.282652746 +0000 UTC m=+352.805936969" watchObservedRunningTime="2026-04-24 23:59:19.283648401 +0000 UTC m=+352.806932626" Apr 24 23:59:19.296990 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.296956 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"] Apr 24 23:59:19.301050 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:19.301024 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-jk6ql"] Apr 24 23:59:21.018932 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:21.018896 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" path="/var/lib/kubelet/pods/c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7/volumes" Apr 24 23:59:50.272326 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:50.272297 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-z25pd" Apr 24 23:59:51.072775 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.072738 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-6dbhf"] Apr 24 23:59:51.073189 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.073167 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" containerName="manager" Apr 24 23:59:51.073189 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.073189 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" containerName="manager" Apr 24 23:59:51.073395 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.073284 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9a4ed91-83cd-4c2e-8861-b0e410a2e2e7" containerName="manager" Apr 24 23:59:51.076576 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.076557 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.079214 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.079190 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 23:59:51.079325 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.079288 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-mx9gg\"" Apr 24 23:59:51.085652 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.085549 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-6dbhf"] Apr 24 23:59:51.087458 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.087432 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-tm2ls"] Apr 24 23:59:51.090955 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.090936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.093338 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.093321 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-7qnb6\"" Apr 24 23:59:51.093465 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.093320 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 23:59:51.097859 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.097836 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tm2ls"] Apr 24 23:59:51.111684 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.111658 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrj64\" (UniqueName: \"kubernetes.io/projected/5560e36b-2edf-4de5-ac0a-211e70d3f96c-kube-api-access-xrj64\") pod 
\"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.111809 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.111702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/a92eeaad-f062-43cf-8f96-700aab2605ec-kube-api-access-kp2fs\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.111809 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.111771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.111880 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.111863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.213035 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.212997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/a92eeaad-f062-43cf-8f96-700aab2605ec-kube-api-access-kp2fs\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.213035 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.213048 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.213275 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.213111 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.213275 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:59:51.213234 2567 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 23:59:51.213346 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:59:51.213306 2567 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 23:59:51.213408 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:59:51.213309 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert podName:a92eeaad-f062-43cf-8f96-700aab2605ec nodeName:}" failed. No retries permitted until 2026-04-24 23:59:51.713288436 +0000 UTC m=+385.236572643 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert") pod "odh-model-controller-696fc77849-tm2ls" (UID: "a92eeaad-f062-43cf-8f96-700aab2605ec") : secret "odh-model-controller-webhook-cert" not found Apr 24 23:59:51.213408 ip-10-0-135-201 kubenswrapper[2567]: E0424 23:59:51.213389 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs podName:5560e36b-2edf-4de5-ac0a-211e70d3f96c nodeName:}" failed. No retries permitted until 2026-04-24 23:59:51.713357739 +0000 UTC m=+385.236641947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs") pod "model-serving-api-86f7b4b499-6dbhf" (UID: "5560e36b-2edf-4de5-ac0a-211e70d3f96c") : secret "model-serving-api-tls" not found Apr 24 23:59:51.213496 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.213419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrj64\" (UniqueName: \"kubernetes.io/projected/5560e36b-2edf-4de5-ac0a-211e70d3f96c-kube-api-access-xrj64\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.223897 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.223873 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrj64\" (UniqueName: \"kubernetes.io/projected/5560e36b-2edf-4de5-ac0a-211e70d3f96c-kube-api-access-xrj64\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.224263 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.224242 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2fs\" (UniqueName: 
\"kubernetes.io/projected/a92eeaad-f062-43cf-8f96-700aab2605ec-kube-api-access-kp2fs\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.717906 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.717875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.718281 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.717935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.720561 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.720535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5560e36b-2edf-4de5-ac0a-211e70d3f96c-tls-certs\") pod \"model-serving-api-86f7b4b499-6dbhf\" (UID: \"5560e36b-2edf-4de5-ac0a-211e70d3f96c\") " pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:51.720670 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.720535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92eeaad-f062-43cf-8f96-700aab2605ec-cert\") pod \"odh-model-controller-696fc77849-tm2ls\" (UID: \"a92eeaad-f062-43cf-8f96-700aab2605ec\") " pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:51.989577 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:51.989484 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:52.006220 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:52.006184 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:52.115936 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:52.115797 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-6dbhf"] Apr 24 23:59:52.119484 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:59:52.119442 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5560e36b_2edf_4de5_ac0a_211e70d3f96c.slice/crio-119e4e3688354f13b4c366180b9671809ea5ff4f77b345297e0fae2343a58aee WatchSource:0}: Error finding container 119e4e3688354f13b4c366180b9671809ea5ff4f77b345297e0fae2343a58aee: Status 404 returned error can't find the container with id 119e4e3688354f13b4c366180b9671809ea5ff4f77b345297e0fae2343a58aee Apr 24 23:59:52.138469 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:52.138445 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tm2ls"] Apr 24 23:59:52.140378 ip-10-0-135-201 kubenswrapper[2567]: W0424 23:59:52.140342 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda92eeaad_f062_43cf_8f96_700aab2605ec.slice/crio-cf042929ac752f839f07831e12f903e769b5af59e8c8bbabdb9c1f2751b7d344 WatchSource:0}: Error finding container cf042929ac752f839f07831e12f903e769b5af59e8c8bbabdb9c1f2751b7d344: Status 404 returned error can't find the container with id cf042929ac752f839f07831e12f903e769b5af59e8c8bbabdb9c1f2751b7d344 Apr 24 23:59:52.370875 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:52.370784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tm2ls" 
event={"ID":"a92eeaad-f062-43cf-8f96-700aab2605ec","Type":"ContainerStarted","Data":"cf042929ac752f839f07831e12f903e769b5af59e8c8bbabdb9c1f2751b7d344"} Apr 24 23:59:52.371731 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:52.371704 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-6dbhf" event={"ID":"5560e36b-2edf-4de5-ac0a-211e70d3f96c","Type":"ContainerStarted","Data":"119e4e3688354f13b4c366180b9671809ea5ff4f77b345297e0fae2343a58aee"} Apr 24 23:59:55.386154 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.386113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tm2ls" event={"ID":"a92eeaad-f062-43cf-8f96-700aab2605ec","Type":"ContainerStarted","Data":"2961437f01caa93905f2e975fe32cf1e920e53dee6f8225f799abae0fb31397c"} Apr 24 23:59:55.386644 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.386257 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-tm2ls" Apr 24 23:59:55.387659 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.387628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-6dbhf" event={"ID":"5560e36b-2edf-4de5-ac0a-211e70d3f96c","Type":"ContainerStarted","Data":"c2f1a54ac73dd02fc26daa571e428efd6097143d7474f2f1dd178567303ac523"} Apr 24 23:59:55.387803 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.387739 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-6dbhf" Apr 24 23:59:55.403520 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.403474 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-tm2ls" podStartSLOduration=1.594315161 podStartE2EDuration="4.403462579s" podCreationTimestamp="2026-04-24 23:59:51 +0000 UTC" firstStartedPulling="2026-04-24 23:59:52.141929715 +0000 UTC m=+385.665213917" 
lastFinishedPulling="2026-04-24 23:59:54.951077129 +0000 UTC m=+388.474361335" observedRunningTime="2026-04-24 23:59:55.401621123 +0000 UTC m=+388.924905476" watchObservedRunningTime="2026-04-24 23:59:55.403462579 +0000 UTC m=+388.926746802" Apr 24 23:59:55.418738 ip-10-0-135-201 kubenswrapper[2567]: I0424 23:59:55.418696 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-6dbhf" podStartSLOduration=1.639376833 podStartE2EDuration="4.418684161s" podCreationTimestamp="2026-04-24 23:59:51 +0000 UTC" firstStartedPulling="2026-04-24 23:59:52.121548724 +0000 UTC m=+385.644832939" lastFinishedPulling="2026-04-24 23:59:54.900856051 +0000 UTC m=+388.424140267" observedRunningTime="2026-04-24 23:59:55.417143402 +0000 UTC m=+388.940427627" watchObservedRunningTime="2026-04-24 23:59:55.418684161 +0000 UTC m=+388.941968385" Apr 25 00:00:00.145386 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.143597 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29617920-v6swd"] Apr 25 00:00:00.148197 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.148171 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.154250 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.154209 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-s58vv\"" Apr 25 00:00:00.154250 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.154228 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 25 00:00:00.156159 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.156136 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-v6swd"] Apr 25 00:00:00.198109 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.198078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.198266 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.198142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkpz\" (UniqueName: \"kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.299247 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.299209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" 
Apr 25 00:00:00.299434 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.299269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkpz\" (UniqueName: \"kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.299913 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.299892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.309300 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.309270 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkpz\" (UniqueName: \"kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz\") pod \"image-pruner-29617920-v6swd\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") " pod="openshift-image-registry/image-pruner-29617920-v6swd" Apr 25 00:00:00.477660 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.477635 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-v6swd"
Apr 25 00:00:00.599376 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:00.599336 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-v6swd"]
Apr 25 00:00:00.601397 ip-10-0-135-201 kubenswrapper[2567]: W0425 00:00:00.601351 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e701b9_5be7_41e1_b320_f6de77b8d4c0.slice/crio-c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1 WatchSource:0}: Error finding container c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1: Status 404 returned error can't find the container with id c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1
Apr 25 00:00:01.409648 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:01.409613 2567 generic.go:358] "Generic (PLEG): container finished" podID="37e701b9-5be7-41e1-b320-f6de77b8d4c0" containerID="84d925bc19e6b3cf77b7d74b316ac99171e1772cc3731f1f152af70a3f196478" exitCode=0
Apr 25 00:00:01.410037 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:01.409704 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-v6swd" event={"ID":"37e701b9-5be7-41e1-b320-f6de77b8d4c0","Type":"ContainerDied","Data":"84d925bc19e6b3cf77b7d74b316ac99171e1772cc3731f1f152af70a3f196478"}
Apr 25 00:00:01.410037 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:01.409745 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-v6swd" event={"ID":"37e701b9-5be7-41e1-b320-f6de77b8d4c0","Type":"ContainerStarted","Data":"c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1"}
Apr 25 00:00:02.554429 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.554404 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-v6swd"
Apr 25 00:00:02.622568 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.622538 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca\") pod \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") "
Apr 25 00:00:02.622736 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.622618 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkkpz\" (UniqueName: \"kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz\") pod \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\" (UID: \"37e701b9-5be7-41e1-b320-f6de77b8d4c0\") "
Apr 25 00:00:02.622978 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.622947 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca" (OuterVolumeSpecName: "serviceca") pod "37e701b9-5be7-41e1-b320-f6de77b8d4c0" (UID: "37e701b9-5be7-41e1-b320-f6de77b8d4c0"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:00:02.624706 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.624675 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz" (OuterVolumeSpecName: "kube-api-access-jkkpz") pod "37e701b9-5be7-41e1-b320-f6de77b8d4c0" (UID: "37e701b9-5be7-41e1-b320-f6de77b8d4c0"). InnerVolumeSpecName "kube-api-access-jkkpz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:00:02.723440 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.723409 2567 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/37e701b9-5be7-41e1-b320-f6de77b8d4c0-serviceca\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\""
Apr 25 00:00:02.723440 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:02.723436 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkkpz\" (UniqueName: \"kubernetes.io/projected/37e701b9-5be7-41e1-b320-f6de77b8d4c0-kube-api-access-jkkpz\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\""
Apr 25 00:00:03.417278 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:03.417187 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-v6swd"
Apr 25 00:00:03.417278 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:03.417197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-v6swd" event={"ID":"37e701b9-5be7-41e1-b320-f6de77b8d4c0","Type":"ContainerDied","Data":"c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1"}
Apr 25 00:00:03.417278 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:03.417231 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c48c7836a67a7020be16e2c1fc62051c553d44c62349b1284e44df69bed3eed1"
Apr 25 00:00:06.393349 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:06.393319 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-tm2ls"
Apr 25 00:00:06.395200 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:06.395178 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-6dbhf"
Apr 25 00:00:29.305200 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.305166 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"]
Apr 25 00:00:29.305828 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.305513 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37e701b9-5be7-41e1-b320-f6de77b8d4c0" containerName="image-pruner"
Apr 25 00:00:29.305828 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.305525 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e701b9-5be7-41e1-b320-f6de77b8d4c0" containerName="image-pruner"
Apr 25 00:00:29.305828 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.305582 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="37e701b9-5be7-41e1-b320-f6de77b8d4c0" containerName="image-pruner"
Apr 25 00:00:29.308918 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.308899 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.311633 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.311605 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 25 00:00:29.311754 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.311605 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 25 00:00:29.312746 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.312726 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\""
Apr 25 00:00:29.312746 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.312744 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b9x42\""
Apr 25 00:00:29.312913 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.312726 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\""
Apr 25 00:00:29.317085 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.317063 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"]
Apr 25 00:00:29.355587 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.355516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.355778 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.355614 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.355778 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.355665 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgq7p\" (UniqueName: \"kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.355778 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.355695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.456201 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.456148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.456407 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.456219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgq7p\" (UniqueName: \"kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.456407 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.456251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.456407 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.456301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.457070 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.457047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.457263 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.457238 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.459808 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.459781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.468408 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.468380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgq7p\" (UniqueName: \"kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p\") pod \"isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.621520 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.621424 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:29.754625 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:29.754583 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"]
Apr 25 00:00:29.758133 ip-10-0-135-201 kubenswrapper[2567]: W0425 00:00:29.758094 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ddc3bc0_d15e_456a_b81a_8ac66a64084d.slice/crio-2f4a64a16f73d60632ed9a5af82c6026cad587c3fffc4c530862d60b47f4a577 WatchSource:0}: Error finding container 2f4a64a16f73d60632ed9a5af82c6026cad587c3fffc4c530862d60b47f4a577: Status 404 returned error can't find the container with id 2f4a64a16f73d60632ed9a5af82c6026cad587c3fffc4c530862d60b47f4a577
Apr 25 00:00:30.345992 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.345948 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"]
Apr 25 00:00:30.350914 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.350890 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.353496 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.353480 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\""
Apr 25 00:00:30.353594 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.353578 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\""
Apr 25 00:00:30.356796 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.356773 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"]
Apr 25 00:00:30.363941 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.363919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.364036 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.363972 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.364075 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.364033 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4vmj\" (UniqueName: \"kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.364126 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.364106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.465159 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.465120 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.465358 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.465188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.465358 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.465220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4vmj\" (UniqueName: \"kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.465358 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.465289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.465765 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.465731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.466167 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.466135 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.468448 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.468427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.475036 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.475015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4vmj\" (UniqueName: \"kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj\") pod \"isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.516972 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.516936 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerStarted","Data":"2f4a64a16f73d60632ed9a5af82c6026cad587c3fffc4c530862d60b47f4a577"}
Apr 25 00:00:30.662434 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.662325 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:30.808410 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:30.808383 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"]
Apr 25 00:00:30.810720 ip-10-0-135-201 kubenswrapper[2567]: W0425 00:00:30.810681 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1878124_b07b_45fd_b342_a5ee83856e74.slice/crio-b2c67c5eef9b1471b8d39cc9cc78a8a0c51b260043ec9bc4948df1433cc0c87a WatchSource:0}: Error finding container b2c67c5eef9b1471b8d39cc9cc78a8a0c51b260043ec9bc4948df1433cc0c87a: Status 404 returned error can't find the container with id b2c67c5eef9b1471b8d39cc9cc78a8a0c51b260043ec9bc4948df1433cc0c87a
Apr 25 00:00:31.523710 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:31.523667 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerStarted","Data":"b2c67c5eef9b1471b8d39cc9cc78a8a0c51b260043ec9bc4948df1433cc0c87a"}
Apr 25 00:00:33.531639 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:33.531600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerStarted","Data":"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f"}
Apr 25 00:00:33.533153 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:33.533114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerStarted","Data":"c5d72566a389eea8d044d52314ed7a8011b389e1ef9690bb48f60c37c142c5c7"}
Apr 25 00:00:37.548460 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:37.548421 2567 generic.go:358] "Generic (PLEG): container finished" podID="d1878124-b07b-45fd-b342-a5ee83856e74" containerID="c5d72566a389eea8d044d52314ed7a8011b389e1ef9690bb48f60c37c142c5c7" exitCode=0
Apr 25 00:00:37.548840 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:37.548479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerDied","Data":"c5d72566a389eea8d044d52314ed7a8011b389e1ef9690bb48f60c37c142c5c7"}
Apr 25 00:00:37.550009 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:37.549957 2567 generic.go:358] "Generic (PLEG): container finished" podID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerID="d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f" exitCode=0
Apr 25 00:00:37.550009 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:37.549990 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerDied","Data":"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f"}
Apr 25 00:00:54.642070 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:54.642004 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerStarted","Data":"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba"}
Apr 25 00:00:54.644738 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:54.644706 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerStarted","Data":"8c9128bb25e0a98055d1b2682a0d0f3ef781c286631fb02fe2fa9215b63b343d"}
Apr 25 00:00:57.659310 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.659272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerStarted","Data":"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1"}
Apr 25 00:00:57.659800 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.659446 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:57.661323 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.661301 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerStarted","Data":"e304bb616def90b6f8cee403baf28a0fa72853556f9d03adcb93bedd91935bc6"}
Apr 25 00:00:57.661474 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.661462 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:57.680256 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.680204 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podStartSLOduration=1.648441593 podStartE2EDuration="28.680191125s" podCreationTimestamp="2026-04-25 00:00:29 +0000 UTC" firstStartedPulling="2026-04-25 00:00:29.760260638 +0000 UTC m=+423.283544840" lastFinishedPulling="2026-04-25 00:00:56.792010167 +0000 UTC m=+450.315294372" observedRunningTime="2026-04-25 00:00:57.678740888 +0000 UTC m=+451.202025112" watchObservedRunningTime="2026-04-25 00:00:57.680191125 +0000 UTC m=+451.203475349"
Apr 25 00:00:57.698308 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:57.698257 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podStartSLOduration=1.730753945 podStartE2EDuration="27.69824528s" podCreationTimestamp="2026-04-25 00:00:30 +0000 UTC" firstStartedPulling="2026-04-25 00:00:30.81357277 +0000 UTC m=+424.336856986" lastFinishedPulling="2026-04-25 00:00:56.781064101 +0000 UTC m=+450.304348321" observedRunningTime="2026-04-25 00:00:57.696396029 +0000 UTC m=+451.219680254" watchObservedRunningTime="2026-04-25 00:00:57.69824528 +0000 UTC m=+451.221529504"
Apr 25 00:00:58.665888 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:58.665848 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:00:58.666262 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:58.665898 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:00:58.666919 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:58.666892 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:00:58.666995 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:58.666934 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:00:59.669412 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:59.669349 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:00:59.669833 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:00:59.669454 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:04.673321 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:04.673291 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:01:04.673797 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:04.673637 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:01:04.673797 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:04.673701 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:04.674086 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:04.674055 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:14.674569 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:14.674531 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:14.674931 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:14.674537 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:24.673863 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:24.673825 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:24.674254 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:24.674043 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:34.674094 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:34.674048 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:34.674685 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:34.674045 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:44.674384 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:44.674337 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:01:44.674750 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:44.674337 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:54.674123 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:54.674085 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 25 00:01:54.674592 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:01:54.674088 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 25 00:02:04.675229 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:04.675197 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"
Apr 25 00:02:04.675752 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:04.675261 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
Apr 25 00:02:39.328585 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.328488 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"]
Apr 25 00:02:39.329166 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.329056 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container" containerID="cri-o://8c9128bb25e0a98055d1b2682a0d0f3ef781c286631fb02fe2fa9215b63b343d" gracePeriod=30
Apr 25 00:02:39.329166 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.329112 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kube-rbac-proxy" containerID="cri-o://e304bb616def90b6f8cee403baf28a0fa72853556f9d03adcb93bedd91935bc6" gracePeriod=30
Apr 25 00:02:39.390772 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.390659 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"]
Apr 25 00:02:39.391052 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.391011 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container" containerID="cri-o://1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba" gracePeriod=30
Apr 25 00:02:39.391121 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.391064 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kube-rbac-proxy" containerID="cri-o://c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1" gracePeriod=30
Apr 25 00:02:39.669916 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.669829 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 25 00:02:39.670057 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:39.669834 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused"
Apr 25 00:02:40.011567 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:40.011528 2567 generic.go:358] "Generic (PLEG): container finished" podID="d1878124-b07b-45fd-b342-a5ee83856e74" containerID="e304bb616def90b6f8cee403baf28a0fa72853556f9d03adcb93bedd91935bc6" exitCode=2
Apr 25 00:02:40.011786 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:40.011600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerDied","Data":"e304bb616def90b6f8cee403baf28a0fa72853556f9d03adcb93bedd91935bc6"}
Apr 25 00:02:40.013570 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:40.013544 2567 generic.go:358] "Generic (PLEG): container finished" podID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerID="c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1" exitCode=2
Apr 25 00:02:40.013706 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:40.013582 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"
event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerDied","Data":"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1"} Apr 25 00:02:44.030948 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.030914 2567 generic.go:358] "Generic (PLEG): container finished" podID="d1878124-b07b-45fd-b342-a5ee83856e74" containerID="8c9128bb25e0a98055d1b2682a0d0f3ef781c286631fb02fe2fa9215b63b343d" exitCode=0 Apr 25 00:02:44.031295 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.030983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerDied","Data":"8c9128bb25e0a98055d1b2682a0d0f3ef781c286631fb02fe2fa9215b63b343d"} Apr 25 00:02:44.120405 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.120356 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" Apr 25 00:02:44.219283 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219245 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls\") pod \"d1878124-b07b-45fd-b342-a5ee83856e74\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " Apr 25 00:02:44.219473 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219354 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location\") pod \"d1878124-b07b-45fd-b342-a5ee83856e74\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " Apr 25 00:02:44.219473 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219437 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"d1878124-b07b-45fd-b342-a5ee83856e74\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " Apr 25 00:02:44.219473 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219465 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4vmj\" (UniqueName: \"kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj\") pod \"d1878124-b07b-45fd-b342-a5ee83856e74\" (UID: \"d1878124-b07b-45fd-b342-a5ee83856e74\") " Apr 25 00:02:44.219736 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219708 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d1878124-b07b-45fd-b342-a5ee83856e74" (UID: "d1878124-b07b-45fd-b342-a5ee83856e74"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:02:44.219843 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.219810 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "d1878124-b07b-45fd-b342-a5ee83856e74" (UID: "d1878124-b07b-45fd-b342-a5ee83856e74"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:44.221767 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.221745 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d1878124-b07b-45fd-b342-a5ee83856e74" (UID: "d1878124-b07b-45fd-b342-a5ee83856e74"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:44.221854 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.221774 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj" (OuterVolumeSpecName: "kube-api-access-h4vmj") pod "d1878124-b07b-45fd-b342-a5ee83856e74" (UID: "d1878124-b07b-45fd-b342-a5ee83856e74"). InnerVolumeSpecName "kube-api-access-h4vmj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:02:44.233576 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.233557 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" Apr 25 00:02:44.320598 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320566 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " Apr 25 00:02:44.320774 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320609 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgq7p\" (UniqueName: \"kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p\") pod \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " Apr 25 00:02:44.320774 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320638 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls\") pod \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " Apr 25 00:02:44.320774 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320682 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location\") pod \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\" (UID: \"3ddc3bc0-d15e-456a-b81a-8ac66a64084d\") " Apr 25 00:02:44.320939 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320820 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d1878124-b07b-45fd-b342-a5ee83856e74-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") 
on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.320939 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320835 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4vmj\" (UniqueName: \"kubernetes.io/projected/d1878124-b07b-45fd-b342-a5ee83856e74-kube-api-access-h4vmj\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.320939 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320844 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1878124-b07b-45fd-b342-a5ee83856e74-proxy-tls\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.320939 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320853 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1878124-b07b-45fd-b342-a5ee83856e74-kserve-provision-location\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.321116 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.320962 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "3ddc3bc0-d15e-456a-b81a-8ac66a64084d" (UID: "3ddc3bc0-d15e-456a-b81a-8ac66a64084d"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:44.321116 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.321075 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ddc3bc0-d15e-456a-b81a-8ac66a64084d" (UID: "3ddc3bc0-d15e-456a-b81a-8ac66a64084d"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:02:44.322747 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.322727 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3ddc3bc0-d15e-456a-b81a-8ac66a64084d" (UID: "3ddc3bc0-d15e-456a-b81a-8ac66a64084d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:44.322820 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.322765 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p" (OuterVolumeSpecName: "kube-api-access-mgq7p") pod "3ddc3bc0-d15e-456a-b81a-8ac66a64084d" (UID: "3ddc3bc0-d15e-456a-b81a-8ac66a64084d"). InnerVolumeSpecName "kube-api-access-mgq7p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:02:44.421534 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.421447 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.421534 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.421480 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgq7p\" (UniqueName: \"kubernetes.io/projected/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kube-api-access-mgq7p\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.421534 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.421496 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-proxy-tls\") on node 
\"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:44.421534 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:44.421507 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ddc3bc0-d15e-456a-b81a-8ac66a64084d-kserve-provision-location\") on node \"ip-10-0-135-201.ec2.internal\" DevicePath \"\"" Apr 25 00:02:45.035919 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.035891 2567 generic.go:358] "Generic (PLEG): container finished" podID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerID="1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba" exitCode=0 Apr 25 00:02:45.036276 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.035970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerDied","Data":"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba"} Apr 25 00:02:45.036276 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.036010 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" event={"ID":"3ddc3bc0-d15e-456a-b81a-8ac66a64084d","Type":"ContainerDied","Data":"2f4a64a16f73d60632ed9a5af82c6026cad587c3fffc4c530862d60b47f4a577"} Apr 25 00:02:45.036276 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.036028 2567 scope.go:117] "RemoveContainer" containerID="c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1" Apr 25 00:02:45.036276 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.035980 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm" Apr 25 00:02:45.037834 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.037810 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" event={"ID":"d1878124-b07b-45fd-b342-a5ee83856e74","Type":"ContainerDied","Data":"b2c67c5eef9b1471b8d39cc9cc78a8a0c51b260043ec9bc4948df1433cc0c87a"} Apr 25 00:02:45.037834 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.037831 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq" Apr 25 00:02:45.044919 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.044900 2567 scope.go:117] "RemoveContainer" containerID="1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba" Apr 25 00:02:45.052162 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.052142 2567 scope.go:117] "RemoveContainer" containerID="d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f" Apr 25 00:02:45.054482 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.054460 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"] Apr 25 00:02:45.059294 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.059272 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-76c859ccb-xxwlq"] Apr 25 00:02:45.059998 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.059980 2567 scope.go:117] "RemoveContainer" containerID="c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1" Apr 25 00:02:45.060225 ip-10-0-135-201 kubenswrapper[2567]: E0425 00:02:45.060209 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1\": container with 
ID starting with c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1 not found: ID does not exist" containerID="c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1" Apr 25 00:02:45.060277 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060233 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1"} err="failed to get container status \"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1\": rpc error: code = NotFound desc = could not find container \"c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1\": container with ID starting with c265a5d4eba41006c28bea0ab18f5c932997f0b3cd663e49be1a70cf77180fa1 not found: ID does not exist" Apr 25 00:02:45.060277 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060252 2567 scope.go:117] "RemoveContainer" containerID="1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba" Apr 25 00:02:45.060491 ip-10-0-135-201 kubenswrapper[2567]: E0425 00:02:45.060470 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba\": container with ID starting with 1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba not found: ID does not exist" containerID="1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba" Apr 25 00:02:45.060544 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060501 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba"} err="failed to get container status \"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba\": rpc error: code = NotFound desc = could not find container \"1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba\": container with ID starting 
with 1f14a895037cceb341c1a3d7729bdfb7969561ca36c9fbfed709dcd2e2b7e4ba not found: ID does not exist" Apr 25 00:02:45.060544 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060521 2567 scope.go:117] "RemoveContainer" containerID="d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f" Apr 25 00:02:45.060749 ip-10-0-135-201 kubenswrapper[2567]: E0425 00:02:45.060734 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f\": container with ID starting with d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f not found: ID does not exist" containerID="d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f" Apr 25 00:02:45.060786 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060754 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f"} err="failed to get container status \"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f\": rpc error: code = NotFound desc = could not find container \"d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f\": container with ID starting with d054796d96dc3e47d2e70803393be4285f89fc17f70192a58d872e7af3fdb44f not found: ID does not exist" Apr 25 00:02:45.060786 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.060768 2567 scope.go:117] "RemoveContainer" containerID="e304bb616def90b6f8cee403baf28a0fa72853556f9d03adcb93bedd91935bc6" Apr 25 00:02:45.068124 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.068105 2567 scope.go:117] "RemoveContainer" containerID="8c9128bb25e0a98055d1b2682a0d0f3ef781c286631fb02fe2fa9215b63b343d" Apr 25 00:02:45.070414 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.070389 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"] 
Apr 25 00:02:45.071880 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.071860 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-cd88dc55b-j42rm"] Apr 25 00:02:45.075559 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:45.075544 2567 scope.go:117] "RemoveContainer" containerID="c5d72566a389eea8d044d52314ed7a8011b389e1ef9690bb48f60c37c142c5c7" Apr 25 00:02:47.019180 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:47.019146 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" path="/var/lib/kubelet/pods/3ddc3bc0-d15e-456a-b81a-8ac66a64084d/volumes" Apr 25 00:02:47.019674 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:02:47.019661 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" path="/var/lib/kubelet/pods/d1878124-b07b-45fd-b342-a5ee83856e74/volumes" Apr 25 00:03:26.953283 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:03:26.953256 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:03:26.954980 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:03:26.954958 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:08:26.977628 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:08:26.977539 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:08:26.981091 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:08:26.981059 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:13:27.001282 ip-10-0-135-201 
kubenswrapper[2567]: I0425 00:13:27.001251 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:13:27.003730 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:13:27.003708 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:18:27.029895 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:18:27.029861 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:18:27.033429 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:18:27.033399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:23:27.053657 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:23:27.053550 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:23:27.057776 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:23:27.056801 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:28:27.075495 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:28:27.075386 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:28:27.079484 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:28:27.078795 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:33:27.097468 ip-10-0-135-201 
kubenswrapper[2567]: I0425 00:33:27.097338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:33:27.102897 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:33:27.102875 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:38:27.118952 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:38:27.118833 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:38:27.125135 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:38:27.125117 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log" Apr 25 00:40:07.317235 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:07.317201 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6bc8h_96c2ed0b-d4e5-4737-9f3e-52e6828f930d/global-pull-secret-syncer/0.log" Apr 25 00:40:07.408219 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:07.408184 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ckn8x_217a9b6d-2b5e-4ed8-87d7-5820bac053c5/konnectivity-agent/0.log" Apr 25 00:40:07.535572 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:07.535543 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-201.ec2.internal_32797e584e4864d985182231bd63814e/haproxy/0.log" Apr 25 00:40:11.216132 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:11.216103 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ghjcw_14d409e9-bd22-416d-934a-018d672f2b6b/cluster-monitoring-operator/0.log"
Apr 25 00:40:11.557231 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:11.557152 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfnsq_59f163d5-53f4-4bdf-b71d-c0dd23cc0261/node-exporter/0.log"
Apr 25 00:40:11.579260 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:11.579232 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfnsq_59f163d5-53f4-4bdf-b71d-c0dd23cc0261/kube-rbac-proxy/0.log"
Apr 25 00:40:11.601150 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:11.601122 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfnsq_59f163d5-53f4-4bdf-b71d-c0dd23cc0261/init-textfile/0.log"
Apr 25 00:40:13.347193 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:13.347167 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-7p88r_a94b1d1a-a27a-4b4f-8bec-ad4468a49f04/networking-console-plugin/0.log"
Apr 25 00:40:13.778586 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:13.778552 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vbnzj_a707c1f4-d5ea-444b-9ab4-37d50135c3c4/console-operator/0.log"
Apr 25 00:40:14.542792 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.542766 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-j5w2w_6e0c7dfd-78db-4373-bdab-9fddbacaac5d/volume-data-source-validator/0.log"
Apr 25 00:40:14.700591 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700563 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"]
Apr 25 00:40:14.700909 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700897 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="storage-initializer"
Apr 25 00:40:14.700951 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700912 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="storage-initializer"
Apr 25 00:40:14.700951 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700925 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kube-rbac-proxy"
Apr 25 00:40:14.700951 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700931 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kube-rbac-proxy"
Apr 25 00:40:14.700951 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700941 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container"
Apr 25 00:40:14.700951 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700947 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700958 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="storage-initializer"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700963 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="storage-initializer"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700974 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700979 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700986 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kube-rbac-proxy"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.700991 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kube-rbac-proxy"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.701038 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kube-rbac-proxy"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.701044 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kube-rbac-proxy"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.701052 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1878124-b07b-45fd-b342-a5ee83856e74" containerName="kserve-container"
Apr 25 00:40:14.701182 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.701059 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ddc3bc0-d15e-456a-b81a-8ac66a64084d" containerName="kserve-container"
Apr 25 00:40:14.704000 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.703985 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.706529 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.706509 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"openshift-service-ca.crt\""
Apr 25 00:40:14.707602 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.707587 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"kube-root-ca.crt\""
Apr 25 00:40:14.707602 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.707598 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sp6k\"/\"default-dockercfg-r2q97\""
Apr 25 00:40:14.713300 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.713277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"]
Apr 25 00:40:14.755907 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.755876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-proc\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.756064 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.755942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwkqg\" (UniqueName: \"kubernetes.io/projected/0662367c-4f55-4284-bcaa-139d72ea269d-kube-api-access-cwkqg\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.756064 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.755983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-lib-modules\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.756064 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.756002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-podres\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.756064 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.756056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-sys\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857488 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-lib-modules\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857488 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-podres\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-sys\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857528 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-proc\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-podres\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-lib-modules\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-sys\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwkqg\" (UniqueName: \"kubernetes.io/projected/0662367c-4f55-4284-bcaa-139d72ea269d-kube-api-access-cwkqg\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.857683 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.857648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0662367c-4f55-4284-bcaa-139d72ea269d-proc\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:14.865511 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:14.865482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwkqg\" (UniqueName: \"kubernetes.io/projected/0662367c-4f55-4284-bcaa-139d72ea269d-kube-api-access-cwkqg\") pod \"perf-node-gather-daemonset-spdpd\" (UID: \"0662367c-4f55-4284-bcaa-139d72ea269d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:15.014344 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.014303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:15.136590 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.136566 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"]
Apr 25 00:40:15.139122 ip-10-0-135-201 kubenswrapper[2567]: W0425 00:40:15.139090 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0662367c_4f55_4284_bcaa_139d72ea269d.slice/crio-7136417d2907e311ec495dc223417aa79bff1c0b01775b4c8a2dfcbd67acf8da WatchSource:0}: Error finding container 7136417d2907e311ec495dc223417aa79bff1c0b01775b4c8a2dfcbd67acf8da: Status 404 returned error can't find the container with id 7136417d2907e311ec495dc223417aa79bff1c0b01775b4c8a2dfcbd67acf8da
Apr 25 00:40:15.140565 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.140550 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:40:15.307435 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.307400 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwswl_f7461a6a-d0ad-4be6-93d1-2197570b77bb/dns/0.log"
Apr 25 00:40:15.321357 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.321325 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd" event={"ID":"0662367c-4f55-4284-bcaa-139d72ea269d","Type":"ContainerStarted","Data":"98bbd8725681b040589f9b919ad3f2b04d698442ac0b1f025b8472a1a4c99120"}
Apr 25 00:40:15.321564 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.321390 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd" event={"ID":"0662367c-4f55-4284-bcaa-139d72ea269d","Type":"ContainerStarted","Data":"7136417d2907e311ec495dc223417aa79bff1c0b01775b4c8a2dfcbd67acf8da"}
Apr 25 00:40:15.321564 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.321453 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:15.327690 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.327668 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwswl_f7461a6a-d0ad-4be6-93d1-2197570b77bb/kube-rbac-proxy/0.log"
Apr 25 00:40:15.336869 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.336825 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd" podStartSLOduration=1.336809079 podStartE2EDuration="1.336809079s" podCreationTimestamp="2026-04-25 00:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:40:15.336047176 +0000 UTC m=+2808.859331400" watchObservedRunningTime="2026-04-25 00:40:15.336809079 +0000 UTC m=+2808.860093303"
Apr 25 00:40:15.351934 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.351913 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-75bcc_323b58fb-a204-4400-a836-973ccf33cd8e/dns-node-resolver/0.log"
Apr 25 00:40:15.754733 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.754680 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29617920-v6swd_37e701b9-5be7-41e1-b320-f6de77b8d4c0/image-pruner/0.log"
Apr 25 00:40:15.843538 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:15.843513 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cjk8s_49ea4681-f36b-4e20-a2b5-d76f46611b7a/node-ca/0.log"
Apr 25 00:40:16.577524 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:16.577491 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-546cdcb66d-p7rsx_3ac1b4af-a2a2-4bcf-b2ea-9456a735084d/router/0.log"
Apr 25 00:40:16.882259 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:16.882188 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7gvnb_6522895a-2de1-4503-81b5-929a4a7a71b2/serve-healthcheck-canary/0.log"
Apr 25 00:40:17.267762 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:17.267732 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-lrmcq_10b8e145-3ffa-4b6b-bc9e-ca257d2ed1bd/insights-operator/0.log"
Apr 25 00:40:17.435190 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:17.435162 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vv7gt_905b783e-82e9-4f73-8977-416326429b44/kube-rbac-proxy/0.log"
Apr 25 00:40:17.461272 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:17.461243 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vv7gt_905b783e-82e9-4f73-8977-416326429b44/exporter/0.log"
Apr 25 00:40:17.482669 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:17.482643 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vv7gt_905b783e-82e9-4f73-8977-416326429b44/extractor/0.log"
Apr 25 00:40:19.350022 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:19.349979 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-64c4d9588d-z25pd_9804cf9b-0b68-451a-8af2-cf218e988412/manager/0.log"
Apr 25 00:40:19.394136 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:19.394105 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-6dbhf_5560e36b-2edf-4de5-ac0a-211e70d3f96c/server/0.log"
Apr 25 00:40:19.631273 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:19.631203 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-tm2ls_a92eeaad-f062-43cf-8f96-700aab2605ec/manager/0.log"
Apr 25 00:40:19.676948 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:19.676917 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7j9xt_e264160c-0e57-4a81-b81f-19161f89fec1/seaweedfs/0.log"
Apr 25 00:40:21.334738 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:21.334708 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-spdpd"
Apr 25 00:40:23.751101 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:23.751066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xf6dv_9cfc1266-6758-44b5-9cbf-386bf602a3fc/kube-storage-version-migrator-operator/0.log"
Apr 25 00:40:24.998402 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:24.998302 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:40:25.018260 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.018224 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/egress-router-binary-copy/0.log"
Apr 25 00:40:25.037858 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.037821 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/cni-plugins/0.log"
Apr 25 00:40:25.058091 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.058060 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/bond-cni-plugin/0.log"
Apr 25 00:40:25.078588 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.078563 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/routeoverride-cni/0.log"
Apr 25 00:40:25.097584 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.097552 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/whereabouts-cni-bincopy/0.log"
Apr 25 00:40:25.116868 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.116840 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mmdsm_1134624f-34d1-4f6e-8821-435df2b54c9b/whereabouts-cni/0.log"
Apr 25 00:40:25.349924 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.349850 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x9qdz_55dc1a60-4f7d-4366-aca1-303ce72f4d84/kube-multus/0.log"
Apr 25 00:40:25.372250 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.372221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7wg4q_4a7b82bd-bf6c-4091-8f48-64cea3e964a8/network-metrics-daemon/0.log"
Apr 25 00:40:25.392747 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:25.392725 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7wg4q_4a7b82bd-bf6c-4091-8f48-64cea3e964a8/kube-rbac-proxy/0.log"
Apr 25 00:40:26.293145 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.293119 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-controller/0.log"
Apr 25 00:40:26.310667 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.310635 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/0.log"
Apr 25 00:40:26.322291 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.322268 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovn-acl-logging/1.log"
Apr 25 00:40:26.339610 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.339591 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/kube-rbac-proxy-node/0.log"
Apr 25 00:40:26.360039 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.360020 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:40:26.379815 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.379799 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/northd/0.log"
Apr 25 00:40:26.399787 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.399720 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/nbdb/0.log"
Apr 25 00:40:26.419418 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.419400 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/sbdb/0.log"
Apr 25 00:40:26.515146 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:26.515115 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7hz6p_281d90a3-293d-45ac-8b4c-87bdc25f3882/ovnkube-controller/0.log"
Apr 25 00:40:28.085942 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:28.085917 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-s5dsn_c6962ead-9647-463e-ad57-5aec7076f198/check-endpoints/0.log"
Apr 25 00:40:28.108509 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:28.108483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-f9vfv_c44399ed-9019-430d-83d7-8cde0e6f0d03/network-check-target-container/0.log"
Apr 25 00:40:29.049142 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:29.049110 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2m9kc_119f77e1-fd03-4239-a927-07a04816a852/iptables-alerter/0.log"
Apr 25 00:40:29.669565 ip-10-0-135-201 kubenswrapper[2567]: I0425 00:40:29.669534 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nb82k_50d11a09-c912-418a-ab1b-e1f5272b1d2f/tuned/0.log"